
In-class resources for Thu Sep 17 (class 1)

Getting Started
Review course description, objectives and expectations
Distribute classroom leadership assignment schedule and reading binder
Distribute iPads and begin setup
1. Receive iPad, case, and charger.
   - Record the serial number of your assigned iPad (we will do this in class).
   - Verify NYC DOE is working.
2. Sign in with your Apple ID, or create a new Apple ID.
   - Be sure to write down/keep track of your Apple ID email and password.
3. Settings > iCloud > Find My iPad: toggle to the ON (green) position.
4. Log on to Canvas. Open the Canvas app and type "Free Canvas Accounts"; this will take you to canvas.instructure.com. Log on using your email and password. (If you do not already have a Canvas email/password from another BHSEC class, follow the create-new-account instructions in your email.)
5. Create a Zaption account for use in this class.
   - Open the Zaption app on your iPad and then use your email and password to create this account.
6. Create a new Twitter account for use in this class.
   - Be sure to write down/keep track of your new Twitter account name.
7. Follow our class account on Twitter: @bhsecinternet
8. Create a new Medium account for use in this class.
   - Use the "Sign in with Twitter" button so that your Medium and Twitter accounts are linked. Note that setting up your Medium account includes verification via email.
   - Be sure to write down/keep track of your new Medium user name.
9. Follow me on Medium: Robert Greenberg.
   - Note: to follow me, log into Medium and click on the magnifying glass icon at the top right of the page, then search for my name, Robert Greenberg, and click the Follow button.
10. Log in to our class Vimeo account:
    - Email: bhsecbooks@gmail.com
    - Password: cutandpaste
11. Log in to our class Google Play Books account:
    - Email: bhsecbooks@gmail.com
    - Password: cutandpaste
Current Events
Video
We will watch and discuss this in class.
Internet Rising (53:03; watch the first 3:12)
Look for: David Weinberger, Douglas Rushkoff, Andrew Keen, Howard Rheingold, Kevin Kelly
http://internetrising.net
Reading
We will read and discuss this in class.
Time for a Pause
(Thomas Friedman, NYT op-ed, 2015)
Preview Sunday Story 1 (due Sun Sep 20) and Homework for Class 2 (Tue
Sep 22)
Assign classroom leaders for Class 2 (Tue Sep 22)

Course Description
Each semester, we gather to consider the internet's profound impact on
contemporary society, from its effects on privacy and intellectual property to
technology's evolving influence on socialization and the changing neuro-circuitry
of our brains. Within our learning communities we give critical and creative
thought to the cultural, economic, political and social implications of evolving
internet technologies, borrowing from our everyday experiences and making
connections to ideas and arguments presented by external audiences. For each
class, we will examine ideas and arguments excerpted from books, blogs, news
articles and scholarly journals, and will demonstrate active engagement with this
material by annotating the text with marginalia. We will also explore a wide range
of video material, including the movies Her and Citizenfour. We will share
ruminations via classroom seminar, online discussion, Twitter, original video
(using iMovie and Vimeo) and Medium. My goal is to be a facilitator through texts
that we examine together. Your contributions should be thoughtful, on topic, and
respectful of others. You will do well to remember that this is not my class, but
ours, and its success depends on the contributions of every class member.

Course Objectives
Consider the internet's impact on society and our daily lives
Sharpen skills in using the internet as a tool for inquiry (reading, listening and watching)
Practice higher order thinking in writing, discussion and media production
Develop a personal plan for living in the digital age with focus and integrity
Expected Time Commitment
As a general guideline, you should invest at least two hours of outside work for every
class meeting. If you have questions as you prepare for class, I am invariably available
via email:
rgreenberg@bhsec.bard.edu

Grading
Connections among course texts, your own experience and the external world
Interactive participation and leadership within our learning community
Innovation and originality
Thoughtful and honest self-assessment

Tuesday and Thursday classes


Homework reading (take notes using pen and/or highlighter, bring reading to class)
Homework video (compliance measured via Zaption)
Select a lesson link and Tweet the article's or video's gist or guiding question
Participate in online discussion (Canvas or alternative platform)
Participate in class discussion
Classroom Leadership Assignments (rotating schedule)

Sunday Stories
Share a creative and course-related reflection via Vimeo or Medium.
Timeliness and Academic Integrity

You will be penalized for submitting late work and/or unexcused class absence.
Plagiarism or other acts that violate the integrity of our learning community will also
incur consequences.

Rubric #1: Online and Class Participation


100 points per week (12 weeks)
Completion and timeliness
(40 points)

You will earn 40 points each week for on-time completion of work and class attendance.
These points are yours to lose. Five points will be subtracted for any of the following:
Not completing the online discussion assignment before class
Not completing the lesson link assignment before class
Evidence of not completing the homework reading and video
(e.g., no login on Zaption, not marking up the reading and/or failing a reading quiz)
Not bringing your reading to class
Cutting class
Contribution to online discussion
(10 points/discussion, 2 discussions/week)
There will be an online discussion assignment before each class. We will begin this
online discussion work on Canvas and, over the course of the semester, will rotate the
work to other platforms including Vimeo and Medium.
Completion (6 points)
- Fewer than 100 words posted
- Writing in a silo
- No specific effort to engage
- Shooting from the hip
- No connections to course texts
- No evidence of outside research
- No interaction with others

Good Work (8 points)
- Multiple paragraphs
- Early post allows others to comment
- Creating conversation
- Engaging others
- Higher order thinking
- Connections to course texts
- Outside research and/or links
- Overall effort

Exemplary Work (10 points)
- Evidence of listening to others
- Thoughtful formatting and/or devices
- Analysis, synthesis and/or critique
- Thoughtfully uses facts/expert opinion
- Links to new articles and/or videos
- Multiple interactions with others

Contribution to class discussion
(10 points/class, 2 classes/week)

Completion
- Does not raise hand to talk in class
- Lethargic body language
- Disrupts classroom discussion

Good Work
- Overall effort
- Multiple contributions
- Stays on topic
- Connections to course texts
- Outside research and/or links

Exemplary Work (10 points)
- Higher order thinking
- Evidence of synthesis and/or critique
- Directly responds to the question
- Connections to course texts
- Links to new articles and/or videos
- Helps others to contribute

Tweeting the gist of a lesson link
(10 points/Tweet, 2 posts/week)
Scored from Completion through Exemplary Work (10 points). Exemplary work shows:
- Evidence of synthesis or critique
- Originality / Innovation
- Interaction with others in the class

Rubric #2: Classroom leadership assignments


100 points (3x/semester)
Each student will serve as classroom leader three times during the semester:
1. Homework reading/video discussion leader
2. Digerati video/reading discussion leader
3. Current events/iPad discussion leader
Preparation
(up to 50 points)
1. Review the homework reading and video material.
2. Select excerpt(s) for class discussion.
3. Write guiding questions and a lesson plan to prompt discussion (e.g., demonstrate close
reading).
4. Email me your draft work.
5. Receive feedback via email.
6. Integrate my feedback.
Classroom Leadership
(up to 50 points)
1. Make a coherent and compelling introduction.
2. Offer one or several guiding questions to prompt discussion/debate.
3. Lead the discussion, calling on raised hands.
4. Keep the discussion moving forward.

Exemplary work will include evidence of:


Above-and-beyond effort
Creativity / Innovation / Risk-taking

Rubric #3: Sunday stories


100 points per week (12 weeks)

Purpose:
To develop your ability to communicate using recorded sounds and images
To deepen your personal understanding and synthesis of a course theme
To engage and entertain your peers in a way that enhances learning
Completion
(40 points)
Technical (effort and complexity)
(up to 20 points)
Entertainment value
(up to 20 points)
Evidence of higher order thinking
(up to 20 points)

Rubric
(Columns: Completion, Good Work, Exemplary Work)

Completion (40 points)

Technical (effort and complexity): 12 / 16 / 20 points
- Overall complexity: from one-dimensional (video) to multi-dimensional (combines a variety of modalities)
- Still images: from no use of still images to complex and thoughtful use of still images
- Moving images: from captured video from the web to original video with script and actors
- Sound: multi-dimensional at the exemplary level (e.g., live action, voiceover, music)
- Graphics and Effects: from no graphics to thoughtful and innovative use of formatting and graphics tools
- Clarity and Legibility: from hard to hear and/or see to easy to hear and see

Entertainment value: 12 / 16 / 20 points
- Energy and Engagement: from does not capture the attention of peers to captures the interest and respect of peers
- Creativity: from one-dimensional and/or unoriginal to goes outside the box, with evidence of innovation

Evidence of higher order thinking: 12 / 16 / 20 points
- Higher Order Thinking: from regurgitation of course texts to demonstration of thoughtful engagement with course themes, connecting to ideas discussed in the course and extending beyond them (analysis, synthesis, critique)

Overall: 36 / 48 / 60 points

Earnback (optional)
Earn back up to half of all missed points based on feedback.
Submit a new draft of your work within 7 days of receiving feedback.

http://nyti.ms/1xCQUuL

The Opinion Pages

OP-ED COLUMNIST

Time for a Pause


JAN. 6, 2015

Thomas L. Friedman

You could easily write a book or, better yet, make a movie about the drama that
engulfed Sony Pictures and "The Interview," Sony's own movie about the
fictionalized assassination of North Korea's real-life dictator. The whole saga
reflects so many of the changes that are roiling and reshaping today's world before
we've learned to adjust to them.
Think about this: In November 2013, hackers stole 40 million credit and debit
card numbers from Target's point-of-sale systems. Beginning in late August 2014,
nude photos believed to have been stored by celebrities on Apple's iCloud were
spilled onto the sidewalk. Thanksgiving brought us the Sony hack, when, as The
Times reported: "Everything and anything had been taken. Contracts. Salary lists.
Film budgets. Medical records. Social Security numbers. Personal emails. Five
entire movies." And, on Christmas, gaming networks for both the Sony PlayStation
and the Microsoft Xbox were shut down by hackers. But rising cybercrime is only
part of the story. Every day a public figure is apologizing for something crazy or
foul that he or she muttered, uttered, tweeted or shouted that went viral,
including the rantings of an N.B.A. owner in his girlfriend's living room.
What's going on? We're in the midst of a Gutenberg-scale change in how
information is generated, stored, shared, protected and turned into products and
services. We are seeing individuals become superempowered to challenge
governments and corporations. And we are seeing the rise of apps that are putting
strangers into intimate proximity in each other's homes (think Airbnb) and into
each other's cars (think Uber) and into each other's heads (think Facebook, Twitter
and Instagram). Thanks to the integration of networks, smartphones, banks and
markets, the world has never been more tightly wired. As they say: Lost there, felt

his fiancée; and private photos of movie stars. They all have different moral and
societal significance. We need to deal with them differently.
"We need to pause more to make sense of all the M.R.I.s we're being exposed
to," argued Seidman. "In the pause, we reflect and imagine a better way." In some
cases, that could mean showing empathy for the fact that humans are imperfect. In
others, it could mean taking principled stands toward those whose behaviors
make this interdependent world unsafe, unstable or unfree.
In short, there's never been a time when we need more people living by the
Golden Rule: Do unto others as you would have them do unto you. Because, in
today's world, more people can see into you and do unto you than ever before.
Otherwise, we're going to end up with a "gotcha" society, lurching from outrage to
outrage, where in order to survive you'll either have to disconnect or constantly
censor yourself because every careless act or utterance could ruin your life. Who
wants to live that way?
(For 2015, I will just be writing on Wednesdays while I work on a book.)
A version of this op-ed appears in print on January 7, 2015, on page A23 of the New York edition with the
headline: Time for a Pause.

© 2015 The New York Times Company

Sunday Story #1
Due Sun Sep 20 (before midnight)

Assignment: Write your first Medium story


Is Silicon Valley Saving the World or Just Making Money?
Periodically, The New York Times invites outside contributors to discuss news events in a feature titled
Room for Debate. On June 22, the topic was: "Is Silicon Valley Saving the World or Just Making Money?"
Here is the introduction the NYT editors offered for this discussion:
"Silicon Valley has a greater capacity to change the world than the kings and presidents of even a hundred years ago,"
one wealthy Facebook beneficiary supposedly told a tech conference last year. Many tech luminaries think they are
doing God's work. But are the innovations coming out of the Bay Area really creating a new and better world, or just
making lots of money for a few people?
Take a look at the ideas and arguments presented by the six authors (I have shortened the original URL):
http://goo.gl/ntVQ1l and then join in the debate, expressing your thoughts using your new Medium account.

Your Medium story should:


be at least 200 and no more than 400 words
include at least one photo, video and/or Tweet (please be thoughtful and deliberate in your use of media!)
Technical notes:
Before starting this work, please complete the iPad setup steps detailed in the in-class #1 handout (Thu Sep 17). Note
that you will need your new Twitter and Medium accounts for this assignment.
Start by clicking on the "Write a story" button at the top of the page.
Note: clicking on the round "Write a story" button is NOT the same as clicking on the words "Write here."
You want: "Write a story."
When you click on the words "Write here" to get started, you will see a plus sign in a circle to the left.
Click on the plus sign and Medium will reveal the following tools:
a. camera button (photo)
b. play button (movie)
c. < > embed button (for Twitter or equivalent)
For this assignment, your Medium story must include at least one photo, video and/or Tweet.
IMPORTANT:
Once you have completed your reading journal response, be sure to click on "Publish" and then "Publish Story."

Homework for Tue Sep 22 (class 2)


Everything Bad is Good For You
Reading
Consider and mark up the assigned reading with pen and/or highlighter.
Clive Thompson: essays from WIRED magazine
Video
Watch the assigned video on Zaption and post a one-sentence response.
One World / Passage (9:25)
http://zapt.io/tucc2aa9
Online discussion
Respond to this prompt on Canvas:
If you were forced to choose between giving up one of your fingers and
giving up the use of your computer (and phone, etc.) for the rest of your
life, which would you choose?
https://canvas.instructure.com/courses/936923/discussion_topics/4017331

In-class resources for Tue Sep 22 (class 2)


Everything Bad is Good For You
Digerati video and reading
(discussion leader: x)
Digerati
Steven Johnson
@stevenbjohnson
https://en.wikipedia.org/wiki/Steven_Johnson_(author)
Video
We will watch and discuss this in class.
Steven Johnson on Colbert Report (6:09)
http://thecolbertreport.cc.com/videos/l9x3is/steve-johnson
Reading
We will read and discuss this in class.
Everything Bad is Good For You: How today's popular culture is actually making us smarter
(Steven Johnson, 2005)
Future Perfect: The case for progress in a networked age
(Steven Johnson, 2013)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)

Everything bad is good for you


Mark Bauerlein, The Dumbest Generation (2008)
When Steven Johnson published Everything Bad Is Good for You: How Today's Popular
Culture Is Actually Making Us Smarter in early 2005, it sparked across-the-country
newspaper reviews, radio spots, and television appearances, including one on The Daily
Show. The appeal was overt, and as Johnson observes on his Web site, "The title says it
all." In the last 30 years, Johnson insists, popular culture has become "more
complex and intellectually challenging . . . demanding more cognitive
engagement." Often the content of popular culture is coarse and inane, he
concedes, but the formal elements (rules, plotlines, feedback, levels,
interactivity) have grown more sophisticated, making today's games and reality
shows into a kind of cognitive workout that hones mental gifts. Screen diversions
inculcate intellectual or cognitive virtues, aptitudes for spatialization, pattern recognition,
and problem solving, virtues that reflect twenty-first-century demands better, in fact, than
do traditional knowledge and reading skills.
Everything bad is good for you (review)
Clampe, slashdot.org (2005)
The Internet is valuable in three ways according to Johnson: by virtue of being participatory,
by forcing users to learn new interfaces and by creating new channels for social interaction.
Everything bad is good for you (review)
Farhad Manjoo (2005)
The other day, in an unbelievably delicious turn of events, the government reported that people
who are overweight face a lower risk of death than folks who are thin. While the news didn't
exactly exonerate junk food, it was a fitting prelude to the publication of Steven Johnson's new
polemic Everything Bad Is Good for You, which argues that what we think of as junk food for the
mind -- video games, TV shows, movies and much of what one finds online -- is not actually junk
at all. In this intriguing volume, Johnson marshals the findings of brain scientists and
psychologists to examine the culture in which we swim, and he shows that contrary to what
many of us assume, mass media is becoming more sophisticated all the time. As I see it, then,
the most interesting question about Johnson's theory is not whether it's accurate. It's why it's
happening -- why is media getting smarter, and why are we flocking to media that actually
makes us smarter? Johnson examines the question at some length, and he fingers two usual
suspects: technology (the VCR, TiVo, DVDs, ever more powerful game systems) and economics
(the increasing importance of the syndication market). But I like the third part of his answer best
-- our media's getting smarter, he says, because the brain craves intelligent programming.
Everything bad is good for you: How today's popular culture is actually making us
smarter
Steven Johnson (2005)
Reading books chronically understimulates the senses. Unlike the long-standing tradition of game
playing, which engages the child in a vivid, three-dimensional world filled with moving images
and musical soundscapes, navigated and controlled with complex muscular movements, books
are simply a barren string of words on the page. Books are also tragically isolating. While games
have for many years engaged the young in complex social relationships with their peers, building
and exploring worlds together, books force the child to sequester him- or herself in a quiet space,
shut off from interaction with other children. These new libraries that have arisen in recent years
to facilitate reading activities are a frightening sight: dozens of young children, normally so
vivacious and socially interactive, sitting alone in cubicles, reading silently, oblivious to their
peers. But perhaps the most dangerous property of these books is the fact that they follow a
fixed linear path. You can't control their narratives in any fashion; you simply sit back and have
the story dictated to you. This risks instilling a general passivity in our children, making them feel
as though they're powerless to change their circumstances. Reading is not an active, participatory
process; it's a submissive one. The book readers of the younger generation are learning to follow
the plot instead of learning to lead.

Future Perfect: The case for progress in a networked age (review)


John Horgan, Wall Street Journal (2013)

If you're a pessimist (and chances are you are) you should read "Future Perfect" by the
technophilic science writer Steven Johnson. In fact, read it even if you're an optimist, because Mr.
Johnson's book will give you lots of material to brighten the outlook of your gloomy friends.
Mr. Johnson notes that, contrary to popular perception, humanity has achieved enormous
progress over the past century. Life spans have almost doubled since 1900, and over the past
half-century the percentage of humanity living in extreme poverty has been cut in half. In the
U.S. over the past 20 years, crime, traffic fatalities, air pollution and infant mortality have
dropped. Yes, Mr. Johnson concedes, we still face daunting problems, but we will surely solve
them, given how far we have come.
Mr. Johnson traces much of our progress to what he calls "peer networks." Conventional
organizations, he says, whether corporations or governments, tend to be hierarchical and
centralized, with information tightly controlled by the people in charge. Peer networks, by
contrast, consist of individuals of roughly equal status achieving goals by sharing, criticizing and
revising information and ideas.
The Internet is both the product of peer networks and an astonishingly effective enabler of them.
Peer networks helped to spawn Wikipedia and launch a lot of other things as well: crowd-sourced
fundraisers such as Kickstarter; the Arab Spring and Occupy Wall Street protests; and the 311
systems of New York and other cities, by means of which citizens alert authorities to potholes,
noisy bars and other problems.
Future Perfect: The case for progress in a networked age (review)
Steven Johnson (2013)
So what does the Internet want? It wants to lower the cost for creating and sharing information.
The notion sounds unimpeachable when you phrase it like that, until you realize all the strange
places that kind of affordance ultimately leads to. The Internet wants to breed algorithms that
can execute thousands of financial transactions per minute, and it wants to disseminate the
#occupywallstreet meme across the planet. The Internet wants both the Wall Street tycoons
and the popular insurrection at its feet.
Can that strange, contradictory cocktail drive progress on its own? Perhaps for the simple
reason that it democratizes the control of information. When information is expensive and
scarce, powerful or wealthy individuals or groups have a disproportionate impact on how that
information circulates. But as it gets cheaper and more abundant, the barriers to entry are
lowered. This is hardly a new observation, but everything that has happened over the last twenty
years has confirmed the basic insight. That democratization has not always led to positive
outcomes (think of those spam artists), but there is no contesting the tremendous, orders-of-magnitude increase in the number of people creating and sharing, thanks to the mass adoption
of the Internet.
The peer progressives' faith in the positive effects of the Internet rests on this democratic
principle: When you give people more control over the flow of information and decision making
in their communities, their social health improves incrementally, in fits and starts, but also
inexorably. Yes, when you push the intelligence out to the edges of the network, sometimes
individuals or groups abuse those newfound privileges; a world without gatekeepers or planners
is noisier and more chaotic. But the same is true of other institutions that have stood the test of
time. Democracies on occasion elect charlatans or bigots or imbeciles; markets on occasion
erupt in catastrophic bubbles, or choose to direct resources to trivial problems while ignoring the
more pressing ones. We accept these imperfections because the alternatives are so much worse.
The same is true of the Internet and the peer networks it has inspired. They are not perfect, far
from it. But over the long haul, they produce better results than the Legrand Stars that came
before them. They're not utopias. They're just leaning that way.

Sunday Story #2
Due Sun Sep 27 (before midnight)
Prompt

vimeo-generation video for Tuesday

Homework for Tue Sep 29 (class 3)


Living in the Digital World
Reading
Consider and mark up the assigned reading with pen and/or highlighter.
The New Meaning of Mobility
(Christine Rosen, The New Atlantis, 2011)
Michael Wesch Looks Back
(Michael Wesch, 2008)
Video
Watch the assigned video on Zaption and post a one-sentence response.
Michael Wesch: A Vision of Students Today (4:44)
http://zapt.io/tmqqpymh
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Share one or several comments in response to "Sunday Story 2"
videos on Vimeo

Michael Wesch Looks Back


Michael Wesch, mediaculture.org (2008)
In spring 2007 I invited the 200 students enrolled in the small version of my Introduction
to Cultural Anthropology class to tell the world what they think of their education by helping
me write a script for a video to be posted on YouTube. The result was the disheartening
portrayal of disengagement you see below. The video was viewed over one million times in
its first month and was the most blogged about video in the blogosphere for several weeks,
eliciting thousands of comments. With rare exception, educators around the world expressed
the sad sense of profound identification with the scene, sparking a wide-ranging debate
about the roles and responsibilities of teachers, students, and technology in the classroom.
Despite my role in the production of the video, and the thousands of comments supporting
it, I recently came to view the video with a sense of uneasiness and even incredulity. Surely
it can't be as bad as the video seems to suggest, I thought. I started wrestling with these
doubts over the summer as I fondly recalled the powerful learning experiences I had shared
with my students the previous year. By the end of the summer I had become convinced that
the video was over the top, that things were really not so bad, that the system is not as
broken as I thought, and we should all just stop worrying and get on with our teaching. But
when I walked into my classroom for the first day of school two weeks ago I was immediately
reminded of the real problem now facing education. The problem is not just written on the
walls. It's built into them.
I arrived early, finding 493 empty numbered chairs sitting mindlessly fixated on the front of
the room. A 600 square foot screen stared back at them. Hundreds of students would soon
fill the chairs, but the carefully designed sound-absorbing walls and ceiling, along with state
of the art embedded speakers, ensured that there would only be one person in this room to
be heard. That person would be me, pacing around somewhere near stage-left, ducking
intermittently behind a small podium housing a computer with a wireless gyromouse that
will grant me control of some 786,432 points of light on that massive screen.
The room is nothing less than a state of the art information dump, a physical manifestation
of the all-too-pervasive yet narrow and naïve assumption that to learn is simply to acquire
information, built for teachers to effectively carry out the relatively simple task of conveying
information. Its sheer size, layout, and technology are testaments to the efficiency and
expediency with which we can now provide students with their required credit hours.
My class is popular. We only enroll 400 so there should have been plenty of seats but on the
first day all seats were filled and it was standing room only in the back. The room was
buzzing with energy as friends reconnected after the long summer.
I started talking and an almost deafening silence greeted my first words. I have always been
amazed and intimidated by this silence. It seems to so tenuously await my next words. The
silence is immediately filled with the more subtle yet powerful messages sent by 500 sets of
eyes which I continuously scan, listening to what they have to say as I talk. In an instant
those eyes can turn from wonder and excitement to the disheartening glaze of universal and
irreversible disengagement. Perpetually dreading this glaze I nervously pace as I talk and
use grandiose gestures. At times I feel desperate for their attention. I rush to amuse them
with jokes and stories as I swing, twist, and swirl that gyromouse, directing the 786,432
pixels (dancing points of light) behind me, hoping to dazzle them with a multi-media
extravaganza.
Somehow I seem to hold their attention for the full hour. I marvel at what a remarkable
achievement it is to bring hundreds of otherwise expressive, exuberant, and often rebellious
youths into a single room and have them sit quietly in straight rows while they listen to the
authority with the microphone. Such an achievement could not be won by an eager teacher
armed with technology alone. It has taken years of acclimatizing our youth to stale artificial
environments, piles of propaganda convincing them that what goes on inside these

environments is of immense importance, and a steady hand of discipline should they ever
start to question it. Alfred North Whitehead called it "soul murder."
The "getting by" game.

Reports from my teaching assistants sitting in the back of the room tell a different story.
Apparently, several students standing in the back cranked up their iPods as I started to
lecture and never turned them off, sometimes even breaking out into dance. My lecture
could barely be heard nearby as the sound-absorbing panels and state of the art speakers
were apparently no match for those blaring iPods. Scanning the room my assistants also saw
students cruising Facebook, instant messaging, and texting their friends. The students were
undoubtedly engaged, just not with me.
My teaching assistants consoled me by noting that students have learned that they can get
by without paying attention in their classes. Perhaps feeling a bit encouraged by my look of
incredulity, my TAs continued with a long list of other activities students have learned that
they can "get by" without doing. Studying, taking notes, reading the textbook, and coming
to class topped the list. It wasn't the list that impressed me. It was the unquestioned
assumption that "getting by" is the name of the game. Our students are so alienated by
education that they are trying to sneak right past it.
If you think this little game is unfair to those students who have been duped into playing,
consider those who have somehow managed to maintain their inherent desire to learn. One
of the most thoughtful and engaged students I have ever met recently confronted a
professor about the nuances of some questions on a multiple-choice exam. The professor
politely explained to the student that he was overthinking the questions. What kind of
environment is this in which overthinking is a problem? Apparently, he would have been
better off just playing along with the "getting by" game.
Last spring I asked my students how many of them did not like school. Over half of them
raised their hands. When I asked how many of them did not like learning, no hands were
raised. I have tried this with faculty and get similar results. Last year's U.S. Professor of the
Year, Chris Sorensen, began his acceptance speech by announcing, "I hate school." The
crowd, made up largely of other outstanding faculty, overwhelmingly agreed. And yet he
went on to speak with passionate conviction about his love of learning and the desire to
spread that love. And there's the rub. We love learning. We hate school. What's worse is that
many of us hate school because we love learning.
What went wrong?
How did institutions designed for learning become so widely hated by people who love
learning?
The video seemed to represent what so many were already feeling, and it became the focal
point for many theories. While some simply blamed the problems on the students
themselves, others recognized a broader pattern. Most blamed technology, though for very
different reasons. Some simply suggested that new technologies are too distracting and
superficial and that they should be banned from the classroom. Others suggested that
students are now wired differently, created in the image of these technologies. Luddites
imagine students to be distracted and superficial, while techno-optimists see a new
generation of hyper-thinkers bored with old-school ways.
But the problems are not new. They are the same as those identified by Neil Postman and
Charles Weingartner nearly 40 years ago when they described the plight of "totally alienated
students" involved in a cheating scandal (a true art form in the "getting by" game) and
asked, "What kind of vicious game is being played here, and who are the sinners and who
the sinned against?" (1969:51).
Texting, web-surfing, and iPods are just new versions of passing notes in class, reading
novels under the desk, and surreptitiously listening to Walkmans. They are not the problem.
They are just the new forms in which we see it. Fortunately, they allow us to see the problem
in a new way, and more clearly than ever, if we are willing to pay attention to what they are
really saying.
They tell us, first of all, that despite appearances, our classrooms have been fundamentally
changed. There is literally something in the air, and it is nothing less than the digital
artifacts of over one billion people and computers networked together collectively producing
over 2,000 gigabytes of new information per second. While most of our classrooms were
built under the assumption that information is scarce and hard to find, nearly the entire body
of human knowledge now flows through and around these rooms in one form or another,
ready to be accessed by laptops, cellphones, and iPods. Classrooms built to re-enforce the
top-down authoritative knowledge of the teacher are now enveloped by a cloud of
ubiquitous digital information where knowledge is made, not found, and authority is
continuously negotiated through discussion and participation. In short, they tell us that our
walls no longer mark the boundaries of our classrooms.
And that's what has been wrong all along. Some time ago we started taking our walls too
seriously, not just the walls of our classrooms, but also the metaphorical walls that we have
constructed around our subjects, disciplines, and courses. McLuhan's statement about
"the bewildered child confronting the education establishment where information is scarce
but ordered and structured by fragmented, classified patterns, subjects, and schedules" still
holds true in most classrooms today. The walls have become so prominent that they are
even reflected in our language, so that today there is something called the "real world"
which is foreign and set apart from our schools. When somebody asks a question that seems
irrelevant to this real world, we say that it is "merely academic."
Not surprisingly, our students struggle to find meaning and significance inside these walls.
They tune out of class and log on to Facebook.
The solution.
Fortunately, the solution is simple. We don't have to tear the walls down. We just have to
stop pretending that the walls separate us from the world, and begin working with students
in the pursuit of answers to real and relevant questions.
When we do that we can stop denying the fact that we are enveloped in a cloud of
ubiquitous digital information where the nature and dynamics of knowledge have shifted. We
can acknowledge that most of our students have powerful devices on them that give them
instant and constant access to this cloud (including almost any answer to almost any
multiple choice question you can imagine). We can welcome laptops, cell phones, and iPods
into our classrooms, not as distractions, but as powerful learning technologies. We can use
them in ways that empower and engage students in real world problems and activities,
leveraging the enormous potentials of the digital media environment that now surrounds us.
In the process, we allow students to develop much-needed skills in navigating and
harnessing this new media environment, including the wisdom to know when to turn it off.
When students are engaged in projects that are meaningful and important to them, and that
make them feel meaningful and important, they will enthusiastically turn off their cellphones
and laptops to grapple with the most difficult texts and take on the most rigorous tasks.

In-class resources for Tue Sep 29 (class 3)


The world according to Christine Rosen
Digerati video and reading
(discussion leader: x)
Digerati
Christine Rosen
BHSEC Q guest author (Spring 2012)
http://www.thenewatlantis.com/authors/christine-rosen
Video
We will watch and discuss this in class.
Christine Rosen on WSJ (2:52)
http://www.wsj.com/video/to-save-lives-look-up-from-your-phone/93FE25AF-17A8-4A2D-8C9B-4CD8EA73A7F1.html

Activity
Select a Christine Rosen TechnoSapiens podcast:
Episode 1: As Twitter, Facebook, and all the other tech companies hoover up our information, learning more and more about us, is it time to ask: if we have nothing to hide, do we have nothing to fear?
Episode 2: Do fitness and diet tracking technologies help to improve our physical selves, or should we be more hesitant about uploading such personal information to the cloud?
Episode 3: Will massive open online courses (or MOOCs) give us all greater access to a first-rate college education, or will they be the death knell for higher learning as we know it?
Episode 4: Has our obsession with grading everything from restaurants to books on sites like Yelp, TripAdvisor, and Amazon undermined expertise and serendipity, or are we finally getting the facts rather than the overrated opinions of critics?
Episode 5: Are we removing human error by letting algorithms take over everything from the stock market to driving, or are we ceding too much control to calculations that may have serious flaws?
Episode 6: Will civilian and commercial drones soon be as ubiquitous as the Internet, or will we be compelled by privacy fears to curtail their use?
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Michael Wesch
Christine Rosen

Homework for Thu Oct 1 (class 4)


Is technology making us smarter?
Reading
Consider and mark up the assigned reading with pen and/or highlighter.
Smarter Than You Think: How technology is changing our minds for the better
(Clive Thompson, 2014)
13 reasons the web makes us smarter
David Weinberger, Huffington Post (2014)
Excerpts: How the internet makes us smarter

Video
Watch the assigned video on Zaption and post a one-sentence response.
Marshall Davis Jones: Touchscreen (3:12)
http://zapt.io/trdtmge6
Select an article or video from the lesson page links and Tweet its gist or guiding question.
Please include our class handle in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
How do you find cool things online?

https://canvas.instructure.com/courses/936923/discussion_topics/4017327

Interview: Clive Thompson's Smarter Than You Think


Michael Agger, The New Yorker (2013)

You've written a book about how technology is changing our minds for the better. Have
readers been agreeing or disagreeing with you?

I've had a lot of positive feedback to my discussion of "ambient awareness," which is the
deep, rich, intellectual, and social connections we develop with each other via short-form
status updates. Most people have been trained, via a parade of gloomy op-eds in their
newspapers, to think of their online utterances as mere narcissism; that there could be no
conceivable value in tweeting or using Instagram or using Facebook, apart from a sort of
constant shilling of the self. So when I point out the interesting social science that underpins
some of the pleasures and values of persistent connection to each other, I've found that
people are really excited about that. It resonates.

What do you see as the social good of Twitter?

For many of the more avid users, it provides a lot of new, useful things to think about:
serendipitous stories, insights from others. It's a sort of global watercooler, with all the good
and bad that suggests. It's good because many folks feel like they're immersed in an
interesting conversation that's going on, and even if they're just lurking, not actually talking
(studies show the majority of folks using Twitter are listening but not contributing that often),
they get exposed to all sorts of material they'd never see otherwise. And bad, because, well,
you can really get swept up in it and distracted from work that you're supposed to be doing.

How do you think Twitter is going to change once it goes public?

The one complaint about the Internet that I wholeheartedly endorse is that most of these tools
have been designed to peck at us like ducks: "Hey, there's a new reply to your comment!
Come look at it!" And if you don't develop good skills of mindfulness, paying attention to your
attention, it can really wind up colonizing much of your day. This is precisely why I suspect
Twitter going public will be bad for its users. The whole reason these services need to peck at
us like ducks is that their business models are built on advertising, and advertising wants as
many minutes of your day as possible. As the pressure builds for Twitter to make more money,
it will face pressure to redesign itself to interrupt us even more.

What do you think of the idea that the Internet, as a virtual space where we can trade ideas,
art projects, and videos, and become enormously popular doing so, has replaced cities as the
centers of creative cultural ferment?

I don't think the Internet has replaced cities in any significant way, nor really could it. Cities
are dynamic, and deeply seductive for the people who flock there, because they broker all
sorts of fantastic and useful connections, cultural and economic and social. And the types of
face-to-face connections and serendipity that you get in a city are quite different from the
ones you get online. That said, there are deep similarities in the things we enjoy about cities
and the things we enjoy about the Internet! In both cases, the density of connections is what
brings the real intellectual fun and joy. Edward Glaeser famously argued that cities increase
the productivity and creativity of people within them. I suspect the Internet has a very similar
effect for folks who use it mindfully.

Nicholas Carr has written about how book-based learning taught us certain habits of mind, a
more empathetic way of thinking that we are rapidly losing with screens and screen-reading.
Do you agree?

I quite agree with Carr that tools affect how we think, and considered as a tool, books have
many absolutely fantastic and magical effects on the way we think. They encourage us to slow
down, which is good; they synthesize large volumes of knowledge. But what Carr sells short
are the enormous benefits that come from social thinking, and social thinking is where the
Internet really shines. There's an idea, popular with many text-based folks, like myself and
many journalists and academics, that reading books is thinking; that if you're not sitting for
hours reading a tome, you're not, in some essential way, thinking. This is completely false. A
huge amount of our everyday thinking, powerful, creative, and resonant stuff, is done
socially: talking to other people, arguing with them, relying on them to recall information for
us. This has been true for aeons in the offline world. But now we have new ways to think
socially online, and to do so with like-minded folks around the world, which is still insanely
mind-blowing. It never stops being lovely for me. I was in a radio station the other day, and
while I was waiting to go on the air I watched the staff work. There were six or seven of them,
and they were all engaged in this incredibly complex activity that's behind the scenes of the
show: they're talking about the next segment, writing down ideas, looking things up,
organizing the next batch of things the host is going to talk about. This is what thinking looks
like in the real world. A lot of it is incredibly, deeply social. And it has the effect of making the
host put on this much smarter, richer show than he or she could do alone.

When people get into discussions and arguments online, whether it's on Twitter or in a forum
about their favorite TV show or even in a thread underneath an Instagram photo, this is the
same thing transpiring. In the Phaedrus, Socrates worried that this dialogic nature of
knowledge would die out with text, because text was inert: you asked it a question, and it
couldn't answer back. What I love about the online world is that it's pitched neatly between
those two poles.

Everyone is staring at their phones all the time. Are we a generation that has been
overwhelmed by a device we can't handle?

No, I don't think we're doomed to be overwhelmed. We have a long track record of adapting to
the challenges of new technologies and new media, and of figuring out self-control. Cities,
considered as a sort of technology in themselves, were enormously overstimulating and
baffling for new residents when the West began urbanizing, in the nineteenth century.

More recently, we tamed our addiction to talking incessantly on mobile phones. People forget
this, but when mobile phones came along, in the nineties, people were so captivated by the
idea that you could talk to someone else, anywhere, on the sidewalk, on a mountaintop,
that they answered them every single time they rang. It took ten years, and a ton of quite
useful scrutiny, and mockery of our own poor behavior, to pull back.

What are the downsides of ambient contact? Besides knowing way too much about sports
scores and who has eaten a cronut?

I think the big downside of today's ambient contact is that it makes us too present-focussed.
Psychologists talk about something called "recency": our tendency to assume that whatever
is happening to us right now is the most important thing going on. It's a long-standing bias in
our psychology, long predating the Internet. But modern media have made it worse. By
"modern" I'm beginning with, probably, the telegraph, and certainly the newspaper. When you
read the novels of the late nineteenth century and the early twentieth century, they're already
complaining about people being far too fascinated with the events of the day instead of
paying attention to history.

And this got seismically worse once cable TV realized that you could keep everyone riveted to
their seat with live coverage of basically anything. Today's self-publishing tools, almost from
the get-go, were designed to privilege the present and ignore the past. When blogs first
became popular, they were all organized in reverse chronology, with the most recent post at
the top, the older ones fading into the background, and the clear implication of that design is
that what's written today is more important than what was written last week or last year. That
design has carried over into basically every tool of social media. And, again, because most of
the big social-media tools are paid for by advertising, they have even more economic impetus
to reinforce recency in their design. They want us to be constantly refreshing the feed over
and over again, because that'll give them more eyeballs to which to sell ads. What this
suggests, though, is that one could design all sorts of quite delightful tools for expression and
contact that didn't prize recency. If you founded a social network that charged a minimal
amount of money, for example, you wouldn't need ads at all, and suddenly the economic need
to reinforce recency is gone. Facebook only makes five dollars a year off of each user. That's
actually an amazingly piddling amount, when you think about it.

Digital tools have a mind of their own: yours


Clive Thompson, reprinted from Smarter Than You Think (boingboing.net, 2014)
What are the central biases of today's digital tools? There are many, but I see three big ones that
have a huge impact on our cognition.
First, they allow for prodigious external memory: smartphones, hard drives, cameras, and
sensors routinely record more information than any tool before them. We're shifting from a
stance of rarely recording our ideas and the events of our lives to doing it habitually.
Second, today's tools make it easier for us to find connections (between ideas, pictures,
people, bits of news) that were previously invisible.
Third, they encourage a superfluity of communication and publishing.
This last feature has many surprising effects that are often ill understood. Any economist can tell
you that when you suddenly increase the availability of a resource, people do more things with
it, which also means they do increasingly unpredictable things. As electricity became cheap and
ubiquitous in the West, its role expanded from things you'd expect, like nighttime lighting, to
the unexpected and seemingly trivial: battery-driven toy trains, electric blenders, vibrators. The
superfluity of communication today has produced everything from a rise in crowd-organized
projects like Wikipedia to curious new forms of expression: television-show recaps, map-based
storytelling, discussion threads that spin out of a photo posted to a smartphone app, Amazon
product-review threads wittily hijacked for political satire. Now, none of these three digital biases
is immutable, because they're the product of software and hardware, and can easily be altered
or ended if the architects of today's tools (often corporate and governmental) decide to regulate
the tools or find they're not profitable enough.
But right now, these big effects dominate our current and near-term landscape. In one sense,
these three shifts (infinite memory, dot connecting, explosive publishing) are screamingly
obvious to anyone who's ever used a computer. Yet they also somehow constantly surprise us by
producing ever-new "tools for thought" (to use the writer Howard Rheingold's lovely phrase) that
upend our mental habits in ways we never expected and often don't apprehend even as they
take hold. Indeed, these phenomena have already woven themselves so deeply into the lives of
people around the globe that it's difficult to stand back and take account of how much things
have changed and why. Though this can be mapped out as the future of thought, it's also frankly
rooted in the present, because many parts of our future have already arrived, even if they are
only dimly understood. As the sci-fi author William Gibson famously quipped: "The future is
already here; it's just not very evenly distributed."

Benefits of reading on the web


Mark Bauerlein, The Dumbest Generation (2008)

Reading on the web:


permits nonlinear strategies of thinking
allows nonhierarchical strategies
offers nonsequential strategies
requires visual literacy skills to understand multimedia components
is interactive, with the reader able to add, change, or move text
enables a blurring of the relationship between reader and writer
New technologies = new aptitudes?
Mark Bauerlein, The Dumbest Generation (2008)
New technologies induce new aptitudes, and bundled together in the bedroom they push
consciousness to diversify its attention and multiply its communications. Young Americans
join virtual communities, cultivating interests and voicing opinions. Video games quicken
their spatial intelligence. Group endeavors such as Wikipedia and reality gaming nurture
collaborative problem-solving skills. Individuals who've grown up surrounded by technology
develop different hard-wiring, their minds adapted to information and entertainment
practices and speeds that minds maturing in pre-digital habitats can barely comprehend,
much less assimilate. Screen time is cerebral, and it generates a breakthrough intelligence.
E-literacy isn't just knowing how to download music, program an iPod, create a virtual profile,
and comment on a blog. It's a general deployment capacity, a particular mental flexibility.
E-literacy accommodates hypermedia because e-literates possess hyperalertness. Multitasking
entails a special cognitive attitude toward the world, not the orientation that enables slow
concentration on one thing (a sonnet, a theorem) but a lightsome, itinerant awareness of
numerous and dissimilar inputs. In a white paper entitled "Education for the 21st Century,"
MIT professor Henry Jenkins sketches the new media literacies in precisely such big, brainy
terms: "distributed cognition," the ability to interact meaningfully with tools that expand
mental capacities (search engines, etc.); "collective intelligence," the ability to pool
knowledge and compare notes with others toward a common goal; "transmedia navigation,"
the ability to follow the flow of stories and information across multiple modalities.
Net-Gen Skills
Sam Anderson, New York Magazine (2009)
Digital Natives are able to switch between attentional targets in a way that's been considered
impossible. As we become more skilled at the 21st-century task Meyer calls "flitting," the
wiring of the brain will inevitably change to deal more efficiently with more information. The
neuroscientist Gary Small speculates that the human brain might be changing faster today
than it has since the prehistoric discovery of tools. Research suggests we're already picking up
new skills: better peripheral vision, the ability to sift information rapidly. Kids growing up now
might have an associative genius we don't: a sense of the way ten projects all dovetail into
something totally new. They might be able to engage in seeming contradictions: mindful
web-surfing, mindful Twittering.
Shifts that support digital learning
Will Richardson, Personal Learning Networks (2011)

Shifts that are supporting connection and network building:


Analog to digital
Tethered to mobile
Isolated to connected
Generic to personal
Consumption to creation
Closed systems to open systems

Learning today is not about memorizing; it is about finding and synthesizing


Will Richardson, Personal Learning Networks (2011)
Learning today is not about memorizing facts. An integral part of the learning process is to be
able to find and synthesize the most current information and recognize connections between
ideas that may be found in many different places from many different people. Since learning is
an ongoing process and no longer an event, our ability to expand our knowledge is more
important than we currently realize (Siemens, 2007). Young people today use the web to
connect in interest-based ways, which, as the name suggests, is all about their passions to learn;
whether it's fixing up that '78 Camaro, finding ways to clean up the environment, or learning
how to build an awesome new skateboard, kids are beginning to engage in these networked
online spaces on their own. As the study's lead author, Mimi Ito, writes, "Kids learn on the
Internet in a self-directed way, by looking around for information they are interested in, or
connecting with others who can help them. This is a big departure from how they are asked to
learn in most schools" (Ito et al., 2008).
Learning today is about creation, not consumption
Will Richardson, Personal Learning Networks (2011)
Learning in the new world is all about creation, not consumption. Certainly, we continue to spend
a large amount of our learning time reading, thinking, and synthesizing ideas. Now, however, we
don't just consume those ideas: we share them. As Clay Shirky suggests in Cognitive Surplus
(2010), we are in the process of taking the roughly two hundred billion collective hours per year
we spend in front of the television set (in the U.S. alone) and turning them into creative acts,
some more foolish and inane than others, but creative nonetheless.
Do we need to be distracted?
Cathy Davidson, Now You See It (2011)
The new brain science helps us to re-ask the old questions about attention in new ways. What are
we learning by being open to multitasking? What new muscles are we exercising? What new neural
pathways are we shaping, what old ones are we shearing, what new unexpected patterns are we
creating? And how can we help one another out by collaborative multitasking? We know that in
dreams, as in virtual worlds and digital spaces, physical and even traditional linear narrative rules do
not apply. It is possible that, during boundless wandering thinking, we open ourselves to possibilities
for innovative solutions that, in more focused thinking, we might prematurely preclude as unrealistic.
The Latin word for inspiration is inspirare, to inflame or breathe into. What if we thought of new
digital ways of thinking not as multitasking but as multi-inspiring, as potentially creative disruption of
usual thought patterns? Look at the account of just about any enormous intellectual breakthrough
and you'll find that some seemingly random connection, some associational side thought, some
distraction preceded the revelation. Distraction, we may discover, is as central to innovation as, say,
an apple falling on Newton's head.
The Wisdom of Crowds
Jesse Rice, The Church of Facebook: How the Hyperconnected are Redefining Reality (2009)
In his fascinating book The Wisdom of Crowds, James Surowiecki makes a strong case for the
benefits of access to a large decision-informing audience. Surowiecki argues that individuals
tend to make better decisions when informed by a crowd than when they make decisions in
isolation, even when they are experts on the matter. Our network of Facebook friends makes for
a natural crowd that is easily accessed for use in weighing decisions, and their input may often
prove helpful.

13 reasons the web makes us smarter


David Weinberger, Huffington Post (2014)
Let's get the obvious out of the way
* You can look up just about anything you want.
* Sites like WolframAlpha compute answers to complex questions.
* You can find pages, people, and communities that let you dive deep into just about any topic.
* You can make everything you know, think, or invent available to the world for free.
Not bad for starters.
Web-form thought
* Knowledge takes on a different shape when its medium is hyperlinked.
* Books have favored long-form, sequential chains of thought that lead readers to the author's
conclusion. That's one useful way of thinking, but it reflects the limitations of paper. The
author has to try to keep us on the bus rather than letting us explore more widely, because
paper knowledge is hard to traverse. The author has to anticipate objections, rather than
entering into real-time conversation with readers, because paper knowledge is only made
public once it's done.
* And it has given us the overly simplistic idea that a world as complex and chaotic as ours
ultimately reduces to long, knowable sequences of logic.
* Networked knowledge instead lives in webs of hyperlinked ideas -- some of which may indeed
be long-form arguments -- that explain, argue, differentiate, and extend ideas. There is more
value in these webs of knowledge than there is in the individual expressions... although it's
also true that these webs inevitably include dumb and venal misunderstandings.
We're getting more comfortable with unsettled differences
* Knowledge's new medium literally consists of linked differences: people saying different things,
held together by hyperlinks. This runs against our traditional idea that knowledge consists of
that about which there is no longer disagreement.
* We're learning how to deal with the inevitability of disagreement. For example, threaded
conversations can fork when the discussion goes off topic or the argument goes on past the
patience of most of the members.
Experts are becoming networked
* Increasingly, businesses are realizing that there is more truth and value in being plugged into a
network of experts who disagree than in relying upon one single big-brained person who is
paid to deliver The Truth in a fat, expensive report.
An ethos of sharing is enabling incredibly rapid learning
* Quite possibly the fastest-learning profession these days is software engineering. Have a highly
specific question about why your code isn't working, or how you can do something that
your programming language seems not to support? Search engines will bring you immediately
to sites where questions are answered and code is shared. Need a tutorial or an advanced
course? No problem.
* Someday we'll all learn the way software developers do.
Clouds of data are letting us study otherwise impossible subjects
* In field after field, the Internet is enabling us to assemble and access unthinkably large clouds
of data. This is a game changer. It enables us to explore with rigor phenomena previously
beyond the scope of science. To take one example: human social behavior.
* Along with Big Data comes another change: when that data is expressed in the Linked Data
format -- as proposed by Sir Tim Berners-Lee -- facts change from bricks to links that can be
hooked together across disciplines.
* A messy Web holds more information than a well-ordered one.

We are inventing new ways to engage with knowledge


* We're inventing new ways to investigate and research.
* For just one example, take a look at what's going on at Reddit.com. Sure, the site often seems
like a boys' club, and people have put up some truly vile pages. But this online discussion
forum has invented (or at least named and codified) a really interesting form of community
journalism.
* Called "IAMA," it lets everyone and anyone ask a question of someone in the news, an
entertainer, someone who has an interesting job, a different point of view, or an unusual
experience. The questions are blunt, and the community spins out threads in response to
every answer. It's not brand new, and it's got problems. But at its best, it can be more
informative than the work done by professionals.
Knowledge is weaving back into our humanity
* While you're at Reddit, you'll notice that the threads that get spun in response to answers often
are silly, and surprisingly frequently hilarious. That's how knowledge sounds when it's put
back into the human context.
* In fact, it's hard to prevent knowledge from getting humanized on the Net, because the
Internet is one of the few media in our history that we use equally well for information,
communication, and sociality. You post a simple bit of information and it's likely to turn into
a conversation, out of which comes a social bond. You can't keep these apart, even when it
comes to knowledge.
Knowing is itself becoming more transparent
* We get to see how our culture is absorbing knowledge, like watching dye work its way through
a circulatory system. And that itself is useful knowledge.
An interest-driven world
* If Nicholas Carr was right to title his book about the Internet The Shallows, then we should call
the old media "The Narrows."
* The old media, even with the best of intentions, could only show us a tiny bit of the world, so it
became expert at showing us what it thought we would find interesting.
* The Web, on the other hand, is a genuine expression of our interests. We built it by linking, one
hyperlink at a time, to what actually matters to us. For better and for worse.
* And because of this everyone on the Net learns a crucial fact: the world is much more
interesting than anyone ever told us.
* That's the fact where knowledge begins. That's why the pursuit of knowledge continues. That's
why, for all of the dangers, those who care about knowledge are so excited about this Net
we've built together.

In-class resources for Thu Oct 1 (class 4)


Is technology making us smarter?
Digerati video and reading
(discussion leader: x)
Digerati
Clive Thompson
@pomeranian99
BHSEC Q guest author (Spring 2012)
http://smarterthanyouthink.net/bio/
Video
We will watch and discuss this in class.
Clive Thompson on CBS Morning (3:45)
http://www.cbsnews.com/videos/is-the-internet-making-us-smarter/
Reading
We will read and discuss this in class.
Clive Thompson interviewed by Michael Agger about Smarter Than You Think (New Yorker, 2013)
(in the class 4 assigned reading)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
How technology makes us smarter
How might we use technology to enhance our bodies?
Clive Thompson

Sunday Story #3
Due Sun Oct 4 (before midnight)
Prompt

medium notes-carr article

Homework for Tue Oct 6 (class 5)


Is technology making us dumber?
Reading
Consider and mark up the assigned reading with pen and/or highlighter.
Is Google making us stupid?
(Nicholas Carr, The Atlantic, 2008)
The Myth of Multitasking
(Christine Rosen, The New Atlantis, 2008)
Excerpts: How technology makes us dumber
Video
Watch the assigned video on Zaption and post a one-sentence response.
Aaron Swartz: The Internet's Own Boy (8:18)
http://zapt.io/tzp8r4d4
Select an article or video from the lesson page links and Tweet its gist or guiding question.
Please include this handle in your Tweet: @bhsecinternet

Is Google Making Us Stupid?

theatlantic.com

Nicholas Carr (2008)


"Dave, stop. Stop, will you? Stop, Dave.
Will you stop, Dave? So the
supercomputer HAL pleads with the
implacable astronaut Dave Bowman in a
famous and weirdly poignant scene toward
the end of Stanley Kubricks 2001:ASpace
Odyssey. Bowman, having nearly been
sent to a deep space death by the
malfunctioning machine, is calmly, coldly
disconnecting the memory circuits that
control its artificial brain. Dave, my mind
is going, HAL says, forlornly. I can feel it. I
can feel it.
I can feel it, too. Over the past few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going, so far as I can tell, but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what's going on. For more than a decade now, I've been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I've got the telltale fact or pithy quote I was after. Even when I'm not working, I'm as likely as not to be foraging in the Web's info thickets, reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they're sometimes likened, hyperlinks don't merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they've been widely described and duly applauded. "The perfect recall of silicon memory," Wired's Clive Thompson has written, "can be an enormous boon to thinking." But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I'm not the only one. When I mention my troubles with reading to friends and acquaintances (literary types, most of them), many say they're having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. "I was a lit major in college, and used to be [a] voracious book reader," he wrote. "What happened?" He speculates on the answer: "What if I do all my reading on the web not so much because the way I read has changed, i.e. I'm just seeking convenience, but because the way I THINK has changed?"
Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. "I now have almost totally lost the ability to read and absorb a longish article on the web or in print," he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a "staccato" quality, reflecting the way he quickly scans short passages of text from many sources online. "I can't read War and Peace anymore," he admitted. "I've lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it."
Anecdotes alone don't prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited "a form of skimming activity," hopping from one source to another and rarely returning to any source they'd already visited. They typically read no more than one or two pages of an article or book before they would "bounce" out to another site. Sometimes they'd save a long article, but there's no evidence that they ever went back and actually read it. The authors of the study report:
It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of "reading" are emerging as users "power browse" horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it's a different kind of reading, and behind it lies a different kind of thinking, perhaps even a new sense of the self. "We are not only what we read," says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. "We are how we read." Wolf worries that the style of reading promoted by the Net, a style that puts "efficiency" and "immediacy" above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become "mere decoders of information." Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It's not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.
Sometime in 1882, Friedrich Nietzsche bought a typewriter, a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.
But the machine had a subtler effect on his work. One of Nietzsche's friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. "Perhaps you will through this instrument even take to a new idiom," the friend wrote in a letter, noting that, in his own work, his "'thoughts' in music and language often depend on the quality of pen and paper."
"You are right," Nietzsche replied, "our writing equipment takes part in the forming of our thoughts." Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche's prose "changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style."
The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that's not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind "is very plastic." Nerve cells routinely break old connections and form new ones. "The brain," according to Olds, "has the ability to reprogram itself on the fly, altering the way it functions."
As we use what the sociologist Daniel Bell has called our "intellectual technologies," the tools that extend our mental rather than our physical capacities, we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock "disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences." The "abstract framework of divided time" became "the point of reference for both action and thought."
The clock's methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments "remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality." In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating "like clockwork." Today, in the age of software, we have come to think of them as operating "like computers." But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain's plasticity, the adaptation occurs also at a biological level.
The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that's what we're seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It's becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV. When the Net absorbs a medium, that medium is re-created in the Net's image. It injects the medium's content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we're glancing over the latest headlines at a newspaper's site. The result is to scatter our attention and diffuse our concentration.
The Net's influence doesn't end at the edges of a computer screen, either. As people's minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience's new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the "shortcuts" would give harried "readers" a quick "taste" of the day's news, sparing them the "less efficient" method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Never has a communications system played so many roles in our lives, or exerted such broad influence over our thoughts, as the Internet does today. Yet, for all that's been written about the Net, there's been little consideration of how, exactly, it's reprogramming us. The Net's intellectual ethic remains obscure.
About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant's machinists. With the approval of Midvale's owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions (an "algorithm," we might say today) for how each worker should work. Midvale's employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory's productivity soared.
More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor's tight industrial choreography, his "system," as he liked to call it, was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the "one best method" of work and thereby to effect "the gradual substitution of science for rule of thumb throughout the mechanic arts." Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. "In the past the man has been first," he declared; "in the future the system must be first."
Taylor's system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor's ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the "one best method," the perfect algorithm, to carry out every mental movement of what we've come to describe as "knowledge work."

Google's headquarters, in Mountain View, California (the Googleplex) is the Internet's high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is "a company that's founded around the science of measurement," and it is striving to "systematize everything" it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.
The company has declared that its mission is "to organize the world's information and make it universally accessible and useful." It seeks to develop "the perfect search engine," which it defines as something that "understands exactly what you mean and gives you back exactly what you want." In Google's view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can "access" and the faster we can extract their gist, the more productive we become as thinkers.
Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. "The ultimate search engine is something as smart as people, or smarter," Page said in a speech a few years back. "For us, working on search is a way to work on artificial intelligence." In a 2004 interview with Newsweek, Brin said, "Certainly if you had all the world's information directly attached to your brain, or an artificial brain that was smarter than your brain, you'd be better off." Last year, Page told a convention of scientists that Google is "really trying to build artificial intelligence and to do it on a large scale."
Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt's words, "to solve problems that have never been solved before," and artificial intelligence is the hardest problem out there. Why wouldn't Brin and Page want to be the ones to crack it?
Still, their easy assumption that we'd all "be better off" if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google's world, the world we enter when we go online, there's little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network's reigning business model as well. The faster we surf across the Web (the more links we click and pages we view) the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link; the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It's in their economic interest to drive us to distraction.
Maybe I'm just a worrywart. Just as there's a tendency to glorify technological progress, there's a countertendency to expect the worst of every new tool or machine. In Plato's Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue's characters, "cease to exercise their memory and become forgetful." And because they would be able to "receive a quantity of information without proper instruction," they would "be thought very knowledgeable when they are for the most part quite ignorant." They would be "filled with the conceit of wisdom instead of real wisdom." Socrates wasn't wrong (the new technology did often have the effects he feared) but he was shortsighted. He couldn't foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
The arrival of Gutenberg's printing press, in the 15th century, set off another round of teeth-gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men "less studious" and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, "Most of the arguments made against the printing press were correct, even prescient." But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.
So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn't the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
If we lose those quiet spaces, or fill them up with "content," we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what's at stake:
I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and "cathedral-like" structure of the highly educated and articulate personality: a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self, evolving under the pressure of information overload and the technology of the "instantly available."
As we are drained of our "inner repertory of dense cultural inheritance," Foreman concluded, we risk turning into "pancake people," spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.
I'm haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer's emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut ("I can feel it. I can feel it. I'm afraid"), and its final reversion to what can only be called a state of innocence. HAL's outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they're following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That's the essence of Kubrick's dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

The Struggle for Attentional Self-Control


Sam Anderson, New York Magazine (2009)
It all comes down to the problem of jackhammers. A few minutes before I called, she tells me, a construction crew started jackhammering outside her apartment window. The noise immediately captured what's called her "bottom-up" attention, the broad involuntary awareness that roams the world constantly looking for danger and rewards: shiny objects, sudden movements, pungent smells. Instead of letting this distract her, however, she made a conscious choice to go into the next room and summon her "top-down" attention, the narrow, voluntary focus that allows us to isolate and enhance some little slice of the world while ruthlessly suppressing everything else. This attentional self-control, which psychologists call "executive function," is at the very center of our struggle with attention. It's what allows us to invest our focus wisely or poorly. Some of us, of course, have an easier time with it than others.
Continuous partial attention
Sam Anderson, New York Magazine (2009)
The tech theorist Linda Stone famously coined the phrase "continuous partial attention" to describe our newly frazzled state of mind. American office workers don't stick with any single task for more than a few minutes at a time; if left uninterrupted, they will most likely interrupt themselves. Since every interruption costs around 25 minutes of productivity, we spend nearly a third of our day recovering from them. We keep an average of eight windows open on our computer screens at one time and skip between them every twenty seconds. When we read online, we hardly even read at all: our eyes run down the page in an "F" pattern, scanning for keywords.
The Myth of Multitasking
Sam Anderson, New York Magazine (2009)
Over the last twenty years, Meyer and a host of other researchers have proved again and again that multitasking, at least as our culture has come to know and love and institutionalize it, is a myth. When you think you're doing two things at once, you're almost always just switching rapidly between them, leaking a little mental efficiency with every switch. Meyer says that this is because the brain processes different kinds of information on a variety of separate "channels" (a language channel, a visual channel, an auditory channel, and so on) each of which can process only one stream of information at a time. If you overburden a channel, the brain becomes inefficient and mistake-prone. The classic example is driving while talking on a cell phone, two tasks that conflict across a range of obvious channels: steering and dialing are both manual tasks, looking out the windshield and reading a phone screen are both visual, etc.
Information overload => negative effects on decision making?
John Palfrey, Born Digital (2008)
Life experience suggests that more information increases the overall quality of decisions. If a
decision-maker gets too little information, he or she can't see the full picture and runs the risk of
making a decision without having taken important information into account. But the positive
correlation between the amount of information and the quality of decision-making has
limitations. At some point, additional information cannot be processed and integrated. In fact,
the extra information may result in information overload, with consequences that include
confusion, frustration, panic, or even paralysis. As behavioral economics teaches, the more the
options, the greater the chance that a person will make no decision at all, as studies have shown
in many different contexts.
Constant Stimulation
Matt Richtel, The New York Times (2010)
At the University of California, San Francisco, scientists have found that when rats have a new experience, like exploring an unfamiliar area, their brains show new patterns of activity. But only when the rats take a break from their exploration do they process those patterns in a way that seems to create a persistent memory. In that vein, recent imaging studies of people have found that major cross sections of the brain become surprisingly active during downtime. These brain studies suggest to researchers that periods of rest are critical in allowing the brain to synthesize information, make connections between ideas and even develop the sense of self. Researchers say these studies have particular implications for young people, whose brains have more trouble focusing and setting priorities. "Downtime is to the brain what sleep is to the body," said Dr. Rich of Harvard Medical School. "But kids are in a constant mode of stimulation."
White moths
David Shenk, Data Smog (1997)
Physically, we are what we are. So while we like to think of humans as adaptable creatures, the plain truth is that because of our complexity and longevity, we aren't nearly as quick to physically adapt as are many other species. In the nineteenth century, the thick smoke from factories in England annihilated white lichens covering tree bark, rendering the previously well-camouflaged white Peppered Moth extremely vulnerable to bird predators. But in just a few years, the previously rare black moths from the same species became dominant, and the Peppered Moth was saved. In recent years, as factory soot has waned in that area and the tree bark has subsequently become light again, evolution has pulled a quick about-face: the moths, too, are white again. Human evolution, for better or worse, is not so swift; because of this, we may not be able to keep pace with our own technology. Our brains have remained structurally consistent for over 50,000 years, yet exposure to processed information in this century has increased by a factor of thousands (lately, the volume and speed of information has been increasing as much as 100 percent each year). Something has to give.
Automation is making us dumb
Nicholas Carr, WSJ (2014)
The first wave of automation rolled through U.S. industry after World War II, when manufacturers
began installing electronically controlled equipment in their plants. The new machines made
factories more efficient and companies more profitable. They were also heralded as
emancipators. By relieving factory hands of routine chores, they would do more than boost
productivity. They would elevate laborers, giving them more invigorating jobs and more valuable
talents. The new technology would be ennobling. Then, in the 1950s, a Harvard Business School
professor named James Bright went into the field to study automation's actual effects on a
variety of industries, from heavy manufacturing to oil refining to bread baking. Factory
conditions, he discovered, were anything but uplifting. More often than not, the new machines
were leaving workers with drabber, less demanding jobs. An automated milling machine, for
example, didn't transform the metalworker into a more creative artisan; it turned him into a
pusher of buttons. Bright concluded that the overriding effect of automation was (in the jargon of
labor economists) to de-skill workers rather than to up-skill them. "The lesson should be
increasingly clear," he wrote in 1966. "Highly complex equipment did not require skilled
operators. The skill can be built into the machine." We are learning that lesson again today on a
much broader scale. As software has become capable of analysis and decision-making,
automation has leapt out of the factory and into the white-collar world. Computers are taking
over the kinds of knowledge work long considered the preserve of well-educated, well-trained
professionals: Pilots rely on computers to fly planes; doctors consult them in diagnosing
ailments; architects use them to design buildings. Worrisome evidence suggests that our own
intelligence is withering as we become more dependent on the artificial variety. Rather than
lifting us up, smart software seems to be dumbing us down. Automation traps people in a vicious
cycle of de-skilling. By isolating them from hard work, it dulls their skills and increases the odds
that they will make mistakes. When those mistakes happen, designers respond by seeking to
further restrict people's responsibilities, spurring a new round of de-skilling.
Is Google making students stupid?
Nick Romeo, The Atlantic (2014)
When do technologies free students to think about more interesting and complex questions, and
when do they erode the very cognitive capacities they are meant to enhance? The effect of
ubiquitous spell check and AutoCorrect software is a revealing example. Psychologists studying
the formation of memories have found that the act of generating a word in your mind
strengthens your capacity to remember it. When a computer automatically corrects a spelling
mistake or offers a drop-down menu of options, we're no longer forced to generate the correct
spelling in our minds. This might not seem very important. If writers don't clutter their minds
with often-bizarre English spelling conventions, this might give them more energy to consider
interesting questions of style and structure. But the process of word generation is not just
supplementing spelling skills; it's also eroding them. When students find themselves without
automated spelling assistance, they're more likely to make errors. The solution might seem to be
improving battery life and making spelling assistance even more omnipresent, but this creates a
vicious cycle: The more we use the technology, the more we need to use it in all circumstances.
Suddenly, our position as masters of technology starts to seem more precarious.
Is smart making us dumb?
Evgeny Morozov, WSJ (2013)

Many smart technologies are heading in another, more disturbing direction. A number of thinkers
in Silicon Valley see these technologies as a way not just to give consumers new products that
they want but to push them to behave better. Sometimes this will be a nudge; sometimes it will
be a shove. But the central idea is clear: social engineering disguised as product engineering.
There is reason to worry about this approaching revolution. As smart technologies become more
intrusive, they risk undermining our autonomy by suppressing behaviors that someone
somewhere has deemed undesirable. Smart forks inform us that we are eating too fast. Smart
toothbrushes urge us to spend more time brushing our teeth. Smart sensors in our cars can tell if
we drive too fast or brake too suddenly. These devices can give us useful feedback, but they can
also share everything they know about our habits with institutions whose interests are not
identical with our own. Insurance companies already offer significant discounts to drivers who
agree to install smart sensors in order to monitor their driving habits. How long will it be before
customers can't get auto insurance without surrendering to such surveillance? And how long will
it be before the self-tracking of our health (weight, diet, steps taken in a day) graduates from
being a recreational novelty to a virtual requirement?

More effects of information and stimulus overload


David Shenk, Data Smog (1997)

No matter how creatively we name it, however, the effects of information overload do not add up
to one single debilitating syndrome that we can easily highlight, recoil in horror from, and muster
a simple defense against. A careful review of thirty years of psychological research reveals a
wide variety of effects from information and stimulus overload.
Increased cardiovascular stress. Blood pressure rises, leading to strain on the heart and
other organs. (Ettema)
Weakened vision. Researchers in Japan have documented an alarming decline in visual acuity
as a result of increased exposure to screens. Based on recent trends, a prediction is made that at
some point in the not-too-distant future, virtually everyone in Japan will be nearsighted.
(Ishikawa)
Confusion. "consumers [are] unable to effectively and efficiently process the information"
(Malhotra, Journal of Consumer Research)
Frustration. "loud speech [and other background noise] lowers frustration tolerance and
cognitive complexity" (Rotton et al., Journal of Applied Psychology)
Impaired judgment. "as information load increases, integrated decision making first
increases, reaches an optimum, and then decreases" (Streufert et al., Journal of Experimental
Social Psychology)
Decreased benevolence. "a person's response to someone needing assistance decreases in
likelihood as his environment increases its input bombardment" (Korte et al., Journal of
Personality and Social Psychology)

Overconfidence. "as people were given more information, confidence in their judgments
increased, but accuracy did not" (Stewart et al., Organizational Behavior and Human Decision
Processes)
Deteriorating eyesight
John Freeman, The Tyranny of E-Mail (2009)
Before electric light, reading meant sitting by a window or in a room open to sunbeams, or near
a candle after dark. Read outside on a park bench in decent weather, and you will realize how
natural this feels. The eye is designed for this kind of light, and our chemical response to it
regulates our sleep and our moods, gives our days a natural rhythm. Electric light did not change
this equation fundamentally. A bank employee might have to read ledgers under a harsher light,
a reporter might sit and type a story before a single bulb, but the light they worked by was still
reflected, the light glancing down onto the page and bouncing back up into their eyes, at which
point the mind can begin to process what's on the page. The computer screen, however, is an
entirely new reading experience. Rather than bouncing down off a surface, light is shot directly
into our eyes. It is beamed right into our pupils, and our eyeballs get drier and drier as our blink
rate decreases. In the days when computers were used for just word processing, this was not an
overwhelming burden. Back then we still read the news and memos and mail in print, by
reflected light. All day long, light is being beamed into our eyes. Not surprisingly, this
accelerating change in how we read has enormous physical and behavioral consequences.
Eyesight has deteriorated with the ages, but it has taken a large leap back during the
computer age due to the fact that people spend big chunks of their day focusing on a
screen that is two feet in front of their faces. There are even nearsightedness epidemics
among children. In Singapore, for instance, 80 percent of children are myopic, up from 25
percent just 30 years ago.
No Phones in Bed
James Hamblin, The Atlantic (2014)
Charles Czeisler, a professor of sleep medicine at Harvard Medical School, found that around 90
percent of Americans use some kind of electronic device within the hour before bed, and
correlated the degree of use with ever-poorer sleep. One of his first theories of the case was
overstimulation. That's because, Czeisler and colleagues wrote in the Journal of Clinical Sleep
Medicine in December of 2013, "In addition to making phone calls, cell phones now allow the
user to instant message, listen to music, send emails, play games, and surf the Internet." So
they do. And all of that stimulation, the researchers proposed, may "impede the natural
withdrawal of sympathetic nervous system activity necessary for sleep onset."
Losing the ability to write without a computer
Steven Johnson, Interface Culture: How New Technology Transforms the Way We Create (1999)
I can't imagine writing without a computer. Even jotting down a note with pen and paper feels
strained: I have to think about writing, think about it consciously as my hand scratches out the
words on the page, think about the act itself. There is none of the easy flow of the word
processor, just a kind of drudgery, running against the thick grain of habit. Pen and paper feel
profoundly different to me now; they have the air of an inferior technology about them, the sort
of contraption well suited for jotting down a phone number, but not much beyond that. Writing
an entire book by hand strikes me as being a little like filming Citizen Kane with a camcorder. You
can make a go at it, of course, but on some fundamental level you've misjudged the appropriate
scale of the technology you're using. It sounds appalling, I know, but there it is. I'm a typer, not a
writer. Even my handwriting is disintegrating, becoming less and less my handwriting, and more
the erratic, anonymous scrawl of someone learning to write for the first time. I accept this
condition gladly, and at the same time I can recall the predigital years of my childhood, writing
stories by hand into loose-leaf notebooks, practicing my cursive strokes and then surveying the
loops and descenders, seeing something there that looked like me, my sense of selfhood
scrawled onto the page.
Data Smog

David Shenk, Data Smog: Surviving the Information Glut (1997)


Let us call the unexpected, unwelcome part of our atmosphere "data smog," an
expression for the noxious muck and druck of the information age. Data smog gets in
the way; it crowds out quiet moments, and obstructs much-needed contemplation. It spoils
conversation, literature, and even entertainment. It thwarts skepticism, rendering us less
sophisticated as consumers and citizens. It stresses us out. Data smog is not just the pile of
unsolicited catalogs and spam arriving daily in our home and electronic mailboxes. It is also
information that we pay handsomely for, that we crave--the seductive, mesmerizing quick-cut
television ads and the twenty-four-hour up-to-the-minute news flashes. It is the faxes we request
as well as the ones we don't; it is the misdialed numbers and drippy sales calls we get during
dinnertime; but it is also the Web sites we eagerly visit before and after dinner, the pile of
magazines we pore through every month, and the dozens of channels we flip through whenever
we get a free moment. The blank spaces and silent moments in life are fast disappearing. Mostly
because we have asked for it, media is everywhere.
Information Gluttony
David Shenk, Data Smog: Surviving the Information Glut (1997)
Just as fat has replaced starvation as this nation's number one dietary concern, information
overload has replaced information scarcity as an important new emotional, social, and political
problem. "The real issue for future technology," says Columbia's Eli Noam, "does not appear to
be production of information, and certainly not transmission. Almost anybody can add
information. The difficult question is how to reduce it."
Signal-to-noise ratio
David Shenk, Data Smog: Surviving the Information Glut (1997)
Audio buffs have long been familiar with the phrase signal-to-noise ratio. It is engineering
parlance for measuring the quality of a sound system by comparing the amount of desired audio
signal to the amount of unwanted noise leaking through. In the information age, signal-to-noise
has also become a useful way to think about social health and stability. How much of the
information in our midst is useful, and how much of it gets in the way? What is our
signal-to-noise ratio? We know that the ratio has diminished of late, and that the character of information
has changed: As we have accrued more and more of it, information has emerged not only as
a currency, but also as a pollutant.
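Shenk borrows the term from audio engineering, where the ratio of wanted signal power to unwanted noise power is conventionally expressed in decibels. As a toy numerical illustration (the code and numbers are an editorial sketch, not from Shenk's text):

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

# 100 units of useful signal against 1 unit of noise: a clean channel.
print(snr_db(100, 1))   # 20.0
# The same signal against 50 units of noise: barely above the din.
print(snr_db(100, 50))  # about 3.0
```

Shenk's metaphor maps directly onto the formula: as the noise term grows while the signal stays fixed, the ratio falls, even though no useful information has been lost.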
"Filter Failure"
Siva Vaidhyanathan, The Googlization of Everything (and why we should worry) (2011)
From childhood onward, we have usually allowed others to process the information we receive,
to filter it. As technology writer Clay Shirky argues, what we think is information overload
is actually a function of "filter failure." When we feel overwhelmed by the quantity of news
and information we encounter, it's a sign that we have just not figured out how to manage our
flows of information. Concentration, mental discipline, and time management count as filters. So
does Google.
Learning or "encapsulated entertainment"?
Mark Bauerlein, The Dumbest Generation (2008)
But what evidence do we have that the world has dilated, that the human mind reaches
so much further than it did just a decade or two ago? The visionary rhetoric goes on, but
with surveys producing one embarrassing finding after another, with reading scores flat,
employers complaining about the writing skills of new hires as loudly as ever, college
students majoring in math a rarity, remedial course attendance on the rise, and young
people worrying less and less about not knowing the basics of history, civics, science, and
the arts, the evidence against it can no longer be ignored. We should heed informed
skeptics such as Bill Joy, cofounder of Sun Microsystems, who argues, "I'm skeptical
that any of this has anything to do with learning. It sounds like it's a lot of
encapsulated entertainment. . . . This all, for me, for high school students sounds like
a gigantic waste of time. If I was competing with the United States, I would love to have
the students I'm competing with spending their time on this kind of crap."

Side Effects
William Davidow, Overconnected: The Promise and Threat of the Internet (2011)
The automobile, which gave us the freedom to go where we wanted when we wanted, which
created the suburbs, served as the backbone of much of the prosperity boom of the twentieth
century. When it was first invented, no one foresaw its grim side effects: urban sprawl, long
commutes, dependence on unstable political regimes for fossil fuels, pollution, and hollowed-out
cities. And now we have the Internet, whose side effects we are experiencing like
nothing else in the past.
We are becoming mentally obese
Sam Anderson, New York Magazine (2009)
http://nymag.com/news/features/56793/
Our attention crisis is already chewing its hyperactive way through the very foundations of
Western civilization. Google is making us stupid, multitasking is draining our souls, and the
dumbest generation is leading us into a dark age of bookless power browsing. Adopting
the Internet as the hub of our work, play, and commerce has been the intellectual
equivalent of adopting corn syrup as the center of our national diet, and we've all
become mentally obese.

In-class resources for Tue Oct 6 (class 5)


Is technology making us dumber?
Digerati video and reading
(discussion leader: x)
Digerati
Nicholas Carr
@roughtype
http://www.nicholascarr.com
Video
We will watch and discuss this in class.
Nicholas Carr on Colbert Report (4:34)
http://thecolbertreport.cc.com/videos/85xlkw/nicholas-carr
Reading
We will read and discuss this in class.
Thesis in tweetform
(Nicholas Carr, roughtype.com, 2012)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Technology's effect on our attention span
Technology's effect on reading
Counter-arguments
Technology's effect on sleep
Nicholas Carr
Aaron Swartz

Thesis in Tweetform
Nicholas Carr (roughtype.com, 2012)

1. The complexity of the medium is inversely proportional to the eloquence
of the message.
2. Hypertext is a more conservative medium than text.
3. The best medium for the nonlinear narrative is the linear page.
4. Twitter is a more ruminative medium than Facebook.
5. The introduction of digital tools has never improved the quality of an art
form.
6. The returns on interactivity quickly turn negative.
7. In the material world, doing is knowing; in media, the opposite is often
true.
8. Facebook's profitability is directly tied to the shallowness of its
members: hence its strategy.
9. Increasing the intelligence of a network tends to decrease the
intelligence of those connected to it.
10. The one new art form spawned by the computer, the videogame, is
the computer's prisoner.
11. Personal correspondence grows less interesting as the speed of its
delivery quickens.
12. Programmers are the unacknowledged legislators of the world.
13. The album cover turned out to be indispensable to popular music.
14. The pursuit of followers on Twitter is an occupation of the bourgeoisie.
15. Abundance of information breeds delusions of knowledge among the
unwary.
16. No great work of literature could have been written in hypertext.
17. The philistine appears ideally suited to the role of cultural impresario
online.
18. Television became more interesting when people started paying for it.
19. Instagram shows us what a world without art looks like.

20. Online conversation is to oral conversation as a mask is to a face.


21. Recommendation engines are the best cure for hubris.
22. Vines would be better if they were one second shorter.
23. Hell is other selfies.
24. Twitter has revealed that brevity and verbosity are not always
antonyms.
25. Personalized ads provide a running critique of artificial intelligence.
26. Who you are is what you do between notifications.
27. Online is to offline as a swimming pool is to a pond.
28. People in love leave the sparsest data trails.
29. YouTube fan videos are the living fossils of the original web.
30. Mark Zuckerberg is the Grigory Potemkin of our time.
31. Every point on the internet is a center of the internet.
32. On Twitter, one's sense of solipsism intensifies as one's follower count
grows.
33. A thing contains infinitely more information than its image.
34. A book has many pages; an ebook has one page.
35. If a hard drive is a soul, the cloud is the oversoul.
36. A self-driving car is a contradiction in terms.
37. The essence of an event is the ghost in the recording.
38. A Snapchat message becomes legible as it vanishes.
39. When we turn on a GPS system, we become cargo.

40. Google searches us.

Homework for Thu Oct 8 (class 6)


Internet addiction
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Exploring the Neuroscience of Internet Addiction
Bill Davidow, theatlantic.com (2012)
Virtually You
(Elias Aboujaoude, 2013)
Excerpts: Addiction and Narcissism
Video
Watch the assigned video on Zaption and post a one-sentence response.
NYT: China's Web Junkies (7:12)
http://zapt.io/tz9vjvym
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
What causes video game addiction? (change?)
link

Exploring the Neuroscience of Internet Addiction


Bill Davidow, theatlantic.com (2012)
Much of what we do online releases dopamine into the brain's pleasure centers, resulting in
obsessive pleasure-seeking behavior. Technology companies face the choice of whether to exploit
these addictions for profit.
The leaders of Internet companies face an interesting, if also morally questionable, imperative:
either they hijack neuroscience to gain market share and make large profits, or they let
competitors do that and run away with the market.
In the Industrial Age, Thomas Edison famously said, "I find out what the world needs. Then I go
ahead and try to invent." In the Internet Age, more and more companies live by the mantra
"create an obsession, then exploit it." Gaming companies talk openly about creating a
"compulsion loop," which works roughly as follows: the player plays the game; the player
achieves the goal; the player is awarded new content; which causes the player to want to
continue playing with the new content and re-enter the loop.
It's not quite that simple. Thanks to neuroscience, we're beginning to understand that achieving
a goal or anticipating the reward of new content for completing a task can excite the neurons in
the ventral tegmental area of the midbrain, which releases the neurotransmitter dopamine into
the brain's pleasure centers. This in turn causes the experience to be perceived as pleasurable.
As a result, some people can become obsessed with these pleasure-seeking experiences and
engage in compulsive behavior such as a need to keep playing a game, constantly check email,
or compulsively gamble online. A recent Newsweek cover story described some of the harmful
effects of being trapped in the compulsion loop.
The release of dopamine forms the basis for nicotine, cocaine, and gambling addictions. The
inhalation of nicotine triggers a small dopamine release, and a smoker quickly becomes
addicted. Cocaine and heroin deliver bigger dopamine jolts, and are even more destructive.
In the past, companies used customer surveys, focus groups, interviews, and psychological tests
to figure out how to make products more appealing to customers. In 1957, Vance Packard
published The Hidden Persuaders, in which he identified eight hidden needs -- including a
consumer's desire to love and be loved, or a yearning for power -- which advertisers could
exploit to create demand for their products.
Packard, who questioned the morality of exploiting emotions in order to sell products, died in
1996. Were he alive today, he would surely be shocked to see how primitive the exploitation
techniques he described now seem.
Today we can monitor the brain's response with NMR (nuclear magnetic resonance) imaging to
more accurately measure what people are experiencing when they play online games, interact
with smart devices, or gamble. Luke Clark, a neuroscientist at the University of Cambridge, used
brain scans to determine that when gamblers felt they could exert control over a game's
outcome -- for example, by throwing the dice harder, or pulling the lever on a slot machine with
more force -- it increased their interest in playing. Also, such near misses as getting two out of
three matching symbols on a slot machine stimulated the desire to continue to play. Other
experiments have shown that optimizing a slot machine's frequency of near misses can extend
gambling times by 30 percent. Neuroscientists have also found that it is the unpredictability of
winning large rewards that stimulates the dopamine releases that compel gamblers to return.
In the 1990s, concern over obsessive-compulsive behavior associated with computer games and
the Internet began to grow. Until roughly 2000, compulsive behavior remained a side effect -- not
an intentional element of game design and other Internet applications. Application providers
were simply supplying customers with services that made their products more appealing.
But before long, people were referring to their BlackBerries as CrackBerries, and parents were
beginning to worry about the number of hours their kids spent on video games. We now believe
that the compulsion to continually check email, stock prices, and sporting scores on
smartphones is driven in some cases by dopamine releases that occur in anticipation of
receiving good news. Indeed, we have grown so addicted to our smartphones that we now
experience "phantom smartphone buzzing," which tricks our brains into thinking our phone is
vibrating when it isn't.
By the time Web 2.0 rolled around, the key to success was to create obsessions. Internet gaming
companies now openly discuss compulsion loops that directly result in obsessions, and the goal
of other applications is the same: to create the compulsion to gather thousands of friends on
Facebook, thousands of followers on Twitter, or be pleasantly surprised to discover from
Foursquare that a friend you haven't seen for years is nearby.
In the past, society has been able to put physical barriers in place to make it more difficult to
satisfy unhealthy obsessions. For example, gambling casinos were primarily segregated in
Nevada. Things are very different today. In the first place, there is no physical barrier between
people and the obsession in question. Smartphones and portable electronic devices travel with
us in our pockets.
When compulsive behavior undermines our ability to function normally, it enters the realm of
obsessive-compulsive disorder. By some estimates around 2 to 4 percent of serious gamblers are
addicted, and some 10 percent (it may be less or more since most people under-report
addiction) of Internet users have become so obsessed with the Internet that its use is
undermining their social relationships, their family life and marriage, and their effectiveness at
work. As the performance of Internet-connected devices improves, and as companies learn how
to use neuroscience to make virtual environments more appealing, that number will undoubtedly
increase.
Many Internet companies are learning what the tobacco industry has long known -- addiction is
good for business. There is little doubt that by applying current neuroscience techniques we will
be able to create ever-more-compelling obsessions in the virtual world.
There is, of course, no simple solution to this problem. The answer starts with recognizing that
our virtual environment has very real consequences. For my own part, I create physical walls
around my virtual environment. I will read books and newspapers anywhere in my home on my
iPad, but I answer emails only in my office. When I am talking with my wife, listening to my
daughters discuss the challenges they face in raising their children, or playing and laughing with
my grandsons, I not only shut off my iPhone, I put it out of reach.
I'm learning that to function effectively and happily in an increasingly virtual world, I have to
commit a significant amount of time to living without it.

If you were to invent a medium to re-wire our circuits...
Nicholas Carr (2010)
One thing is very clear: if, knowing what we know today about the brain's plasticity, you
were to set out to invent a medium that would rewire our mental circuits as quickly
and thoroughly as possible, you would probably end up designing something that
looks and works a lot like the Internet. It's not just that we tend to use the Net regularly,
even obsessively. It's that the Net delivers precisely the kind of sensory and cognitive stimuli
(repetitive, intensive, interactive, addictive) that have been shown to result in strong and rapid
alterations in brain circuits and functions. With the exception of alphabets and number systems,
the Net may well be the single most powerful mind-altering technology that has ever come into
general use.
We are wired to crave instant gratification
Elizabeth Cohen, cnn.com (2011)
Dr. Nora Volkow, director of the National Institute on Drug Abuse, admits she, too, has a hard
time resisting the call of her BlackBerry. "On vacation, I look at it even though I don't need to,"
she says. "Or I take a walk with my husband and I can't resist the urge to check my e-mail. I feel
guilty, but I do it." She explains that constant stimulation can activate dopamine cells in
the nucleus accumbens, a main pleasure center of the brain.
Is Behaviorism to Blame?
Sam Anderson, New York Magazine (2009)
I'm not ready to blame my restless attention entirely on a faulty willpower. Some of it is pure
impersonal behaviorism. The Internet is basically a Skinner box engineered to tap right
into our deepest mechanisms of addiction. As B.F. Skinner's army of lever-pressing rats and
pigeons taught us, the most irresistible reward schedule is not, counterintuitively, the one in
which we're rewarded constantly but something called a variable ratio schedule, in which the
rewards arrive at random. And that randomness is practically the Internet's defining feature: It
dispenses its never-ending little shots of positivity (a life-changing e-mail here, a funny YouTube
video there) in gloriously unpredictable cycles.
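The variable ratio schedule Anderson describes can be sketched in a few lines. This is an editorial toy simulation, not from the text: both schedules below pay out at the same average rate, but only the variable one delivers rewards at unpredictable intervals.

```python
import random

def fixed_ratio_rewards(presses: int, ratio: int) -> int:
    """Fixed-ratio schedule: every Nth lever press pays off."""
    return presses // ratio

def variable_ratio_rewards(presses: int, mean_ratio: int, seed: int = 0) -> int:
    """Variable-ratio schedule: each press pays off with probability
    1/mean_ratio, so rewards arrive at random, unpredictable intervals."""
    rng = random.Random(seed)
    return sum(1 for _ in range(presses) if rng.random() < 1 / mean_ratio)

print(fixed_ratio_rewards(1000, 5))     # exactly 200 rewards, evenly spaced
print(variable_ratio_rewards(1000, 5))  # roughly 200 rewards, randomly spaced
```

Skinner's finding was that the random spacing, not the payout rate, is what keeps subjects pressing the lever; Anderson's claim is that inboxes and feeds reproduce exactly this schedule.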
Is technology the new opiate of the masses?
Dr. Jim Taylor, Seattle PI (2012)
Karl Marx famously called religion the opiate of the masses. Well, to paraphrase
Reggie Hammond, Eddie Murphy's character in the film 48 Hours, "There's a new opiate in town,
and its name is technology." Yes, folks, everywhere you look these days, you see people
shooting up their technological drug of choice, whether emails, text messages, Twitter or
Facebook feeds, YouTube videos, streaming movies and TV shows, or playing app games on their
smartphones. Concerns about this drug have been gaining increasing attention in recent years.
The words Internet and addiction have become conjoined and are now a part of our technology
lexicon (usually by people who say it dismissively with a smirk as they ingest this drug through
their favorite delivery system, whether computer, tablet, or smartphone). A 2010 survey found
that 61% of Americans (the number is higher among young people) say they are addicted to the
Internet. Another survey reported that addicted was the word most commonly used
by people to describe their relationship to technology.
Unable to log off?
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
Others join the discussion of trouble. One says, "Facebook has taken over my life." She is
unable to log off. So, she says, "I find myself looking at random people's photos, or going to
random things. Then I realize after that it was a waste of time." A second says she is afraid she
will miss something and cannot put down her phone. "Also, it has a camera. It has the time. I
can always be with my friends. Not having your phone is a high level of stress." A third sums up
all she has heard: "Technology is bad because people are not as strong as its pull."

We Are Digital Maximalists


William Powers, Hamlets Blackberry: Building a Good Life in the Digital Age (2010)
Right now, in these early years of the digital era, without even realizing it, we're living by
a very particular philosophy of technology. It can be summarized in a sentence:
It's good to be connected, and it's bad to be disconnected. This is a simple idea but one
with enormous implications. Once you assume that it's a good thing to be connected through
digital networks and a bad thing to be disconnected from them, it becomes very clear how to
organize your screen time and, indeed, every waking hour. If digital connectedness is intrinsically
good, it follows that one should try as hard as possible to stay connected at all times or, to put
it another way, to avoid being disconnected. Thus, our philosophy has two corollaries:
First corollary: The more you connect, the better off you are.
Second corollary: The more you disconnect, the worse off you are.
We are digital maximalists.
On Narcissism
John Freeman, The Tyranny of Email (2009)
Social media has made us, in a sense, narcissists. In his essay "On Narcissism," Sigmund Freud
proposed two ideas of the narcissist: one revolved around the concept of self-love; the other
stemmed from a state of mind that has no awareness that the self and other exist. McLuhan
used the Greek myth of Narcissus to explain why people become entranced by tech gadgets.
Narcissus is the youth who sees his own reflection in the water and mistakes it for somebody
else. The point of this myth, McLuhan wrote, is that men at once become "fascinated by any
extension of themselves in any material other than themselves." Similarly, he said, we're
fascinated by new technologies because they project us beyond ourselves. But just like
Narcissus, we don't recognize that that's what the gadget is doing: projecting us, by extending
our bodies into the world. The confusion induces a kind of trance. We can't take our eyes off it,
but we don't understand why.

Virtually You
William Saletan, New York Times (2011)
Humanity is migrating to cyberspace. In the past five years, Americans have doubled the hours
they spend online, exceeding their television time and more than tripling the time they spend
reading newspapers or magazines. Most now play computer or video games regularly, about 13
hours a week on average. By age 21, the average young American has spent at least three times
as many hours playing virtual games as reading. It took humankind eight years to spend 100
million hours building Wikipedia. We now spend at least 200 million hours a week playing World
of Warcraft.
Elias Aboujaoude, a Silicon Valley psychiatrist, finds this alarming. In Virtually You, he argues
that the Internet is unleashing our worst instincts. It connects you to whatever you want:
gambling, overspending, sex with strangers. It speeds transactions, facilitating impulse
purchases and luring you away from the difficulties of real life. It lets you customize your
fantasies and select a date from millions of profiles, sapping your patience for imperfect
partners. It lets you pick congenial news sources and avoid contrary views and information. It
conceals your identity, freeing you to be vitriolic or dishonest. It shields you from detection and
disapproval, emboldening you to download test answers and term papers. It hides the pain of
others, liberating your cruelty in games and forums. It rewards self-promotion on blogs and
Facebook. It teaches you how to induce bulimic vomiting or kill yourself.
In short, everything you thought was good about the Internet (information, access,
personalization) is bad. Aboujaoude isn't shy in his indictment. He links the Internet to
consumer debt, the housing crash, eating disorders, sexually transmitted infections,
psychopathy, racism, terrorism, child sexual abuse, suicide and murder. Everything online
worries him: ads, hyperlinks, even emoticons. The Internet makes us too quarrelsome. It makes
us too like-minded. It makes us work too little. It makes us work too much.
In part, this grim view stems from Aboujaoude's work. He sees patients with online compulsions.
He believes in the Freudian id (a shadowy swirl of infantile impulses) and perceives its
modern incarnation in what he calls the e-personality, a parallel identity that hijacks your mind
online. In the physical world, your superego restrains your id. But in the virtual world, where you
can instantly fulfill your whims, the narcissism and grandiosity of the e-personality run wild.
To Aboujaoude, the Internet is a mechanical alien, "a new type of machine . . . that can efficiently
prey on our basic instincts." It converts children into bullies "almost automatically." It turned
Philip Markoff, the accused Craigslist killer, who committed suicide in jail, into a serial
assailant. Lori Drew, the woman whose online impersonation of a teenage boy supposedly drove
a girl to suicide, seemed normal until the Internet made her "fleeting dark wish . . . take on a life
of its own." Again and again, computers get the blame.
Jane McGonigal, the author of Reality Is Broken, sees the Internet differently. She's a game
designer. To her, the virtual world isn't a foreign contraption. It's our own evolving creation. She
agrees that bad online games can addict people, make them belligerent, distract them from
reality and leave them empty. But this is our fault, not the Internet's. When virtual life brings out
the worst in us, redesign it.
If Aboujaoude is the Internet's Hobbes, McGonigal is its Rousseau. In the rise of multiplayer
games, she sees a happier picture of human nature: a thirst for community, a craving for hard
work and a love of rules. This, she argues, is the essence of games: rules, a challenge and a
shared objective. The trick is to design games that reward good behavior. The Internet's
unprecedented power, its ability to envelop and interact with us, is a blessing, not a threat. We
can build worlds in which nice guys finish first.
The point isn't just to enhance virtual reality. It's to fix the real world, too. McGonigal offers
several examples, some of which she helped create. Chore Wars, an alternate-reality game,
builds positive attitudes toward housework by rewarding virtual housework. Cruel 2 B Kind
invites players to kill competitors with smiles or compliments. The Extraordinaries hands out
missions like one in which the player must GPS-tag a defibrillator so its location can be registered
for later use. Groundcrew assigns players to help people with transportation, shopping or
housekeeping.
The premise is that since games motivate us more effectively than real life, making them
altruistic and bringing them into the physical world will promote altruistic behavior. But is this
motivating power transferable? What draws us to virtual worlds, McGonigal notes, is their
carefully designed pleasures and thrilling challenges, customized to our strengths. They're
never boring. They let us choose our missions and control our work flow. They make us feel
powerful. They offer a guarantee of productivity in every quest. And when we fail, they make
our failure entertaining.
Reality doesn't work this way. Floors need scrubbing. Garbage needs hauling. Invalids need their
bedpans washed. This work isnt designed for your pleasure or stimulation. It just needs to be
done.
McGonigal points to studies suggesting that games that reward socially constructive behavior
promote such behavior in real life. But the only outputs measured by these studies are
self-reported values, self-reported behavior in the real world, and objectively measured behavior
in games. Where's the reliable evidence that this data translates to people's doing more real work?
Projects like Groundcrew, McGonigal concedes, have produced modest if any results so far.
Hundreds of thousands of people play Free Rice, a game designed to feed the hungry, but the
rice comes from advertisers, not players. Thousands sign up every day for Folding@home, a
game to cure diseases, but all these players contribute is processing power on game consoles.
If reality is inherently less attractive than games, then the virtual world won't save the physical
world. It will empty it. Millions of gamers, in McGonigal's words, are opting out of "the bummer
of real life." And they aren't coming back. Halo 3, for example, has become a complete virtual
world, with its own history documented in an online museum and Ken Burns-style videos.
McGonigal calls this war game a model for inspiring mass cooperation. Two years ago, its 15
million players reached a long-sought objective: They killed their 10 billionth alien. "Fresh off one
collective achievement, Halo players were ready to tackle an even more monumental goal,"
McGonigal writes. And what goal did they choose? Feeding the hungry? Clothing the poor? No.
The new goal was to kill 100 billion aliens.
Game designers can't be counted on to arrest this trend. McGonigal says the game industry
wants to help users avoid addiction so that they'll remain functional and keep buying its
products. But we've heard that argument before, from the tobacco industry. Addiction, as a
business model, is too addictive to give up. She says Foursquare, a game that rewards you for
going out with friends and checking in at restaurants, promotes sociability. That would be nice,
but the game's Web site devotes a whole section ("Foursquare for Business") to commercial
exploitation.

The Internet isn't heaven. It isn't hell, either. It's just another new world. Like other worlds, it can
be civilized. It will need rules, monitoring and benevolent designers who understand the flaws of
its inhabitants. If Aboujaoude is right about our weakness for virtual vice, we'll need all the
McGonigals we can get.

From: Virtually You: The Dangerous Powers of the e-Personality
(Elias Aboujaoude, 2012)

Dr. Aboujaoude (preface)


I treated these patients in my Impulse Control Disorders Clinic. Impulse control disorders fall
within the obsessive-compulsive spectrum of psychological conditions. They share with OCD
repetitive, anxious thoughts that intrude on the person's mind (the obsession equivalent) and
ritualized behaviors that calm the anxiety down (the compulsion equivalent). Unlike OCD, which
is never pleasurable, impulse control disorders are often thrilling, even euphoria-inducing, in the
moment, but they lead to much distress and remorse in the long run. While I had some experience
diagnosing and treating real-life impulse control disorders, these cases came with a new twist to
their age-old problems: Their impulsivity seemed exclusively virtual.
Ashley
Ashley, a sixteen-year-old ballet student, was brought in by her mother to see me for a
consultation. Ashley had already decided to prescribe herself the antidepressant Wellbutrin,
ostensibly for depression, and was on the verge of ordering it online from an overseas supplier
when her mother insisted on making the appointment. "I'm sure Wellbutrin would help," she told
her daughter, avoiding a fight that might have made Ashley reluctant to see a doctor. "But let's
at least have a specialist prescribe it. This way we know we're getting the real thing."
Ashley's mother, a medical social worker, had used her maternal and clinical instincts to
accurately diagnose her child's problem: not depression but a serious case of anorexia nervosa
that required immediate medical attention. Objective signs of the disorder abounded: the smell
of vomit in the bathroom shortly after the family's meals, the weight loss and accompanying
arrest in menstruation, the obsessive calorie counting, and the number of bookmarked nutrition
Web sites on the family computer.
It didn't take long during my first meeting with Ashley to agree with her mother that she was not
depressed. Ashley was social, enjoyed ballet, was doing relatively well in school, and smiled
broadly when a text message from a friend arrived on her cell phone during the meeting. Yet she
quickly blurted out, "Insomnia, poor concentration, poor energy, and anhedonia" in describing to
me how she was doing. When I asked her to explain what she meant by anhedonia, a formal
clinical term with a Greek pedigree, Ashley grew nervous and impatient. Whatever Web site she
had consulted to memorize the symptoms of clinical depression apparently did not explain what
anhedonia was. (She could have looked up the term on Wikipedia, where it is accurately defined
as the inability to experience pleasure from normally pleasurable life events such as eating,
exercise, and social or sexual interaction.) I told Ashley that, given her very low weight,
Wellbutrin was a poor choice of antidepressant, as it could cause serious seizures in underweight
people and might lead to further weight loss. That had little effect on Ashley, however, who
threatened to buy the medication on the Web if I did not write her a prescription for something
she knew would work for her.
Wellbutrin, I had learned from other patients with anorexia, had a reputation in online eating
disorders circles for being an appetite suppressant and is commonly sought out by anorexics to
help them lose weight. Just as Ashley had learned online how to more effectively induce vomiting
("stick the bottom of a toothbrush down your throat, it'll all come out," according to
funadvice.com), she had also learned how to manipulate her mother and doctor into prescribing
the last medication she needed. Still, I did not completely close the door on the possibility of
prescribing this medication at some point in the future for Ashley, in part to keep her in my clinic
and to encourage her to join an eating disorders therapy group. Unfortunately, Ashley barely
lasted two meetings before the psychologist running the group, a colleague of mine, had to
ask her to leave. She could not handle Ashley's constant defiance and the distorted, partial
knowledge, clearly gleaned online, that Ashley would use to refute every fact the group leader
was trying to communicate: how, for instance, the human race did not evolve to eat 2,000
calories a day; how the research showing that the brains of anorexics shrink as the disease
progresses was seriously flawed because it didn't include a control group; and how most
religions of the world have incorporated some element of fasting into their rituals, so how bad
can it be?

Liz
Liz, a forty-four-year-old homemaker with no previous psychiatric problems, came to me for help
with out-of-control e-tail expenditures. She had heard a radio ad for a clinical trial we were doing
at Stanford University to test Celexa, a serotonin-based antidepressant, in the treatment of
compulsive shoppers. She hoped she would qualify to be in the study. "For every dress I like, I
have to buy three, each in a different color," she told me at our initial meeting, sounding puzzled
at her own behavior. "Then, for each color I get, I have to buy two additional sizes: what if I gain
weight, what if I lose weight?" Meanwhile, Liz's brick-and-mortar shopping habits remained
relatively reasonable. She made it a point to tell me how she had just missed the annual Labor
Day weekend sale at the big furniture outlet near where she lives, an event she looked forward
to year after year. "I didn't think I could afford it. Not this year. Not with my online spending," she
explained. Then, sounding almost nostalgic for her days of responsible shopping, she added, "I
never lost control at a real store the way I lose control online. Until I discovered Buy.com, I was
actually putting money aside for retirement. What is it about me?"
Or what is it about the Internet? I encouraged Liz to ask herself this question and to explore how
the medium itself might be contributing to her behavior. The idea seemed to resonate. "I guess if
it's online, somehow it doesn't feel real," she said. "Or not as real. It's innocent and fun. Almost
guilt-free, just like a computer game. And how bad can a computer game be?" Quite bad,
unfortunately, as Liz discovered. Her online shopping sprees had already caused her to file for
bankruptcy, which is what finally prompted her to seek help in my clinic. Although her problem
was clearly shopping-based, it wasn't the traditional compulsive buying disorder our study was
recruiting for. Instead of those crowded modern-day cathedrals we call malls, Liz's problem
manifested itself solely in virtual joints that are always open and where parking is never a
problem. Most of our experience with compulsive shopping, and most of what had been
described in the psychiatric literature, involved real acquisitions from real stores. Our screening
questionnaire for the study, with its questions about the role of in-store advertising, the
effect of product display, whether she shopped alone or with friends, and how much effort was
involved in getting to her favorite stores, seemed irrelevant and fell woefully short of
capturing her problem.

Richard
For Richard, a thirty-six-year-old married human resources specialist and father of two, it was
the threat of divorce that prompted his first appointment. For years, his gambling habits
were what one might call "social," not remarkably different from what his coworkers would
describe in conversations around the water cooler on Monday mornings. Like them, Richard
enjoyed the occasional weekend skiing trip to Reno, a three-hour drive from home, where he and
his wife would sometimes play the slot machines in their hotel lobby well into the night. Beyond
that, however, he never sought it out and never stopped to try his luck at the local Indian casino,
despite a couple of memorable wins in Reno that might have served to draw him back to a
gaming institution. Simply put, when it came to gambling, Richard could take it or leave it.
A single spam e-mail in his in-box, however, led Richard to a virtual casino that became his
undoing. It started with an offer for a free trial of an online Texas Hold 'em game that Richard
took advantage of one night when he was having trouble falling asleep. Before long, he was
waiting for his wife to go to bed in order to log on, eventually using her credit card to do so after
exhausting his promotional account and maxing out his personal credit cards.
Intriguingly, on his morning drive to work, Richard would speed past the Indian casino, counting
the minutes till he could get to his desk and log on to his favorite gambling Web site. "It's a very
different experience than being in a real casino," he explained to me, opening his wallet to show
unused coupons he had received from the Reno resort where he had won big. "See, they're
always mailing me offers for a free stay, free meals, free shows," he said. "None of it seems to
move me. Somehow it feels better online. You're free of inhibitions, whether they're your own or
imposed on you by other people. It's just you and your computer screen, with no one to
disapprove of you or give you dirty looks, and no one to remind you of your responsibilities and
your credit card debt." Richard's behavior caught up with him when reality, in the form of his
wife leaving him with their two kids, encroached on his virtual life.
Raffi
Raffi was a forty-year-old married man who came to see me for what he described as
"self-esteem issues." As the only son of first-generation Italian immigrants, he grew up in a
religious household feeling pampered and doted on. The altar boy who did well in school and
excelled in sports became a successful civil engineer, marrying his high school sweetheart and
forming a family that also included two beautiful teenage daughters. The couple of years before
our first appointment, however, had been very trying. Two years ago, at the age of thirteen, one
daughter was diagnosed with diabetes. Shortly after that, his beloved mother died unexpectedly
of a heart attack. And when the recession hit, Raffi was laid off as part of a restructuring of his
company, becoming unemployed for the first time since finishing college. Multiple job interviews
had led nowhere.
The string of family and professional losses, happening in a relatively short period of time, led to
clinical depression, which further compounded the situation by taking away Raffi's energy and
his motivation to exercise, eat healthily, and take care of his physical appearance. The once
vibrant forty-year-old with the boyish looks and charmed life started seeing himself as fat,
unemployable, and all-around worthless.
His wife of fifteen years, still working and as striking and active as when they first fell in love
during their senior year, tried to be supportive. In his state of extreme vulnerability and low
self-esteem, however, all Raffi could focus on was how she must be having an affair. Not that he
had any reason to suspect infidelity: Any extramarital activities his wife partook in seemed to
focus on attending diabetes support groups and investigating insulin pumps for her child. "But
she has to be having an affair because I have nothing to offer," went his faulty logic. "Just look
at her and look at me."
About ten years before this, in the old days, Raffi might have worked hard to convince himself
she wasn't. He may have sought a reality check from a friend or therapist. He may have tried
to seek reassurance from his wife that she still loved him and that she would stay by his side
until the dark clouds had passed and they were able to go back to their normal lives. If all
alternatives failed, he might, out of desperation and as a last resort, have decided to hire a
private eye. But this was today, a time for shortcuts and immediate results, so Raffi decided to
cut to the chase and start with the desperate act right away. Only instead of hiring a costly
detective, he installed a keylogger on his wife's laptop.
Keyloggers are a family of easily downloadable, relatively inexpensive software programs that
track (or log) the keys struck on a computer keyboard without the person using it knowing that
his or her actions are being monitored. They have some defensible applications: They can be
used to control kids' Internet surfing habits and can help maintain an automatic backup of one's
typed data. Increasingly, however, their main use is as spying tools for people who want to
snoop on one another in personal and business affairs, a way to extend one's knowledge of, and
potentially control over, the other person into the virtual realm.
It is now generally accepted that if there is dirt to be found in someone's life, a good place to
start looking for it is in that person's e-mail or text message in-box. Only a clean in-box is proof
of a clean record, so Raffi prayed for a chaste log of e-mails that would exonerate his wife. He
quickly retrieved her passwords and, for six months, scrutinized her outgoing and incoming
messages. He studied her saved contacts and researched the ones he didn't recognize. He
visited every unfamiliar Web site she visited and joined (under a pseudonym) every community
she signed up for. But the only real secret he uncovered involved a surprise fortieth birthday
party she was planning for him behind his back.
Reassuring himself that no affair was taking place, however, hardly brought him relief. Raffi's
problem may have started with negative-territory self-esteem, but he was now suffering from
overwhelming guilt over the sting operation he had conducted against his loving and faithful
wife. How could he not trust her, and how did it become so easy for him to turn into a spy? For
six whole months, he had spied on a woman who had given him no real reason for doubt. What
he had done did not fit the mental image he had of himself, his wife, and his marriage, and Raffi
could not forgive himself for it.

Jill and Tom


Fraud should be the last thing that comes to mind when I think of my patient Jill, but there
is no way around the deceitfulness that marked her foray into online dating. A highly intelligent,
always conservatively dressed English teacher, Jill suffered from a severe social anxiety problem
that had crippled her romantic life and survived many attempts at medication treatment and
psychotherapy. With Jill still celibate at twenty-nine, her previous psychiatrist wisely
recommended online dating as a good way to confront the problem. His idea was for her to
break the ice through a gradual, nonthreatening e-mail and picture exchange, which would
ensure that a basic attraction was present and fundamental compatibilities were met. Such an
approach would minimize the possibility of rejection or a significant personality clash at the first
meeting, along with the overwhelming anxiety this would generate for someone with social
phobia. That is how, armed with an online dating service subscription and almost-thirty
desperation, Jill finally met Tom.
Actually, Jill sort of met Tom. Her online persona, the one that Tom started courting, was an
exalted version of the shy, inhibited woman sitting in my office. For starters, she called herself
Tess, changing her name in part to protect confidentiality. Well beyond the name change,
however, Jill felt a strange pull to embellish, or rather reinvent, other aspects of who she was.
Instead of an English teacher (too boring and drab, she thought), Jill became a sales rep who
used a combination of natural gregariousness, which she did not possess, and
borderline-provocative dress, which I could not imagine her wearing, to convince architects to
try the new line of high-end Italian furniture she was promoting. She even intentionally dumbed
down her syntax, a sacred cow for her in the classroom, and opted instead to communicate with
Tom in simple, declarative sentences and plenty of emoticons. ("They make emotions much
easier," she explained.) Gone from her e-mails, then, was the SAT-style vocabulary she modeled
for her students, and in came playful monosyllabic words and truncated forms (cuz for because,
pic for picture, u for you, hugz for hugs). Coming across as too brainy was the kiss of death, Jill
thought, even if she was communicating with a doctor-in-training like Tom.
This online version of Jill appealed to Tom, and over a six-month e-mail courtship, something that
can only be called love (or maybe luv) developed between the two: an irrational need to check
in with the other on an exhaustingly frequent basis, for no other reason than to make sure the
person is still desired with the same intensity expressed in the last e-mail, sent only ten minutes
before. Along with this need, of course, came internal restlessness and agitation if the e-mail,
text, or instant messaging ping did not arrive when expected. All in all, however, this was
thrilling restlessness and agitation, and a completely novel experience that Jill would cherish in
her mind as a much delayed first romance, finally.
But then Jill turned thirty, and online love suddenly took on a juvenile air she could no longer
tolerate. The cuzes in her e-mails became embarrassments for the schoolteacher, and the lols
(for laughing out loud) stopped being funny. Utilizing big, difficult-to-truncate descriptors like
"inane" and "asinine," Jill described to me the discomfort, even shame, that she had started
feeling about her online love story. She stopped seeing her psychiatrist, saying she was turning
into somebody she didn't recognize under his care, and wanted to start seeing me instead. She
still loved Tom, however, and my role was to help her transpose her online relationship into the
real world, "because I don't want to give up who I am." I tried to explain to her how this was not
an easy task, how I had no special psychotherapeutic skills to help her carry it out safely and
successfully, and how she should consider going back to her former psychiatrist, who knew her
better and who had had the right instinct when he recommended online dating, even if she
ended up taking it too far. But Jill would not listen or reason. She was a woman in virtual love.
As I quickly found out, what stoked Jill's anxiety even more than entering the fourth decade of
life was that, several months into their courtship, Tom was not insisting on meeting in person.
Jill had enough insight into her own psychology to realize how, in many ways, this had been the
perfect relationship for her, providing an outlet for libidinal energy without the unbearable stress
of social performance and interpersonal contact. But why wasn't Tom more persistent? What was
his problem? This question plagued Jill and, more than any other consideration, made her insist
on the fateful meeting. Despite his cold feet and her untruths, she was still intent on trying to
bring her virtual romance to a smooth landing in the real world, and I was to help her navigate
this emotional minefield.
Where to meet, what to wear, and in what order to begin correcting the many fallacies that
separated Jill from Tess? My patient struggled mightily with these questions as she prepared to
meet her online love. As it turned out, however, she needn't have worried at all. Or, perhaps
more accurately, she should have been doubly worried, for Tom had a few secrets of his own to
confess. For starters, he was really Ted, and the doctor Jill thought she had landed was a
pharmacist who had always wanted to go to medical school. The two were playing the same
game: Jill was increasing her appeal by pretending to be less socially inhibited, and Tom was
increasing his by elevating himself on the socioeconomic and alpha-male ladder. Somewhere in
cyberspace their trajectories collided, with much potential and heartache resulting.
Yet, far from being con artists, Tom and Jill were using the virtual world to overcome limitations
that they felt were unjust, and both were helpless in front of a game that had self-perpetuating,
snowballing tendencies built into it, and real rewards associated with it, too. For, regardless of
how we might judge it and regardless of the outcome, it would be impossible to deny the
intensity, even the genuineness, of the pleasurable, ego-boosting emotions/emoticons that the
Internet helped generate between Tess and Tom, whoever they really were.
For them, the face-to-face meeting must have been a terrifying reminder of all the old anxieties,
inhibitions, and baggage that they had been able to ignore for a long while online, a reminder of
what mere mortals they still were. In its ultimate philosophical interpretation, this confrontation
with reality was a brutal reminder of death itself: the devastating death of an ideal personality
that they wished they had, freer, sexier and, in their eyes, more worthy of being loved than the
one they were stuck with in real life. And neither he nor she could tolerate it. According to Jill,
the meeting ended with the couple, having shared a string of disappointing revelations,
separately leaving the café where they had met, having decided not to pursue a relationship
that was too burdened by lies to have a true chance at success. That evening, they logged on
from their home computers, found each other online, chatted briefly, then found a reason to log
off. From what I could glean, it was a perfectly pleasant but superficial IM exchange, devoid of
blame-game-type accusations, but also of any big love pronouncements. They would not, the
following day, obsessively check e-mail to see if the other person had sent a sweet note.

In-class resources for Thu Oct 8 (class 6)


Internet addiction
Digerati video and reading
Digerati
James Paul Gee
BHSEC Q guest author (Fall 2015)
https://en.wikipedia.org/wiki/James_Paul_Gee
Video
We will watch and discuss this in class.
Gee: On Video Games (5:51)
https://www.youtube.com/watch?v=LNfPdaKYOPI
Reading
We will read and discuss this in class.
What Video Games Have to Teach Us About Learning and Literacy
(James Paul Gee, 2003)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Technology and addiction
Video games are good for you
Video games are bad for you
Elias Aboujaoude and Virtually You

Good video games adjust to the learner


James Paul Gee, What Video Games Have to Teach Us About Learning and Literacy (2003)
A good video game operates within the learner's "regime of competence." By this I mean that
the game often operates within, but at the outer edge of, the learner's resources, so that at
many points the game is felt as challenging but not "undoable." If learning always operates well
within the learner's resources, then all that happens is that the learner's behaviors get more and
more routinized, as the learner continues to experience success by doing the same things. This
is good, as we have seen, for learning and practicing fluent and masterful performance (which is,
indeed, necessary), but it is not good for developing newer and higher skills.
Learning to play video games is learning a new literacy
James Paul Gee, What Video Games Have to Teach Us About Learning and Literacy (2003)
When people learn to play video games, they are learning a new literacy. Of course, this is not
the way the word literacy is normally used. Traditionally, people think of literacy as the ability
to read and write. Why, then, should we think of literacy more broadly, in regard to video games
or anything else, for that matter? In the modern world, language is not the only important
communicational system. Today images, symbols, graphs, diagrams, artifacts, and many other
visual symbols are particularly significant. Thus, the idea of different types of visual literacy
would seem to be an important one. For example, being able to read the images in advertising
is one type of visual literacy.
Learning principles derived from video games
James Paul Gee, What Video Games Have to Teach Us About Learning and Literacy (2003)

"Psychosocial Moratorium" Principle

Learners can take risks in a space where real-world consequences are lowered.

Committed Learning Principle

Learners participate in an extended engagement (lots of effort and practice) as extensions of
their real-world identities when they feel some commitment to a virtual world that they find
compelling.

Identity Principle

Learning involves playing with identities so that the learner has real choices (in developing
the virtual identity) and opportunity to meditate on the relationship between new identities
and old ones.

Self-Knowledge Principle

The virtual world is constructed in such a way that learners learn not only about the domain
but about themselves and their current and potential capacities.

Amplification of Input Principle

For a little input, learners get a lot of output.

Achievement Principle

For learners of all levels of skill there are intrinsic rewards from the beginning, customized to
each learner's level, effort, and growing mastery and signaling the learner's ongoing
achievements.

Sunday Story #4
Due Sun Oct 11 (before midnight)
Prompt

medium-gee

Homework for Tue Oct 13 (class 7)


Guest author: James Paul Gee
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: How video games are good for you
How video games blind us with science
(Clive Thompson, WIRED, 2008)
Video
Watch the assigned video on Zaption and post a one-sentence response.
James Paul Gee (6:38)
http://zapt.io/tz64bkxm
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Share at least three "notes" on "Sunday Story 3" Medium student work

Games enhance your mental skills


Drs. Gary Small and Gigi Vorgan, iBrain: Surviving the Technological Alteration of the Modern Mind
(2008)
Thinking skills enhanced by repeated exposure to computer games and other digital media include
reading visual images as representations of three-dimensional space (representational
competence), multidimensional visual-spatial skills, mental maps, mental paper folding
(i.e., picturing the results of various origami-like folds in your mind without actually doing them),
inductive discovery (i.e., making observations, formulating hypotheses and figuring out the
rules governing the behavior of a dynamic representation), attentional deployment (such as
monitoring multiple locations simultaneously), and responding faster to expected and
unexpected stimuli.
Games Provide Flow
William Davidow, Overconnected: The Promise and Threat of the Internet (2011)
The fact that human learning is a practice effect can create a good deal of difficulty for learning
in school. Children cannot learn in a deep way if they have no opportunities to practice what they
are learning. They cannot learn deeply only by being told things outside the context of embodied
actions. Yet at the same time, children must be motivated to engage in a good deal of practice if
they are to master what is to be learned. However, if this practice is boring, they will resist it.
Good video games involve the player in a compelling world of action and interaction, a world to
which the learner has made an identity commitment, in the sense of engaging in the sort of play
with identities we have discussed. Thanks to this fact, the player practices a myriad of skills,
over and over again, relevant to playing the game, often without realizing that he or she is
engaging in such extended practice sessions. Educators often bemoan the fact that video games
are compelling and school is not. They say that children must learn to practice skills ("skill and
drill") outside of meaningful contexts and outside their own goals: It's too bad, but that's just the
way school and, indeed, life is, they claim.
Benefits of Games
William Davidow, Overconnected: The Promise and Threat of the Internet (2011)

In a recent issue of the Harvard Business Review, two major thinkers have even argued that the
teenage gamer might signal the skills needed for potential leadership in politics and commerce
for our age. John Seely Brown, one of the earliest and perennial visionaries of the information
age, and Douglas Thomas, a communications professor at the University of Southern California
and himself a champion gamer, have noted that multiplayer online games are large, complex,
constantly evolving social systems. These researchers have defined five key attributes of what
they term the "gamer disposition":
they are bottom-line oriented (because games have embedded systems of measurement and
assessment);
they understand the power of diversity (because success requires teamwork among those with a
rich mix of talents and abilities);
they thrive on change (because nothing is constant in a game);
they see learning as fun (because the fun of the game lies in learning how to overcome
obstacles); and they marinate on the edge (because, to succeed, they must absorb radical
alternatives and imbibe innovative strategies for completing tasks).
It is hard to think of five better qualities for success in our always-on digital age.

In-class resources for Tue Oct 13 (class 7)


Guest author: James Paul Gee
Digerati video and reading
(discussion leader: x)
Digerati
Andrew Keen
@ajkeen
BHSEC Q guest author (Fall 2012)
https://en.wikipedia.org/wiki/Andrew_Keen
Video
We will watch and discuss this in class.
Andrew Keen on Colbert Report (5:17)
http://thecolbertreport.cc.com/videos/u8nc37/andrew-keen
Reading
We will read and discuss this in class.
Cult of the Amateur
(Andrew Keen, 2011)
Links:
(on Canvas lesson page)
Andrew Keen

A culture of digital narcissism?


Andrew Keen, The Cult of the Amateur (2008)
(My arguments are that) MySpace and Facebook are creating a youth culture of digital
narcissism; open-source knowledge-sharing sites like Wikipedia are undermining the authority of
teachers in the classroom; the YouTube generation are more interested in self-expression than in
learning about the outside world; the cacophony of anonymous blogs and user-generated
content are deafening today's youth to the voices of informed experts and professional
journalists: today's kids are so busy self-broadcasting on social networks that they no longer
consume the creative work of professional musicians, novelists, or filmmakers.
The Demise of Newspapers
Andrew Keen, The Cult of the Amateur (2008)
Should mainstream newspapers and television fold, where will online news sites get their
content? Where will the Matt Drudges and the instapundits get their information? How can they
comment on the war in Afghanistan, or the 2012 election, if there is no organization with clout
and sufficient resources to report on it? In the absence of traditional news, will the online sites be
forced to abandon the effort to search out the truth altogether and simply make the facts up?
Who will have the resources to investigate and report on the next Watergate scandal or to pay
the wages of the 2.0 versions of Carl Bernstein and Bob Woodward? Or will this kind of quality
reportage simply cease to exist? As a 2006 report from the Carnegie Corporation of New York put
it, "As newspapers begin to fade, are the institutions that replace them up to the task of
sustaining the informed citizenry on which democracy depends?"
The New Normal
Andrew Keen, The Cult of the Amateur (2008)
When an article runs under the banner of a respected newspaper, we know that it has been
weighted by a team of seasoned editors with years of training, assigned to a qualified reporter,
researched, fact-checked, edited, proofread, and backed by a trusted news organization
vouching for its truthfulness and accuracy. Take those filters away, and we, the general public,
are faced with the impossible task of sifting through and evaluating an endless sea of the
muddled musings of amateurs. Most of us assume that the information we take in can be
trusted. But when the information is created by amateurs, it rarely can be. And the irony in all
this is that democratized media will eventually force all of us to become amateur critics and
editors ourselves. With more and more of the information online unedited, unverified, and
unsubstantiated, we will have no choice but to read everything with a skeptical eye. The free
information really isn't free; we all end up paying for it one way or another with the most
valuable resource of all: our time.
The true face of digital democracy
Sebastian Walsman, The New Atlantis (2009)
Two years ago, Andrew Keen's The Cult of the Amateur launched a serious debate about the
value of blogs and other technologies that permit ordinary citizens to publish their own content
online. Keen's book took aim at the Web 2.0 enthusiasts who believe that the Internet can and
should empower citizen journalists and democratize the media. He disparaged the self-broadcasting
movement for promoting narcissism and mediocrity, and cited the trend as a threat
to our moral and cultural integrity. He argued that these tools will diminish respect for the
knowledge and experience of journalists and may accelerate the spread of misinformation and
partisan spin. The evidence indicates that the dynamic of online concentration has harmed local
and regional news outlets. Small, independent, and community newspapers now compete with
the New York Times and the Washington Post. (The result is that readers have less and less)
exposure to local news. (Readers) disengage from the world around them and take in only the
high-profile stories that national media see fit to produce. Regularly reading about our neighbors
and neighborhoods helps to shape our identities as individuals embedded in a particular place
and time. Local media have long contributed to this process and have helped to strengthen our
communities. If regional and local papers disappear, with only national and international news
sources like CNN left standing, we may regret having nowhere to read about recent city council
meetings, church picnics, school fundraisers, and other matters of the kind of community
concern that have long been integral to American civic life.

Homework for Thu Oct 15 (class 8)
The rise of digital journalism
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: The rise of digital journalism
Video
Watch the assigned videos on Zaption and post a one-sentence response to each.
How is Social Media Changing Journalism? (2:19)
http://zapt.io/txh3hzdq
Social media is (3:00)
http://zapt.io/t3zhx4pr
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
What role do newspapers play in local communities? In democratic
society?
https://canvas.instructure.com/courses/936923/discussion_topics/4017322

Newspapers and magazines are adapting to our new way of reading


John Freeman, The Tyranny of E-mail (2009)
Newspaper and magazine articles have become shorter, breaks longer, and text bigger to
accommodate readers' fractured attention span. We ourselves create this condition, though, by
how much time we spend working in word processing programs and on e-mail. "E-mailers tend,
there being no space constraints, to insert a line of space between paragraphs," writes the
humorist and language columnist Roy Blount, Jr., in an e-mail. "If readers get to where they can't
tolerate paragraphs without space between, they will develop an even greater resistance to
print; or print will have to put space between paragraphs, which will eat up more paper, make
books bulkier, leach even more substance out of newspapers and magazines: contribute further
to the decline of print."
A newspaper that chooses its stories solely based on clicks
Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (2011)
Las Últimas Noticias, a major paper in Chile, began basing its content entirely on what readers
clicked on in 2004: Stories with lots of clicks got follow-ups, and stories with no clicks got killed.
The reporters don't have beats anymore; they just try to gin up stories that will get clicks.
With internet news, each story has to stand on its own feet
Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (2011)
This is Internet news: Each article ascends the most-forwarded lists or dies an ignominious death
on its own. In the old days, Rolling Stone readers would get the magazine in the mail and leaf
through it; now, the popular stories circulate online independent of the magazine. I read the
exposé on General Stanley McChrystal but had no idea that the cover story was about Lady
Gaga. The attention economy is ripping the binding, and the pages that get read are the pages
that are frequently the most topical, scandalous, and viral.
Read/Write Web
Will Richardson, Personal Learning Networks (2011)
The Read/Write Web has created millions of amateur reporters who now have their own digital
printing presses. It's also created millions of amateur editors who are, in blogging parlance,
ready to fact check your a-- whenever a major story breaks. And today, even the newspapers
themselves are inviting their readers to participate, understanding what former-reporter-turned-blogger
Dan Gillmor knew early on: "If my readers know more than I do (which I know they do), I
can include them in the process of making my journalism better" (as cited in Kosman, 2005).
How Digital Natives "Experience" the News
John Palfrey, Born Digital (2008)
Just because Digital Natives learn differently from the way their parents did when they were
growing up doesn't mean that Digital Natives are not learning. Take, for example, the way that
Digital Natives learn about events in the news. Many older people assume that because Digital
Natives are not reading newspapers and magazines, but instead absorbing news all day long on
various websites (and from comedy programs and other unconventional sources), their
understanding of current events is superficial and limited to headlines. These assumptions are
wrong, because they underestimate the depth of knowledge that Digital Natives are obtaining on
the Web. They also miss a key feature of how Digital Natives experience news: interacting with
information in constructive ways. Digital Natives often access much more information about a
topic they are interested in than kids of previous generations ever could have. A recent study of
young people and their news-gathering habits confirms these changes. The study found, for
example, that young Americans don't read the daily newspaper. Digital Natives pick up bits and
pieces of news and information as they go about their day, not in a single sitting at the breakfast
table in the morning or in front of the television in the evening. And often, they in fact engage
more with the material than those who are used to more traditional news formats, by virtue of
writing a post about the idea on a blog or sharing it with a friend on Facebook or over instant
messaging.

Excerpts from lecture titled "Is News Over?" (March 17, 2010)


Professor George Brock, Head of Journalism, City University London

Excerpt 1:
Distilling the effects on news, we can separate out three irreversible shifts:
First, in the quantity of information available. When journalism began, reliable information was
scarce; despite the inaccuracy of much that you can find nowadays, news is in glut.
Second big change: the instant alteration of information. Cable and satellite gave us rolling
24-hour news. The internet allows that to be updated, nuanced, corrected continuously from many
different directions. Those who enjoy this say that news has become a process or a
conversation. Those who do not enjoy this say that news is losing at least some of its authority,
clarity and coherence.
The third change is the most profound. The production and consumption of news has been
decoupled from advertising and its previous sources of income. First and foremost that causes an
economic crisis. The readers and watchers of mass media news have never paid the full cost in
subscription or cover price. In print, advertising is somewhere between half and three-quarters of
the income needed to keep a quality newspaper going. Many newspapers, and particularly local
and regional ones, received the majority of their income from classified ads, usually for jobs,
houses and cars. Those small ads have transferred to the web. Not "will transfer," but "have";
past tense.
Excerpt 4:
Imagine a common enough event in any large city. It is a Friday evening and a young woman on
her way home after a night out is attacked and robbed. The attack is not fatal, but serious
enough to put her in hospital overnight and to involve the police. By the next morning, her social
network will have been alerted. She might have triggered this by tweeting, posting on Facebook
or another social networking site or simply by sending emails or texts from a phone. Before 24
hours have gone by, anything up to several hundred people will have read and discussed the
details. What does the local paper do? A routine police check which might put a few paragraphs
on the website on the Saturday and in Monday evening's edition. It wasn't a murder after all. In
print, more potential readers but less engagement. Conventional news's disadvantage is not
really lack of speed, although it will be slower. It is also that the formal, one-size-fits-all news
arrives in less satisfying form. For this is both a public event (police, hospital and a possible court
case) and a private one belonging to a network. The private version of the news will be swifter,
richer and more detailed and authentic, carried by voices who know each other. As young
consumers of news say nowadays: "If the news is important enough it will find me."
The Exhilaration of Direct Broadcasting
Andrew Sullivan, The Atlantic (2008)
The simple experience of being able to directly broadcast my own words to readers was an
exhilarating literary liberation... It was obvious from the start that (blogging) was revolutionary.
Every writer since the printing press has longed for a means to publish himself and reach,
instantly, any reader on Earth. Every professional writer has paid some dues waiting for an
editor's nod, or enduring a publisher's incompetence, or being ground to literary dust by a legion
of fact-checkers and copy editors. If you added up the time a writer once had to spend finding an
outlet, impressing editors, sucking up to proprietors, and proofreading edits, you'd find another
lifetime buried in the interstices. But with one click of the Publish Now button, all these troubles
evaporated. Reporters and columnists tend to operate in a relative sanctuary, answerable mainly
to their editors, not readers. For a long time, columns were essentially monologues published to
applause, muffled murmurs, silence, or a distant heckle. I've gotten blowback from pieces before
but in an amorphous, time-delayed, distant way. Now feedback is instant, personal, and brutal.
The Cocktail: Social Networks, Live Searching, and Link-Sharing
Steven Johnson, TIME magazine (2009)
Twitter put three elements together (social networks, live searching and link-sharing) and
created a cocktail that poses what may amount to the most interesting alternative to Google's
near monopoly in searching. At its heart, Google's system is built around the slow, anonymous
accumulation of authority: pages rise to the top of Google's search results according to, in part,
how many links point to them, which tends to favor older pages that have had time to build an
audience. That's a fantastic solution for finding high-quality needles in the immense,
spam-plagued haystack that is the contemporary Web. But it's not a particularly useful solution for
finding out what people are saying right now, the in-the-moment conversation that industry
pioneer John Battelle calls the "super fresh" Web. Even in its toddlerhood, Twitter is a more
efficient supplier of the super-fresh Web than Google. If you're looking for interesting articles or
sites devoted to Kobe Bryant, you search Google. If you're looking for interesting comments from
your extended social network about the three-pointer Kobe just made 30 seconds ago, you go to
Twitter.

In-class resources for Thu Oct 15 (class 8)


The rise of digital journalism
Digerati video and reading
(discussion leader: x)
Digerati
Matt Richtel
@mrichtel
BHSEC Q guest author (Fall 2015)
https://en.wikipedia.org/wiki/Matt_Richtel
Video
We will watch and discuss this in class.
Matt Richtel: A Deadly Wandering trailer (2:11)
https://www.youtube.com/watch?v=TcQY14n_Xe4
Reading
We will read and discuss this in class.
A Deadly Wandering
(Matt Richtel, 2014)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
The decline of newspapers
New models for news
The rise of Facebook Snapchat Twitter news
Robot journalists

Attention Must Be Paid


'A Deadly Wandering: A Tale of Tragedy and Redemption in the Age of Attention,' by
Matt Richtel
(Robert Kolker, NYT Book Review, 2014)
Reggie Shaw is the man responsible for the most moving portion of From One Second to the
Next, the director Werner Herzog's excruciating (even by Werner Herzog standards) 35-minute
public service announcement, released last year as part of AT&T's "It Can Wait" campaign
against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in
September 2006 that he crossed the line of a Utah highway, knocking into a car containing two
scientists, James Furfaro and Keith O'Dell, who were heading to work nearby. Both men were
killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he
can't even remember what he was texting about. He is next seen taking part in something
almost inconceivable: He enters the scene where one of the dead men's daughters is being
interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug.
Reggie Shaw's redemptive journey from thoughtless, inadvertent killer to denier of his own
culpability to one of the nation's most powerful spokesmen on the dangers of texting while
behind the wheel was first brought to national attention by Matt Richtel, a reporter for The
New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010.
Now, five years later, in A Deadly Wandering, Richtel gives Shaw's story the thorough,
emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted
driving. As an instructive social parable, Richtel's densely reported, at times forced yet
compassionate and persuasive book deserves a spot next to Fast Food Nation and To Kill a
Mockingbird in America's high school curriculums. To say it may save lives is self-evident.
What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the
morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men
die. The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which
may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first
because he was in denial, or because technology can hijack the brain, polluting his memory. In
short chapters that break up the story of the crash, Richtel delivers the history of cognitive
neuroscience, from its origins in World War II, helping pilots and radar operators save lives by not
being overwhelmed by the technology in front of them, to later M.R.I. brain studies of
multitasking and what came to be called attention science. Richtel presents each scholar,
researcher, study and theory largely without judgment, but the even-handedness has a leveling
effect that makes it hard to know what the author feels is most important. The larger potential
problem here, perhaps, is that generally speaking, the big takeaway of the texting-and-driving
question just isn't that complicated: Put down the frickin' phone.
Still, there are rewards. Richtel explains how researchers have found that distraction is the
antagonist of attention, not its opposite. It's an interesting distinction. Distraction is the devil in
your ear, not always the result of an attention deficit, but borne of our own desires. We are
distracted because we want to be. Why else would they sell so many smartphones? As Richtel
explains, a good gadget is essentially magical, commandeering our focus with delight and
surprise and ease (Steve Jobs used the word "magical" about the iPhone when it debuted). The
smartphone brilliantly exploits both types of attention, "top down" (what we want to focus on)
and "bottom up" (what takes us by surprise). The intimacy of smartphones is, if not addictive,
then certainly seductive. Not all distractions are created equal: The impairment of drunken
driving, for instance, is consistently huge, while the impairment of texting is arguably more
intense but shorter in duration. The researchers Richtel quotes have found that drivers are
impaired for up to 15 seconds after they text, far longer than most drivers would ever think.
The stronger a phone's hold on us, the more money the phone companies can make.
Richtel's account of ways the telecommunications industry originally suppressed safety
concerns over cellphone use while driving is blood-boiling.
Reggie Shaw, meanwhile, is meant to perform as a proxy for a generation that grew up on
Nintendo and personal computers, the all-American boy who always made the right
choices, yet whose digital life encroached on his real one. Shaw grows more interesting when
his quirks shine through. A fundamentally decent teenager, Shaw nevertheless had things he
was ashamed of and family expectations to live up to. His pattern, even before the crash, was to
dissemble in order not to make trouble for those around him. Once the tragedy happened,
Richtel writes, the intensity with which the family undertook the defense had a self-perpetuating
and escalating force: Reggie denied texting, the family backed him up and Reggie,
never someone to let others down, dug deeper.
Richtel locates not one but two Inspector Javert types: the state trooper who responded to the
crash and almost immediately decided Shaw was lying about not texting ("He kind of goes after
people," an attorney says about him), and a victims' advocate named Terryl Warner, whose own
story is every bit as fascinating and redemptive as Shaw's. The prelude to the trial is fascinating:
Should Reggie be charged with negligence or manslaughter, or nothing at all? Even if texting and
driving is wrong, should he have known that? In Richtel's sensitive account, we come face to
face with the horrible Catch-22 of accident litigation that discourages one party from apologizing
to another, for fear of admitting liability. This apparent standoffishness helped persuade the
prosecutor to make Shaw a test case for texting and driving. Which in turn caused Shaw's family
to accuse the prosecutor of waging a witch hunt. Which only appalled the victims' widows and
families and advocates even more.
Richtel displays admirable empathy for everyone involved but reserves a special place in his
heart for Reggie: impassive and forlorn, monosyllabic but tortured, evasive yet sincere. Shaw's
conversion is depicted with revelatory precision, his epiphany realistically subdued and
painstakingly gradual. "The fight seemed to be going out of him bit by bit," Richtel writes, before
the floodgates opened, his private and public selves beginning to reconcile. By the book's end,
Shaw is a raw nerve, unable to stop confessing in speeches around the country. Even the
relatives of those he killed worry he'll never be able to close the floodgates again.
The most powerful question raised by A Deadly Wandering is a simple one: If we know texting
and driving is so bad for us, why do we still do it? Richtel tries out several analogies to describe
the rush we get from a phone: alcohol, drugs, television, video games, junk food, the
fight-or-flight response to a tap on a shoulder. (The television comparison is weakest, perhaps because
so few of the people in the book agree with it.) My favorite analogy of Richtel's is the slot
machine. Our bodies love the little hit of dopamine we get each time we check our phones for
something, anything. And just like a one-armed bandit, more often than not, our phones rarely
offer terribly exciting results when we check them. Even so, that doesn't stop us from coming
back for more dozens of times a day: during movies, out at dinner, on our way to wherever
we're going, unsafe at any speed.

Sunday Story #5
Due Sun Oct 18 (before midnight)
Prompt

medium-richtel

Homework for Tue Oct 20 (class 9)


Guest author: Matt Richtel
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Video
Watch the assigned video on Zaption and post a one-sentence response.
Matt Richtel interviewed by Andrew Keen (11:00)
http://zapt.io/t27t4tvr
Online discussion
Respond to this prompt on Canvas:

Share at least three comments in response to Sunday Story 4 student work

In-class resources for Tue Oct 20 (class 9)
Guest author: Matt Richtel
Digerati video and reading
(discussion leader: x)
Digerati
Jeff Bezos
https://en.wikipedia.org/wiki/Jeff_Bezos
Video
We will watch and discuss this in class.
Jeff Bezos on The Daily Show (5:22)
http://thedailyshow.cc.com/videos/9aq77e/bezos-and-masthead
Reading
We will read and discuss this in class.
The Everything Store: Jeff Bezos and the Age of Amazon
(Brad Stone, 2013)

The Everything Store: Jeff Bezos and the Age of Amazon


Brad Stone (2013)
Within Amazon.com (AMZN) there's a certain type of e-mail that elicits waves of panic. It usually
originates with an annoyed customer who complains to the company's founder and chief executive
officer. Jeff Bezos has a public e-mail address, jeff@amazon.com. Not only does he read many customer
complaints, he forwards them to the relevant Amazon employees, with a one-character addition: a
question mark.
When Amazon employees get a Bezos question mark e-mail, they react as though they've discovered a
ticking bomb. They've typically got a few hours to solve whatever issue the CEO has flagged and
prepare a thorough explanation for how it occurred, a response that will be reviewed by a succession of
managers before the answer is presented to Bezos himself. Such "escalations," as these e-mails are
known, are Bezos's way of ensuring that the customer's voice is constantly heard inside the company.
One of the more memorable escalations occurred in late 2010. It had come to Bezos's attention that
customers who had browsed the lubricants section of Amazon's sexual wellness category were receiving
personalized e-mails pitching a variety of gels and other intimacy facilitators. When the e-mail
marketing team received the question mark, they knew the topic was delicate and nervously put
together an explanation. Amazon's direct marketing tool was decentralized, and category managers
could generate e-mail campaigns to customers who had looked at certain product categories but did not
make purchases. The promotions tended to work; they were responsible for hundreds of millions of
dollars in Amazon's annual sales. In the matter of the lubricant e-mail, though, a low-level product
manager had overstepped the bounds of propriety. But the marketing team never got the chance to
send this explanation. Bezos demanded to meet in person.
At Amazon's Seattle headquarters, Jeff Wilke, the senior vice president for North American retail, Doug
Herrington, the vice president for consumables, and Steven Shure, the vice president for worldwide
marketing, waited in a conference room until Bezos glided in briskly. He started the meeting with his
customary "Hello, everybody," and followed with "So, Steve Shure is sending out e-mails about lubricants."
Bezos likes to say that when he's angry, just wait five minutes, and the mood will pass like a tropical
squall. Not this time. He remained standing. He locked eyes with Shure, whose division oversaw e-mail
marketing. "I want you to shut down the channel," he said. "We can build a $100 billion company
without sending out a single f------ e-mail."
An animated argument followed. Amazon's culture is notoriously confrontational, and it begins with
Bezos, who believes that truth shakes out when ideas and perspectives are banged against each other.
Wilke and his colleagues argued that lubricants were available in supermarkets and drugstores and
were not that embarrassing. They also pointed out that Amazon generated a significant volume of sales
with such e-mails. Bezos didn't care; no amount of revenue was worth jeopardizing customer trust.
"Who in this room needs to get up and shut down the channel?" he snapped.
Eventually, they compromised. E-mail marketing would be terminated for certain categories such as
health and personal care. The company also decided to build a central filtering tool to ensure that
category managers could no longer promote sensitive products, so matters of etiquette were not
subject to personal taste. For books and electronics and everything else Amazon sold, e-mail marketing
lived to fight another day.
Amazon employees live daily with these kinds of fire drills. "Why are entire teams required to drop
everything on a dime to respond to a question mark escalation?" an employee once asked at the
company's biannual meeting held at Seattle's KeyArena, a basketball coliseum with more than 17,000
seats. "Every anecdote from a customer matters," Wilke replied. "We research each of them because
they tell us something about our processes. It's an audit that is done for us by our customers. We treat
them as precious sources of information."
It's one of the contradictions of life inside Amazon: The company relies on metrics to make almost every
important decision, such as what features to introduce or kill, or whether a new process will root out an
inefficiency in its fulfillment centers. Yet random customer anecdotes, the opposite of cold, hard data,
can also alter Amazon's course.

It's easy to forget that until recently, people thought of Amazon primarily as an online bookseller. Today,
as it nears its 20th anniversary, it's the Everything Store, a company with around $75 billion in annual
revenue, a $140 billion market value, and few if any discernible limits to its growth. In the past few
months alone, it launched a marketplace in India, opened a website to sell high-end art, introduced
another Kindle reading device and three tablet computers, made plans to announce a set-top box for
televisions, and funded the pilot episodes of more than a dozen TV shows. Amazon's marketplace hosts
the storefronts of countless smaller retailers; Amazon Web Services handles the computer infrastructure
of thousands of technology companies, universities, and government agencies.
Bezos, 49, has a boundless faith in the transformative power of technology. He constantly reinvests
Amazon's profits to improve his existing businesses and explore new ones, often to the consternation of
shareholders. He surprised the world in August when he personally bought the Washington Post
newspaper, saying his blend of optimism, innovation, and long-term orientation could revive it. One day
a week, he moonlights as the head of Blue Origin, his private rocket ship company, which is trying to
lower the cost of space travel.
Amazon has a few well-known peculiarities: the desks are repurposed doors; meetings begin with
everyone in the room sitting in silence as they read a six-page document called a "narrative." It's a
famously demanding place to work. And yet just how the company works, and what Bezos is like as a
person, is difficult to know.
Bezos rarely speaks at conferences and gives interviews only to publicize new products, such as the
latest Kindle Fire. He declined to comment on this account, saying that "it's too early" for a reflective
look at Amazon's history, though he approved many interviews with friends, family, and senior Amazon
executives. John Doerr, the venture capitalist who backed Amazon early and was on its board of
directors for a decade, calls Amazon's Berlin Wall approach to public relations "the Bezos Theory of
Communicating." It's really just a disciplined form of editing. Bezos takes a red pen to press releases,
product descriptions, speeches, and shareholder letters, crossing out anything that doesn't convey a
simple message: You won't find a cheaper, friendlier place to get everything you need than Amazon.
The one unguarded thing about Bezos is his laugh: a pulsing, mirthful bray that he leans into while
craning his neck back. He unleashes it often, even when nothing is obviously funny to anyone else. And
it startles people. "You can't misunderstand it," says Rick Dalzell, Amazon's former chief information
officer, who says Bezos often wields his laugh when others fail to meet his lofty standards. "It's
disarming and punishing. He's punishing you."
Intensity is hardly rare among technology CEOs. Steve Jobs was as famous for his volatility with Apple
(AAPL) subordinates as he was for the clarity of his insights about customers. He fired employees in the
elevator and screamed at underperforming executives. Bill Gates used to throw epic tantrums at
Microsoft (MSFT); Steve Ballmer, his successor, had a propensity for throwing chairs. Andy Grove, the
former CEO of Intel (INTC), was so harsh and intimidating that a subordinate once fainted during a
performance review.
Bezos fits comfortably into this mold. His drive and boldness trumps other leadership ideals, such as
consensus building and promoting civility. While he can be charming and capable of great humor in
public, in private he explodes into what some of his underlings call "nutters." A colleague failing to meet
Bezos's exacting standards will set off a nutter. If an employee does not have the right answers or tries
to bluff, or takes credit for someone else's work, or exhibits a whiff of internal politics, uncertainty, or
frailty in the heat of battle, a blood vessel in Bezos's forehead bulges and his filter falls away. He's
capable of hyperbole and harshness in these moments and over the years has delivered some
devastating rebukes. Among his greatest hits, collected and relayed by Amazon veterans:
"Are you lazy or just incompetent?"
"I'm sorry, did I take my stupid pills today?"
"Do I need to go down and get the certificate that says I'm CEO of the company to get you to stop
challenging me on this?"
"Are you trying to take credit for something you had nothing to do with?"

"If I hear that idea again, I'm gonna have to kill myself."


"We need to apply some human intelligence to this problem."
[After reviewing the annual plan from the supply chain team] "I guess supply chain isn't doing anything
interesting next year."
[After reading a start-of-meeting memo] "This document was clearly written by the B team. Can
someone get me the A team document? I don't want to waste my time with the B team document."
[After an engineer's presentation] "Why are you wasting my life?"
Some Amazon employees advance the theory that Bezos, like Jobs, Gates, and Oracle (ORCL)
co-founder Larry Ellison, lacks empathy. As a result, he treats workers as expendable resources without
taking into account their contributions. That in turn allows him to coldly allocate capital and manpower
and make hyperrational business decisions, where another executive might let emotion and personal
relationships figure into the equation. They also acknowledge that Bezos is primarily consumed with
improving the company's performance and customer service and that personnel issues are secondary.
"This is not somebody who takes pleasure at tearing someone a new a--hole," says Kim Rachmeler, an
executive who worked at Amazon for more than a decade. "He is not that kind of person. Jeff doesn't
tolerate stupidity, even accidental stupidity."
To the amazement and irritation of employees, Bezos's criticisms are almost always on target. Bruce
Jones, a former Amazon supply chain vice president, describes leading a five-engineer team figuring out
ways to make the movement of workers in fulfillment centers more efficient. The group spent nine
months on the task, then presented their work to Bezos. "We had beautiful documents, and everyone
was really prepared," Jones says. Bezos read the paper, said, "You're all wrong," stood up, and started
writing on the whiteboard.
"He had no background in control theory, no background in operating systems," Jones says. "He only
had minimum experience in the distribution centers and never spent weeks and months out on the
line." But Bezos laid out his argument on the whiteboard, and "every stinking thing he put down was
correct and true," Jones says. "It would be easier to stomach if we could prove he was wrong, but we
couldn't. That was a typical interaction with Jeff. He had this unbelievable ability to be incredibly
intelligent about things he had nothing to do with, and he was totally ruthless about communicating it."
Jones cites another example. In 2002, Amazon changed the way it accounted for inventory, from the
last-in first-out, or LIFO, system to first-in first-out, or FIFO. The change allowed Amazon to better
distinguish between its own inventory and the inventory that was owned and stored in fulfillment
centers by partners such as Toys "R" Us and Target (TGT). Jones's supply chain team was in charge of
this complicated effort, and its software, riddled with bugs, created a few difficult days during which
Amazon's systems were unable to recognize any revenue. On the third day, Jones was giving an update
on the transition when Bezos had a nutter. "He called me a 'complete f------ idiot' and said he had no
idea why he hired idiots like me at the company, and said, 'I need you to clean up your organization,'"
Jones recalls. "It was brutal. I almost quit. I was a resource of his that failed. An hour later he would
have been the same guy as always, and it would have been different. He can compartmentalize like no
one I've ever seen."
Amazon has a clandestine group with a name worthy of a James Bond film: Competitive Intelligence.
The team, which operated for years within the finance department under longtime executives Tim Stone
and Jason Warnick, focuses in part on buying large volumes of merchandise from other online retailers
and measuring the quality and speed of their services: how easy it is to buy, how fast the shipping is,
and so forth. The mandate is to investigate whether any rival is doing a better job than Amazon and
then present the data to a committee of Bezos and other senior executives, who ensure that the
company addresses any emerging threat and catches up quickly.
In the late 2000s, Competitive Intelligence began tracking a rival with an odd name and a strong
rapport with female shoppers. Quidsi (Latin for "what if") was a Jersey City company better known for its
website Diapers.com. Grammar school friends Marc Lore and Vinit Bharara founded the startup in 2005
to allow sleep-deprived caregivers to painlessly schedule recurring shipments of vital supplies. By 2008

the company had expanded into selling baby wipes, infant formula, clothes, strollers, and other survival
gear for new parents. In an October 2010 Bloomberg Businessweek cover story, the Quidsi founders
admitted to studying Amazon closely and idolizing Bezos. In private conversations, they referred to
Bezos as "sensei."
In 2009, Jeff Blackburn, Amazons senior vice president for business development, ominously informed
the Quidsi co-founders over an introductory lunch that the e-commerce giant was getting ready to
invest in the category and that the startup should think seriously about selling to Amazon. According to
conversations with insiders at both companies, Lore and Bharara replied that they wanted to remain
private and build an independent company. Blackburn told the Quidsi founders that they should call him
if they ever reconsidered.
Soon after, Quidsi noticed Amazon dropping prices up to 30 percent on diapers and other baby
products. As an experiment, Quidsi executives manipulated their prices and then watched as Amazon's
website changed its prices accordingly. Amazon's pricing bots, software that carefully monitors other
companies' prices and adjusts Amazon's to match, were tracking Diapers.com.
At first, Quidsi fared well despite Amazon's assault. Rather than attempting to match Amazon's low
prices, it capitalized on the strength of its brand and continued to reap the benefits of strong word of
mouth. After a while, the heated competition took a toll on the company. Quidsi had grown from nothing
to $300 million in annual sales in just a few years, but with Amazon focusing on the category, revenue
growth started to slow. Venture capitalists were reluctant to furnish Quidsi with additional capital, and
the company was not yet mature enough for an initial public offering. For the first time, Lore and
Bharara had to think about selling.
Meanwhile, Wal-Mart Stores (WMT) was looking for ways to make up ground it had lost to Amazon and
was shaking up its online division. Wal-Mart's then-vice chairman, Eduardo Castro-Wright, took over
Walmart.com, and one of his first calls was to Lore to initiate acquisition talks. Lore said Quidsi wanted
to get close to "Zappos money": more than $500 million, plus additional bonuses spread out over many
years tied to performance goals. Wal-Mart agreed in principle and started due diligence. Mike Duke,
Wal-Mart's CEO, visited a Diapers.com fulfillment center in New Jersey.
The formal offer from Bentonville was around $450 million, nowhere near Zappos money.
So Lore picked up the phone and called Amazon. In September 2010, he and Bharara traveled to Seattle
to discuss the possibility of Amazon acquiring Quidsi. While they were in that early morning meeting
with Bezos, Amazon sent out a press release introducing a service called Amazon Mom. It was a sweet
deal for new parents: They could get up to a year's worth of free two-day Prime shipping (a program
that usually cost $79 a year). Customers also could get an additional 30 percent off the
already-discounted diapers if they signed up for regular monthly deliveries as part of a service called
Subscribe and Save. Back in New Jersey, Quidsi employees desperately tried to call their founders to
discuss a public response to Amazon Mom. The pair couldn't be reached: They were still in the meeting
at Amazon's headquarters.
Quidsi could now taste its own blood. At one point, Quidsi executives took what they knew about
shipping rates, factored in Procter & Gamble's (PG) wholesale prices, and calculated that Amazon was
on track to lose $100 million over three months in the diaper category alone.
Inside Amazon, Bezos rationalized these moves as being in the company's long-term interest of
delighting its customers and building its consumables business. He told Peter Krawiec, the business
development vice president, not to spend more than a certain amount to buy Quidsi but to make sure
that Amazon did not, under any circumstance, lose the deal to Wal-Mart.
As a result of Bezoss meeting with Lore and Bharara, Amazon had an exclusive three-week period to
study Quidsis financial results and come up with an offer. At the end of that period, Krawiec offered
Quidsi $540 million and called the number a "stretch price." Knowing that Wal-Mart hovered on the
sidelines, he gave Quidsi a window of 48 hours to respond and made it clear that if the founders didn't
take the offer, the Amazon Mom onslaught would continue.
Wal-Mart should have had a natural advantage. Jim Breyer, the managing partner at one of Quidsi's
venture capital backers, Accel, was also on the Wal-Mart board. But Wal-Mart was caught flat-footed. By
the time it increased its offer to $600 million, Quidsi had tentatively accepted the Amazon term sheet.

Duke left phone messages for several Quidsi board members, imploring them not to sell to Amazon.
Those messages were then transcribed and sent to Seattle, because Amazon had stipulated in the
preliminary term sheet that Quidsi turn over information about any subsequent offer.
When Bezos's lieutenants learned of Wal-Mart's counterbid, they ratcheted up the pressure, telling the
Quidsi founders that "sensei" was such a furious competitor that he would drive diaper prices to zero if
they sold to Bentonville. The Quidsi board convened to discuss the possibility of letting the Amazon deal
expire and then resuming negotiations with Wal-Mart. But by then, Bezos's Khrushchev-like willingness
to use the thermonuclear option had had its intended effect. The Quidsi executives stuck with Amazon,
largely out of fear. The deal was announced on Nov. 8, 2010.
Blackburn, Amazon's mergers-and-acquisitions chief, said in a 2012 interview that everything the
company did in the diapers market was planned beforehand and was unrelated to competing with
Quidsi. He said that Quidsi was similar to shoe retailer Zappos, which Amazon acquired in 2009: a
stubbornly independent company building an extremely flexible franchise.
The Federal Trade Commission scrutinized the acquisition for four and a half months, going beyond the
standard review to the "second request" phase, where companies must provide more information about a
transaction. The deal raised a host of red flags, such as the elimination of a major player in a
competitive category, according to an FTC official familiar with the review. The merger was eventually
approved, in part because it did not result in a monopoly. Costco Wholesale (COST), Target, and plenty
of other companies sold diapers online and off.
Bezos won, neutralizing an incipient competitor and filling another set of shelves in his Everything Store.
Quidsi soon expanded into pet supplies with Wag.com and toys with Yoyo.com. Wal-Mart missed the
chance to acquire a talented team of entrepreneurs who'd gone toe to toe with Amazon in a new
product category. And insiders were once again left marveling at how Bezos had engineered another
acquisition by driving his target off a cliff.
The people who do well at Amazon are often those who thrive in an adversarial atmosphere with almost
constant friction. Bezos abhors what he calls "social cohesion," the natural impulse to seek consensus.
He'd rather his minions battle it out backed by numbers and passion, and he has codified this approach
in one of Amazon's 14 leadership principles, the company's highly prized values that are often
discussed and inculcated into new hires:
Have Backbone; Disagree and Commit
Leaders are obligated to respectfully challenge decisions when they disagree, even when doing so is
uncomfortable or exhausting. Leaders have conviction and are tenacious. They do not compromise for
the sake of social cohesion. Once a decision is determined, they commit wholly.
Some employees love this confrontational culture and find they can't work effectively anywhere else.
"Everybody knows how hard it is and chooses to be there," says Faisal Masud, who spent five years in
the retail business. "You are learning constantly, and the pace of innovation is thrilling. I filed patents; I
innovated. There is a fierce competitiveness in everything you do." The professional networking site
LinkedIn (LNKD) is full of "boomerangs," Amazon-speak for executives who left the company and then
returned.
But other alumni call Amazon's internal environment a "gladiator culture" and wouldn't think of
returning. Many last less than two years. "It's a weird mix of a startup that is trying to be
supercorporate and a corporation that is trying hard to still be a startup," says Jenny Dibble, who was a
marketing manager there for five months in 2011. She found her bosses were unreceptive to her ideas
about using social media and that the long hours were incompatible with raising a family. "It was not a
friendly environment," she says. Even leaving Amazon can be a combative process; the company is not
above sending letters threatening legal action if an employee takes a similar job at a competitor. Masud,
who left Amazon for EBay (EBAY) in 2010, received such a threat. (EBay resolved the matter privately.)
Employee churn doesn't seem to damage Amazon, though. The company, aided by the appeal of its
steadily increasing stock price, is an accomplished recruiter of talent. In its second-quarter earnings
report in July, Amazon said its ranks had swelled to 97,000 full-time and part-time employees, up 40
percent from the year before. New hires are given an industry-average base salary, a signing bonus
spread over two years, and a grant of restricted stock units spread over four years. Unlike Google
(GOOG) and Microsoft, whose stock grants vest evenly year by year, Amazon backloads the vesting
toward the end of the four-year period. Employees typically get 5 percent of their shares at the end of
their first year, 15 percent their second year, and then 20 percent every six months over the final two
years. Ensuing grants vest over two years and are also backloaded to ensure that employees keep
working hard and are never inclined to coast.
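The backloaded schedule described above can be tallied to confirm it adds up to a full grant. This is a minimal sketch using only the percentages stated in the text (5 percent after year one, 15 percent after year two, then 20 percent every six months over the final two years); it is an illustration, not Amazon's actual grant accounting:

```python
# Vesting tranches as described: 5% (end of year 1), 15% (end of year 2),
# then 20% every six months across years 3 and 4 (four tranches).
tranches = [0.05, 0.15, 0.20, 0.20, 0.20, 0.20]

cumulative = 0.0
for i, t in enumerate(tranches, start=1):
    cumulative += t
    print(f"tranche {i}: {cumulative:.0%} of grant vested")

# Only 20% has vested by the end of year two; the remaining 80%
# arrives in the final two years, which is the "backloading".
```

The point of the arithmetic is visible in the halfway mark: an employee who leaves after two years walks away with only a fifth of the grant.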
Managers in departments of 50 people or more are often required to "top-grade" their subordinates on
a curve and must dismiss the least effective performers. As a result, many Amazon employees live in
perpetual fear; those who manage to get a positive review are often genuinely surprised.
There are few perks or unexpected performance bonuses at Amazon, though the company is more
generous than it was in the 1990s, when Bezos refused to give employees city bus passes because he
didn't want to give them any reason to rush out of the office to catch the last bus of the day. Employees
now get cards that entitle them to free rides on Seattle's regional transit system. Parking at the
company's offices in South Lake Union costs $220 a month, and Amazon reimburses employees for
$180. Conference room tables are a collection of blond-wood door-desks shoved together side by side.
The vending machines take credit cards, and food in the company cafeterias is not subsidized. New
hires get a backpack with a power adapter, a laptop dock, and orientation materials. When they resign,
they're asked to hand in all that equipment, including the backpack. These practices are also
embedded in the sacrosanct leadership principles:
embedded in the sacrosanct leadership principles:
Frugality
We try not to spend money on things that don't matter to customers. Frugality breeds resourcefulness,
self-sufficiency, and invention. There are no extra points for head count, budget size, or fixed expense.
Bezos molded Amazon's business principles through two decades of surviving in the thin atmosphere of
low profit margins and fierce skepticism from the outside world. In a way, the entire company is built
around his brain: an amplification machine meant to disseminate his ingenuity and drive across the
greatest possible radius. "It's scaffolding to magnify the thinking embodied by Jeff," says Wilke, the
senior vice president for North American retail. "Jeff was learning as he went along. He learned things
from each of us who had expertise and incorporated the best pieces into his mental model. Now
everyone is expected to think as much as they can like Jeff."
Bezos runs the final meetings in the biannual operating reviews, dubbed OP1 (held over the summer)
and OP2 (after the holidays). Teams work intensely for months preparing for their sessions with the CEO,
drawing up six-page narratives that spell out their plans for the year ahead. A few years ago, the
company refined this process further to make the narratives more easily digestible for Bezos and other
members of his senior leadership group, called the S Team, who cycle through many topics during these
reviews. Now every narrative includes at the top of the page a list of a few rules, called "tenets," that
guide the group's hard decisions and allow it to move fast, without constant supervision.
Once a week, usually on Tuesdays, various departments meet with their managers to review
spreadsheets of data important to their business. Customer anecdotes have no place at these meetings;
numbers alone must demonstrate what's working and what's broken, how customers are behaving, and
ultimately how well the company overall is performing. "This is what, for employees, is so absolutely
scary and impressive about the executive team. They force you to look at the numbers and answer
every single question about why specific things happened," says Dave Cotter, who spent four years at
Amazon as a general manager in various divisions. "Because Amazon has so much volume, it's a way to
make very quick decisions and not get into subjective debates. The data doesn't lie."
The metrics meetings culminate every Wednesday with the Weekly Business Review, one of the
company's most important rituals, which is run by Wilke. Sixty managers in the retail business gather in
one room to discuss their departments, share data about defects and inventory turns, and talk about
forecasts and the complex interactions between different parts of the company.
Bezos does not attend these meetings. He spends more time on Amazon's newer businesses, such as
Amazon Web Services, the streaming video and music initiatives, and, in particular, the Kindle and
Kindle Fire efforts. (Executives joke darkly that employees can't even pass gas in the Kindle buildings
without the CEO's permission.) But Bezos can always make his presence felt anywhere in the company.

After the lubricant fracas of 2010, for example, e-mail marketing fell squarely under his purview. He
carefully monitored efforts to filter the kinds of messages that could be sent to customers, and he tried
to think about the challenge of e-mail outreach in fresh ways. Then, in late 2011, he had what he
considered to be a significant new idea.
Bezos is a fan of e-mail newsletters such as veryshortlist.com, a daily assortment of cultural tidbits from
the Web, and Cool Tools, a compendium of technology tips and product reviews written by Kevin Kelly, a
co-founder of Wired. Both are short, well-written, and informative. Perhaps, Bezos reasoned, Amazon
should be sending a single well-crafted e-mail every week, a short digital magazine, instead of a
succession of bland, algorithm-generated marketing pitches. He asked Shure, the marketing vice
president, to explore the idea.
From late 2011 through early 2012, Shure's group presented a variety of concepts to Bezos. One version
revolved around celebrity Q&As, another highlighted interesting historical facts about products. The
project never progressed; it fared poorly in tests with customers, and several participants remember
the process as being particularly excruciating. In one meeting, Bezos quietly thumbed through the
mock-ups as everyone waited in silence. "Here's the problem with this," Bezos said, according to people
who were present. "I'm already bored." He liked the last concept the most, which suggested profiling a
selection of products that were suddenly hot, such as Guy Fawkes masks and CDs by the
Grammy-winning British singer Adele. "But the headlines need to be punchier," he told the group, which
included the writers of the material. "And some of this is just bad writing. If you were doing this as a
blogger, you would starve."
Finally he turned his attention to Shure, who, like so many other marketing vice presidents throughout
Amazons history, was an easy target.
"Steve, why haven't I seen anything on this in three months?"
"Well, I had to find an editor and work through mock-ups."
"This is developing too slow. Do you care about this?"
"Yes, Jeff, we care."
"Strip the design down, it's too complicated. Also, it needs to move faster!"
Jeff Bezos grew up in a tight-knit family, with two deeply involved and caring parents, Jackie and Mike,
and two close younger siblings, Christina and Mark. Jackie, who gave birth to Bezos just two weeks after
she turned 17, was a towering figure of authority to Bezos and his friends. Mike, also known as Miguel,
was a Cuban immigrant who arrived in America at age 18, alone and penniless, knowing only one
English word: "hamburger." Through grit and determination, he got a college education and climbed
through the ranks of Exxon (XOM) as a petroleum engineer and manager, in a career that took the
Bezos family to Houston, Pensacola, Fla., Miami, and, after Bezos left for college, cities in Europe and
South America.
Yet for a brief period early in his life, before this ordinary if peripatetic childhood, Bezos lived alone with
his mother and grandparents. And before that, he lived with his mother and his biological father, a man
named Ted Jorgensen. Bezos has said the only time he thinks about Jorgensen is when he's filling out a
medical form that asks for his family history. He told Wired in 1999 that he'd never met the man. Strictly
speaking, that's not true: Bezos last saw him when he was 3 years old. And while Bezos's professional
life has been closely studied and celebrated over the last two decades, this story has never been told.
Jorgensen was a circus performer and one of Albuquerque's best unicyclists in the 1960s. A newspaper
photograph taken in 1961, when he was 16, shows him standing on the pedals of his unicycle facing
backward, one hand on the seat, the other splayed theatrically to the side, his expression tense with
concentration. The caption says he was awarded "most versatile rider" in the local unicycle club.
That year, Jorgensen and a half-dozen other riders traveled the country playing unicycle polo in a team
managed by Lloyd Smith, the owner of a local bike shop. Jorgensen's team was victorious in places such
as Newport Beach, Calif., and Boulder, Colo. The Albuquerque Tribune has an account of the event: Four

hundred people showed up at a shopping center parking lot in freezing weather to watch the teams
swivel around in four inches of snow wielding three-foot-long plastic mallets in pursuit of a six-inch
rubber ball. Jorgensen's team swept a doubleheader, 3 to 2 and 6 to 5.
In 1963, Jorgensens troupe resurfaced as the Unicycle Wranglers, touring county fairs, sporting events,
and circuses. They square-danced, did the jitterbug and the twist, skipped rope, and rode on a high
wire. The group practiced constantly, rehearsing three times a week at Smith's shop and taking dance
classes twice a week. "It's like balancing on greased lightning and dancing all at the same time," one
member told the Tribune. When the Ringling Brothers and Barnum & Bailey Circus came to town, the
Wranglers performed under the big top, and in the spring of 1965 they performed in eight local shows of
the Rude Brothers Circus.
Jorgensen was born in 1944 in Chicago to a family of Baptists. His father moved the family to Albuquerque when Jorgensen and his younger brother, Gordon, were in elementary school. Ted's father took a job as a purchasing agent at Sandia Base (today's Sandia National Laboratories), then the largest nuclear weapons installation in the country, handling the procurement of supplies at the base.
In high school, Jorgensen started dating Jacklyn Gise, a girl two years his junior whose father also
worked at Sandia Base. Their dads knew each other. Her father, Lawrence Preston Gise, known to
friends as Preston and to his family as Pop, ran the local office of the U.S. Atomic Energy Commission,
the federal agency that managed the nuclear weapons program after Harry S Truman took it from the
military following World War II.
Jorgensen was 18 and finishing his senior year in high school when Gise became pregnant. She was a sophomore. They were in love and decided to get married. Her parents gave them money to fly to Juárez, Mexico, for a ceremony. A few months later, on July 19, 1963, they repeated their vows at the Gises' house. Because she was underage, both of their mothers signed the application for a marriage license. The baby was born on Jan. 12, 1964. They named him Jeffrey Preston Jorgensen.
The new parents rented an apartment in Albuquerque's Southeast Heights neighborhood. Jackie finished high school, and during the day, her mother, Mattie, took care of the baby. Life was difficult. Jorgensen was perpetually broke, and they had only one car, his cream-colored '55 Chevy. Belonging to a unicycle troupe didn't pay much. The Wranglers divided their fees among all members, with Smith taking a generous cut off the top. Eventually, Jorgensen got a $1.25-an-hour job at the Globe Department Store, part of Walgreens' (WAG) short-lived foray into the promising discount retail market being pioneered at the time by Kmart (SHLD) and Wal-Mart. Occasionally Jackie brought the baby to the store to visit.
Their marriage was probably doomed from the start. Jorgensen had a habit of drinking too much and staying out too late. He was an inattentive dad and husband. Jackie's father tried to help him; he paid his son-in-law's tuition at the University of New Mexico, but Jorgensen dropped out after a few semesters. Preston Gise then tried to get Jorgensen a job with the New Mexico State Police, but Jorgensen wasn't interested.
Eventually, Jackie took the child and moved back in with her parents at Sandia. In June 1965, when the
baby was 17 months old, she filed for divorce. The court ordered Ted to pay $40 a month in child
support. Court records indicate that his income at the time was $180 a month. Over the next few years,
he visited his son occasionally but missed many support payments.
Then Jackie took a job working in the bookkeeping department of the Bank of New Mexico and met
Miguel Bezos, who was working the overnight shift while he attended the University of Albuquerque. On
several occasions when Ted was visiting his son, Bezos would be there, and they avoided each other.
But Ted asked around and heard he was a good man.
In 1968, Jackie called Ted and told him she was marrying Miguel and moving to Houston. She told him he could stop paying child support and asked him not to interfere in their lives. Her father confronted him and made him promise to stay away. Jackie also wanted to give Jeffrey her new husband's surname and let Miguel adopt him. Ted's permission was needed for the adoption. After thinking it over and reasoning that the boy would probably have a better life as the son of Jackie and her new husband, Ted obliged. After a few years, he lost track of the family and then forgot their last name.

If you were to search the world for the polar opposite of sprawling, secretive, powerful Amazon, you might arrive at a small bike shop in Glendale, Ariz., just north of Phoenix. It's called the Roadrunner Bike Center. It sits in a shoebox-shape space in an ordinary shopping center next to the Hot Cutz Salon & Spa and down a ways from a Walmart grocery store. It offers a small selection of premium BMX and dirt bikes from companies such as Giant, Haro, and Redline, brands that carefully select their retail partners and generally do not sell to websites or discount outlets. "The old guy that runs this is always there and you can tell he loves to fix and sell bikes," writes one customer in a typically favorable online review of the store. "When you buy from him he will take care of you. He also is the cheapest place I have ever taken a bike for a service, I think sometimes he runs a special for $30! That's insane!"
A red poster board with the hand-scrawled words "Layaway for the holidays!" leans against the window. Hanging on a wall next to the front counter, there's a framed newspaper clipping with a photograph of a 16-year-old boy with a flattop haircut, standing up on the pedals of his unicycle, one hand on the seat and the other flared daringly out to the side.
I found Ted Jorgensen, Jeff Bezos's biological father, behind the counter of his bike shop in late 2012. I'd considered a number of ways he might react to my unannounced appearance but gave a very low probability to the likelihood of what actually happened: He had no idea what I was talking about. Jorgensen said he didn't know who Jeff Bezos was and was baffled by my suggestion that he was the father of this famous CEO.
I mentioned Jacklyn Gise and Jeffrey, the son they had during their brief teenage marriage. The old man's face flushed with recognition. "Is he still alive?" he asked, not yet fully comprehending.
"Your son is one of the most successful men on the planet," I told him. I showed him some Internet photographs on my smartphone, and for the first time in 45 years, Jorgensen saw his biological son. His eyes filled with sorrow and disbelief.
I took Jorgensen and his wife, Linda, to a steak dinner, and his story tumbled out. When the Bezos family moved from Albuquerque to Houston in 1968 and Jorgensen promised Jackie and her father that he would stay out of their lives, he remained in Albuquerque. He performed with his troupe and took odd jobs. He drove an ambulance and worked as an installer for Western Electric, a local utility.
In his twenties, he moved to Hollywood to help Smith, the Wranglers' manager, start a new bike shop, and then to Tucson, looking for work. In 1972 he was mugged outside a convenience store after buying cigarettes. The assailants hit him with a two-by-four and broke his jaw in 10 places.
[Photo: Ted Jorgensen today]
Then he finally started to take control of his life. In 1974 he moved to Phoenix and quit drinking. Six years later he put together every cent he had and bought the bike shop from its owner. He's run the store ever since, moving it several times, eventually settling into its current location on the northern edge of the Phoenix metropolitan area, adjacent to the New River Mountains. He met Linda at the bike shop. She stood him up on their first date but showed up the second time he asked her out. They've been married for 25 years. Linda says they've been talking privately about Jeffrey and replaying Ted's youthful mistakes for years.
Ted has no other children; Linda has four sons from a previous marriage. All are close with their
stepfather, especially the youngest, Darin Fala, who spent the most time with him growing up. But Ted
never told them that he had another child. He says he was sure he would never see or hear anything
about his son again, so what was the point?
Ted is 69 now and has heart problems, emphysema, and an aversion to the idea of retirement. "I don't want to sit at home and rot in front of the television," he says. He's friendly and, his wife says, deeply compassionate. The store is less than 30 miles from four Amazon fulfillment centers, but if he ever saw Bezos on television or read an article about Amazon, he didn't make the connection. "I didn't know where he was, if he had a good job or not, or if he was alive or dead," he says. The face of his child, frozen in infancy, has been stuck in his mind for nearly half a century.

He says he always wanted to reconnect with Jeffrey, whatever his occupation or station, and seems ashamed that he agreed to stay out of his life all those years ago. "I wasn't a good father or a husband," he says. "It was really all my fault. I don't blame Jackie at all."
When I left Ted and his wife after dinner, they were still in shock and decided that they weren't going to tell Linda's sons. The story seemed too far-fetched. But a few months later, in early 2013, I got a phone call from Fala, a senior project manager at Honeywell (HON) who also lives in Phoenix. Ted, Fala said, had called a family meeting the previous Saturday afternoon. "I bet he's going to tell us he has a son or daughter out there," Fala's wife had guessed correctly.
The gathering was wrenching. "My wife calls me unemotional because she has never seen me cry," Fala says. "Ted is the same way. Saturday was the most emotion I've ever seen out of him, as far as sadness and regret. It was overwhelming." Ted decided he wanted to reach out to the Bezos family and reestablish contact and asked Fala to help him craft letters to Bezos and Jackie.
Curious about Bezos, Fala had watched online clips of the Amazon CEO being interviewed, including one from The Daily Show with Jon Stewart. He was startled to hear Bezos's laugh. He'd heard it before. He grew up listening to it. "He has Ted's laugh!" Fala said in amazement. "It's almost exact."

Homework for Thu Oct 22 (class 10)


Digital debate
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Video
Watch the assigned videos on Zaption and post a one-sentence response to each.
Brokaw: Oliver (7:13)
http://zapt.io/tctavdz7
Bill Maher (4:46)
http://zapt.io/t2uapunp
Online discussion
Respond to this prompt on Canvas:

Debate
Can social media solve real-world problems? (Evgeny Morozov vs. Steven Johnson, New Republic, 2013)
Full Text: Keen vs. Weinberger (WSJ, 2007)
The Book Club: True Enough, Entry 1 of 4 (Farhad Manjoo vs. Steven Johnson, Slate, 2008)

Up for Debate: Can Social Media Solve Real-World Problems?


By Evgeny Morozov and Steven Johnson
In the current issue of The New Republic, Evgeny Morozov offers a critical take on Steven Johnson's Future Perfect: The Case for Progress in a Networked Age, lamenting "the quasi-religion of Internet-centrism." In his response below, Johnson says his book "actually goes out of its way to avoid that kind of naive techno-determinism." And Morozov, in a rebuttal, concludes that "Johnson doesn't understand the substance of my critique."
STEVEN JOHNSON:
Anyone worried that Chris Hughes's ownership of The New Republic would turn the venerable publication into a vehicle for Internet boosterism will be delighted to read Evgeny Morozov's new essay, "Not by Memes Alone," running this week in the first official issue of the Hughes reign. Morozov's essay is ostensibly a ten-page dismantling of my argument for peer progressive politics in Future Perfect, and like almost everything Morozov writes, it's a smart and entertaining piece. He has a very astute riff on the dangers of what he calls "solutionism" in my work, and rightly observes that Future Perfect contains very little discussion of struggle or conflict, both of which strike me as being important critiques of the book.
Unfortunately, the bulk of the essay is a screed against what Morozov calls the quasi-religion of "Internet-centrism," a movement that won't be content until every institution is reinvented as a decentralized network fashioned after the Web or Wikipedia. This is not a new theme for Morozov, but it's the first time he has targeted my work as a proponent of this dangerous new faith. I have to admit, everything that Morozov says about the dangers and limitations of the Internet centrists seems utterly convincing to me, and if I ever get a chance to meet some of these cultists, I will be sure to persuade them of the error of their ways. But using Future Perfect as a launchpad to renounce Internet centrism is a strange choice, since the book actually goes out of its way to avoid that kind of naive techno-determinism. This forces Morozov to make a number of awkward and misleading moves so that the book sounds more doctrinaire than it actually is.
Some of those moves border on factual errors. Start with Morozov's treatment of New York's 311 system, which I endorse in the book as a successful example of decentralized peer networks being used to solve complex social problems. Morozov observes:
But Johnson is completely blind to the virtues of centralization. In discussing 311, he lauds the fact that tipsters calling the hotline help to create a better macro-level view of city problems. But this is a trivial insight compared with the main reason why 311 works: Mayor Bloomberg's decision to centralize (not decentralize) previous models of reporting tips... Johnson's Internet-centric worldview is so biased toward all things decentralized... that he completely misses the highly centralized nature of 311.
Here's me from the chapter on 311 in Future Perfect:
It should be said that 311 is not a purely decentralized system. There are both literal and figurative headquarters, where the call center is located. In this sense, it is a hybrid form, somewhere between the pure peer network and the older state model. The 311 service vastly increases the number of participants in the system, and gives them the opportunity to set priorities for the city's interventions. But those interventions are still triggered via a top-down mechanism. To a certain extent, that top-down element may be inevitable.
I think Morozov may be confused about the meaning of the word "completely." Or perhaps this is just some kind of auto-correct mistake: where he typed "completely blind to," he meant to type "is perfectly aware of and openly acknowledges."
In another section, Morozov writes: "For all his talk of political philosophy, Johnson makes no effort to ask even basic philosophical questions. What if some limits to democratic participation in the pre-Wikipedia era were not just a consequence of high communication costs but stemmed from a deliberate effort to root out populism, prevent co-optation, or protect expert decision-making?" And yet half of one chapter is devoted to the problems with direct democracy, including an extended discussion of the way the founders framed those problems in the Federalist Papers. Morozov is free to disagree with my answers, but it is simply incorrect that I make no effort to ask the questions.

But enough about Morozov ignoring my words. The most revealing omission in the review revolves around his words. Future Perfect has a chapter called "What Does the Internet Want?" which Morozov predictably enough invokes as a telltale sign of Internet-centrism:
The totalizers would happily follow Johnson in seeking answers to questions such as "So what does the Internet want?", as if the Internet were a living thing with its own agenda and its own rights.
The problem with this diagnosis is that the chapter is explicitly about the difficulty of imagining the Internet as a unified positive force. It points out that decentralized architectures can be used to build terrorist networks as readily as crowdfunded charity initiatives. Consider this crucial passage from the chapter:
Perhaps it was a mistake to treat the Internet as a deterministic one-directional force for either global liberation or oppression, for cosmopolitanism or xenophobia. The reality is that the Internet will enable all of these forces, as well as many others, simultaneously. But as far as "laws of the Internet" go, this is all we know. Which of the numerous forces unleashed by the Web will prevail in a particular social and political context is impossible to tell without first getting a thorough theoretical understanding of that context.
You'd think that Morozov would want to mention that passage from "What Does the Internet Want?", if only because the words were written by Morozov himself, in his earlier book The Net Delusion. I quoted them at a very prominent early place in the chapter, precisely to make it clear that easy generalizations about "the logic of the Internet" were prone to failure. The whole chapter is a meditation on avoiding the pitfalls of naive tech essentialism; its answer to the question "what does the Internet want" is: a lot of contradictory things. But Morozov is so keen to denounce Internet-centrism that he doesn't even seem to notice when his own words are being invoked enthusiastically as a critique of Internet-centrism. Now, it would be perfectly reasonable to argue that my critique doesn't go far enough, or that I've misinterpreted Morozov's position, or invoked it in bad faith. But instead, Morozov just charges ahead as if I haven't engaged with his argument at all.
The argument that Morozov wants to make here is that we Internet-centrists (a group that apparently also includes Clay Shirky and Yochai Benkler) begin with our one true devotion to TCP/IP, and then conveniently backfill a history of lower-tech antecedents in order to justify our love, as Madonna might say. You wouldn't suspect it from Morozov's review, but the discussion of the Internet makes up only a fraction of Future Perfect's content. He does manage to allude to my section on participatory budgeting in Brazil, but the book also includes long riffs on the prize-backed challenges offered by the Royal Society of the Arts in the mid-1700s; the "democracy vouchers" solution for campaign finance; the extraordinary rise in aviation safety over the past thirty years; the internal organization of corporations; childhood malnutrition in Vietnam; and so on.
These stories hail from very different historical and conceptual frames, but they share two important
qualities: they are all directly related to the peer progressive worldview, and they have nothing to do
with the Internet, or computers in general.
I can understand why Morozov wants to see Internet-centrism in my work: He's built his career around debunking that belief system, after all. And yes, I'm glad the Internet and the Web were invented; I think that the world is, on the whole, better off for their existence. I would be surprised if Morozov doesn't feel that way himself. But Future Perfect goes to great lengths to separate the promise of peer networks from some naive faith in Internet liberation. The main lines of its argument arose in part out of two book-length studies of peer collaboration in the 18th and 19th centuries: The Ghost Map and The Invention of Air. My last book, Where Good Ideas Come From, ended with a survey of hundreds of peer-produced innovations from the Renaissance to today. The deep roots of the idea date back to reading Jane Jacobs on the organized complexity of the city in my twenties, which ultimately led to my arguments for decentralization in my 2001 book Emergence. I'm giving Morozov the benefit of the doubt that he just hasn't bothered to read any of those books, since he doesn't mention them anywhere in the review. But if you added up all the words I've published on peer network architecture, I wager somewhere around 90 percent of them are devoted to pre-digital forms of collaboration: in the commonplace book or the 18th-century coffeehouse, or urban neighborhood formation, or the traditions of academic peer review, or in the guild systems of Renaissance Florence. If Morozov were only a little less obsessed with the Internet himself, he might have some very interesting things to say about that history. Instead, he has decided to reduce that diverse web of influences into a story of single-minded zealotry. He's like a vampire slayer who has to keep planting capes and plastic fangs on his victims to stay in business.
The point I tried to make explicit in Future Perfect is one that I've been implicitly making for more than a decade now: that peer collaboration is an ancient tradition, with a history as rich and illustrious as the more commonly celebrated histories of states or markets. The Internet happens to be the most visible recent achievement in that tradition, but it is hardly the basis of my worldview. And there is nothing in Future Perfect (or any of these other works) that claims that decentralized, peer-network approaches will always outperform top-down approaches. It's simply a question of emphasis. Liberals can still believe in the power and utility of markets, even if they tend to emphasize big government solutions; all but the most radical libertarians think that there are some important roles for government in our lives. Peer progressives are no different. We don't think that everything in modern life should be re-engineered to follow the logic of the Internet. We just think that society has long benefited from non-market forms of open collaboration, and that there aren't enough voices in the current political conversation reminding us of those benefits. For peer progressives, the Internet is a case study and a role model, yes, but hardly a deity. We would be making the same argument had the Internet never been invented.
EVGENY MOROZOV:
In his response, Steven Johnson raises four main objections to my review:
Objection I: Johnson claims that he is not an Internet-centrist because 1) Future Perfect went "out of its
way to avoid ...naive techno-determinism" and 2) one of the book's chapters is about "the difficulty of
imagining the Internet as a unified positive force."
Objection II: Johnson claims that his book is not really about "the Internet," as it also discusses "Royal
Society of the Arts in the mid-1700s; the 'democracy vouchers' solution for campaign finance; the
extraordinary rise in aviation safety over the past thirty years; the internal organization of corporations;
childhood malnutrition in Vietnam"; these stories "have nothing to do with the Internet."
Objection III: Johnson claims that he is, in fact, making an effort to engage with political philosophy, as evidenced by his discussion of the limitations of direct democracy.
Objection IV: Johnson claims that I mischaracterize his position on New York's 311 service.
All four objections lead me to conclude that Johnson doesn't understand the substance of my critique.
Let's begin with his sly conflation of Internet-centrism with what he dubs "naive techno-determinism."
As I state in the review's second paragraph, Internet-centrists have no problem acknowledging that the
"Internet" can be deployed to do bad, evil things. The kind of naive determinism that views the
Internet as a "positive force" and that Johnson seeks to distance himself from has nothing to do with
Internet-centrism; it's a feature more commonly attributed to cyber-utopianism, as I clearly state at the
very beginning of the review. That Johnson is not a starry-eyed techno-determinist doesn't make him
less of an Internet-centrist.
What should we make of Johnson's question, on Page 120, of "What does the Internet want?" It's a question that he derives from Kevin Kelly's question, "What does technology want?"; both Kelly and Johnson assume that there is some kind of neat intellectual and practical coherence to these two ideas, a view that I vehemently oppose. This question does allow us to make the utopian/centrist distinction even sharper: An Internet-centrist asks the question "What does the Internet want?" as if that question made sense. An Internet-utopian doesn't even ask that question, assuming that the Internet wants democracy and freedom. I don't know if Johnson is an Internet utopian but he is certainly an Internet-centrist. So while it's clear Objection I doesn't stand, let us still examine Johnson's answer:
So what does the Internet want? It wants to lower the costs for creating and sharing information. The
notion sounds unimpeachable when you phrase it like that, until you realize all the strange places that
kind of affordance ultimately leads to. The Internet wants to breed algorithms that can execute
thousands of financial transactions per minute, and it wants to disseminate the #occupywallstreet
meme across the planet. The Internet wants both the Wall Street tycoons and the popular insurrection
at its feet.

I leave it to the reader to decide if this passage implies that there's a certain logic to the Internet; my reading is that Johnson does believe this while also arguing that the exact manifestations of that logic would be different in each and every context, which, if one closely looks at the Shirky quote about the "logic of the Internet" that I mention in the review, is very much in line with Internet-centric thinking.
But it might be useful to step back and ask whether the very fact of bringing "the Internet" into our explanatory accounts is enhancing or impoverishing our understanding of the technological world that we inhabit. Are we gaining anything by lumping the algorithms used in high-frequency trading with a very different set of algorithms that Twitter uses to decide on its popular trends, while using the sexy but highly elusive label of "the Internet" to do all that lumping? I don't think so, which is why I've been calling for a highly particularized approach to studying digital technologies, one that would treat each of them on their own terms without having to smuggle in some abstract, macro-level concept such as "the Internet" to smooth over the rough empirical and theoretical edges. (As I point out in one of the footnotes, game theorist Ian Bogost has a related but clunky term for this method; he calls it "media microecology.")
Second, I'm well aware that Johnson sees the same spirit of peer progressivism that he believes to be at work in the Internet also sweeping through modern-day Vietnam, through Britain of the mid-1700s and through half a dozen other non-Internet contexts. But this is exactly what I have criticized him for! This tendency to travel back in time and rummage through other contexts and eras in search of some imaginary proto-Internet, which can then be repackaged in a sexy ideology like "peer progressivism," is precisely what I find so problematic about Johnson's work in particular and Internet-centrism in general. (As I put it in my review: "Once the elusive logic of the Internet has been located, it is not uncommon to see Internet-centrists move to deflate its actual novelty.") One cannot refute an accusation of Internet-centrism by proclaiming one's adherence to one of its key principles!
To understand the role that the notion of "the Internet" plays in Johnson's argument, consider a simple thought experiment. Remove the Internet and all its affiliated projects, from Kickstarter to Wikipedia, from a long list of examples of Johnson's "peer progressivism" (from childhood malnutrition in modern-day Vietnam to the hurdles of navigation in 18th-century Britain) and see how far you'll go in convincing your readers that these examples amount to an original political philosophy, so original, in fact, that it deserves the fancy label of "peer progressivism." If a book does come out of this, my bet is that you'll have to self-publish. Virtually every political idea that Johnson articulates in Future Perfect has been with us for decades, and it's precisely the vague, lazy and innovation-obsessed culture of our Internet debate that lets Johnson get away with inventing an original theory without doing his homework.
What allows Johnson to cut so many intellectual corners is his ability to capitalize on everyone's
excitement about the Internet. He does so by selecting those pieces from its rich and ambiguous
history that fit his overall peer progressive narrative while turning non-Internet history into a fishing
expedition that supplies intellectual gravitas to the carefully selected Internet anecdotes that define
what peer progressivism is all about. No wonder all his historical connections make sense: his
argument is designed that way! On these grounds, I suspect that Objection II doesn't stand either.
As for Johnson's peeve that I unfairly attack him for not engaging with political philosophy, look no further than "peer progressivism." What, one might ask, is new about this political ideology? According to Johnson, at least two things. First, its adherents believe that there are some areas of expertise where the public, or the crowd, are more knowledgeable than the experts. Second, peer progressives, unlike all those pre-peer progressives, don't have to choose between the state and the market; the two can co-exist, tapping into networks of crowd expertise along the way.
My problem lies not so much with the thrust of these two propositions; both are quite sensible. Rather,
my problem is with the manner in which Johnson arrives at them, the fuzzy language that he deploys in
the process, the revolutionary novelty that he ascribes to his own insights, and the carelessness with
which he treats decades of serious thinking on this subject. Johnson, comfortably ensconced in his
Internet-centric bubble, seems to sincerely believe that no one had ever thought about ways to make
democratic politics more participatory before the onset of blogs, chats, and social networks. This, of
course, is nonsense. The most unfortunate consequence of Johnson's project might be that, in his half-baked efforts to make a case for peer progressivism, he might undermine public support for more serious government reforms that are not as excited about the Internet but have developed sophisticated theories about involving crowds and networks in both deliberative and participatory processes.
So what does Johnson omit? Quite a few things, in fact. The idea that progressive politics can be combined with market-oriented and decentralized solutions was already in circulation, in the writings of Samuel Bowles and Herbert Gintis but also of Joshua Cohen and Charles Sabel, by the end of the last century (and, more recently, in the work of scholars like Archon Fung). Here are, for example, Cohen and Sabel, writing in 1997, on how governments can become more participatory and profit from more decentralized ways of knowledge aggregation: "Instead of seeking to solve problems, the agencies [would] see their task as reducing the costs of information faced by different problem-solvers: helping them to determine which deliberative bodies are similarly situated, what projects those bodies are pursuing, and what modifications of those projects might be needed under local conditions..." Here is Cohen in another essay written at the time: "The availability of alternative methods of problem-solving imposes on legislatures a greater burden in justifying their own direct efforts: They must explicitly make the case that the benefits of those efforts suffice to overcome the advantages of direct-deliberative solutions."
These two quoteswritten a good decade before buzzwords like open government hijacked the public
conversationpack more reform wisdom than Johnson's entire book. But Johnson prefers to ignore
virtually everything written on this subjectthe faux novelty of the Internet licenses him to such
frivolity. So he completely ignores Josiah Ober, who has made a fascinating use of Hayek, game theory
and political philosophy to argue that the democracy of classical Athens was so effective because it
deployed highly innovative and decentralized schemes of aggregating the knowledge of its citizens. Nor
does he mention a new strand of scholarship on the political implications of cognitive diversitybest
exemplified by the work of Jon Elster and Helene Landemorewhich has advanced sophisticated,
context-sensitive arguments about ways to bring more diverse voices into democratic policy-making. All
these efforts start from where reform proposals ought to start: they acknowledge the complexity of the
problem that they are trying to tackle and only then do they work their way to their preferred solution.
This is not the case with Johnson, who starts with his preferred solution -- the Internet -- and then
searches for problems that it can help him solve. Yes, it's nice to see him quote the Federalist Papers but
I hope he's at least aware that this is hardly the latest word on innovations in participatory governance.
So Proposition III doesn't stand either.
Finally, did I misread Johnson's treatment of 311? I don't think so. My claim is that Johnson is
fascinated by 311 because he views it through the Internet-centric lens of Wikipedia and
other seminal Internet projects. That lens assumes an argument that goes something like this:
encyclopedias used to be centralized and now they are decentralized. Likewise, tip-reporting systems
used to be centralized and now they are decentralized, with hierarchies being replaced by peer
networks. But is it actually true? Were tip-reporting systems ever centralized? Was there ever a
competent Big Brother, perhaps in the form of some omniscient inspector -- that proverbial expert
hated by Internet-centrists -- who was tasked with tracking all of New York's problems and who now,
thankfully, has been replaced by the crowds? Perhaps there was such an expert a long time ago, but
involving crowds in the process of reporting incidents and crimes has a very long history that is not very
relevant to the 311 project. The 311 project was not about replacing centralized experts with
decentralized crowds; it was about turning a slew of previously decentralized tip-reporting systems and
hotlines -- systems that already relied on crowds -- into one highly centralized system.
Is this what Johnson means when he writes that 311 is not a purely decentralized system and that a
top-down element may be inevitable? No, it isn't: what he means here is that while, under 311, the
inputs -- the tips -- might still come from decentralized sources, it still takes a centralized system --
some city agency -- to deal with the reported problem. But it would be silly to think otherwise -- not
unless we expect New Yorkers to bypass various city agencies and fix problems on their own.
Johnson completely misses what's novel about the story he is discussing -- that the 311 hotline works
because it centralized many different hotlines under one roof -- and focuses on the part of the story
that fits his Internet-centric view of peer progressivism -- namely, that 311 works because many
people report tips to it, in much the same way that many people edit Wikipedia. But to believe this is to
miss the fact that many people were already reporting tips to New York's various hotline systems even
before 311! Thus, I don't think that Objection IV should be allowed to stand either.
Now, there's one point I must concede to Johnson. I fully agree with him when he writes that "the point I
tried to make explicit in Future Perfect is one that I've been implicitly making for more than a decade
now." This hasn't escaped my attention; the original version of my review ran at 8,000 words and
contained a long section situating Future Perfect in Johnson's entire oeuvre -- a section that had to be
cut for space reasons. (Given that I managed to keep a reference to one of his little-known essays from
2005, it is a bit unfair to accuse me of not being familiar with his work.) But I do agree with the thrust of
Johnson's remarks: he has, in fact, managed to write yet another book about the same subject -- let's
just call it "buzz" -- this time invoking the notion of the "Internet" to justify the publication. In fact, a
close analysis of the source material for Future Perfect reveals that it's based on many essays and blog
posts that Johnson had penned before the idea of peer progressivism took hold of his imagination.
This is the same trick Johnson pulled with his turn to "innovation" in his previous book, Where Good
Ideas Come From, and with neuroscience and sociobiology in his earlier works. Fortunately, we know
how Johnson goes about deciding what specific intellectual form to give to this buzz in his future
projects. Emphasizing the useful feedback that Johnson got from speaking to the clients booked through
his agency, Bill Leigh, his speaking agent, recently told New York magazine that Johnson wanted to
"take his book sales to the next level... Out of those conversations [with clients] came his decision to
slant his material with a particular innovation feel to it." Where good ideas come from remains a
mystery; where lucrative ideas come from, everybody knows. It's surprising that it has taken Johnson so
long to discover one such lucrative idea in the Internet.
Full Text: Keen vs. Weinberger
This is the full text of a "Reply All" debate on Web 2.0 between authors Andrew Keen and David
Weinberger.

Mr. Keen begins: So what, exactly, is Web 2.0? It is the radical democratization of media which is
enabling anyone to publish anything on the Internet. Mainstream media's traditional audience has
become Web 2.0's empowered author. Web 2.0 transforms all of us -- from 90-year-old grandmothers to
eight-year-old third graders -- into digital writers, music artists, movie makers and journalists. Web 2.0 is
YouTube, the blogosphere, Wikipedia, MySpace or Facebook. Web 2.0 is YOU! (Time Magazine's Person of
the Year for 2006).
Is Web 2.0 a dream or a nightmare? Is it a remix of Disney's "Cinderella" or of Kafka's "Metamorphosis"?
Have we -- as empowered conversationalists in the global citizen media community -- woken up with the
golden slipper of our ugly sister (aka: mainstream media) on our dainty little foot? Or have we -- as
authors-formerly-known-as-the-audience -- woken up as giant cockroaches doomed to eternally stare at
our hideous selves in the mirror of Web 2.0?
Silicon Valley, of course, interprets Web 2.0 as Disney rather than Kafka. After all, as the sales and
marketing architects of this great democratization argue, what could be wrong with a radically flattened
media? Isn't it dreamy that we can all now publish ourselves, that we each possess digital versions of
Johannes Gutenberg's printing press, that we are now able to easily create, distribute and sell our
content on the Internet? This is personal liberation with an early 21st Century twist -- a mash-up of the
countercultural Sixties, the free market idealism of the Eighties, and the technological determinism and
consumer-centricity of the Nineties. The people have finally spoken. The media has become their
message and the people are self-broadcasting this message of emancipation on their 70 million blogs,
their hundreds of millions of YouTube videos, their MySpace pages and their Wikipedia entries.
Yes, the people have finally spoken. And spoken. And spoken.
Now they won't shut up. The problem is that YOU! have forgotten how to listen, how to read, how to
watch. Thus, the meteoric rise of Web 2.0's free citizen media is mirrored by the equally steep decline in
paid mainstream media and the mass redundancies amongst journalists, editors, recording engineers,
cameramen and talent agents. Newspapers and the music business are in structural crisis, Hollywood
and the publishing business aren't far behind. We've lost truth and interest in the objectivity of
mainstream media because of our self-infatuation with the subjectivity of our own messages. It's what,
in "Cult of the Amateur," I call digital narcissism. A flattened media is a personalized, chaotic media
without the essential epistemological anchor of truth. The impartiality of the authoritative,
accountable expert is replaced by the murkiness of the anonymous amateur. When everyone claims to be
an author, there can be no art, no reliable information, no audience.
Everything becomes miscellaneous. And miscellany is a euphemism for anarchy.
That's the dark side of the Web 2.0 story, more Kafka than Disney. While we are all busy embracing our
inner user-generated-content, the world -- real life rather than Second Life -- is passing us by. This is
infantilized self-stimulation rather than serious media for adults. Web 2.0's democratization of
information and entertainment is creating a generation of media illiterates. That's the nightmare. And
it's easy to see. Just go online and look at YouTube, the blogosphere, Wikipedia, MySpace or Facebook.

Mr. Weinberger responds: You're right. The Web is a problem. It has been from the beginning and it
always will be.
But your dichotomy is false. The Web isn't Cinderella facing Gregor "The Cockroach" Samsa in a
deathmatch. Despite Time -- which, as a pillar of the mainstream press is of course free of the
hyperbole so common on the Web -- the Web isn't even You. It's us. And that is the problem.
Your wildly unflattering picture of life on the Web could also be painted of life before the Web. People
chatter endlessly. They believe the most appalling things. They express prejudices that would peel the
paint off a park bench. They waste their time watching endless hours of TV, wear jerseys as if they were
members of the local sports team, are fooled by politicians who don't even lie convincingly, can't find
Mexico on a map, and don't believe humans once ran with the dinosaurs. So, Andrew, you join a long list
of those who predict the decline of civilization and pin the blame on the latest popular medium, except
this time it's not comic books, TV, or shock jock radio. It's the Web.
This time, of course, you might be right...especially since you and I seem to agree that the Web isn't yet
another medium. Something important and different is going on.
We also agree that the Web is a problem. The problem endemic to the Web even before anyone gave
the Web version numbers -- and the problem that leads to your issue with "cockroaches" -- is that
because anyone can contribute and because there are no centralized gatekeepers, there's too much
stuff and too many voices, most of which any one person has no interest in. But, the Web is also the
continuing struggle to deal with that problem. From the most basic tools of the early Internet, starting
with UseNet discussion threads, through Wikipedia, and sites that enable users to tag online resources,
the Web invents ways to pull together ideas and information, finding the connections and relationships
that keep the "miscellaneous" from staying that way.
But, why should we trust the way "monkeys" (as you refer to Web users in your book) connect the
pieces? We shouldn't trust them blindly. Open up The Britannica at random and you're far more likely to
find reliable knowledge than if you were to open up the Web at random. That's why we don't open up
the Web at random. Instead, we rely upon a wide range of trust mechanisms, appropriate to their
domain, to guide us. Amazon gives you ways of checking to see if a particular reviewer is trustworthy,
but the mechanisms are not particularly rigorous because not all that much is at stake when considering
the 6,001st review of a Harry Potter book. At eBay, where your money is at risk, the trust mechanisms
are more reliable. On a blog, the persistence of previous posts means you can read further to see if you
trust the blogger. More important, the recommendation of other bloggers you already trust is a good
indicator. At Wikipedia, the rather sophisticated governance processes help establish trust, as does the
complete transparency of the discussions behind the articles. On mailing lists, we learn over time who's
a blowhard and who's a source of knowledge even if we don't know what her real name is. These
examples are not exceptions. They are the rule and they have been from the beginning, because from
the beginning the Web has been about inventing ways to make its own massness -- its
miscellaneousness -- useful.

Compare that to the previous generation of media. The traditional media are not Cinderella to the Web's
cockroach, and not just because the traditional media have their own cockroaches. The Web is far
better understood as providing more of everything: More slander, more honor. More porn, more love.
More ideas, more distractions. More lies, more truth. More experts, more professionals. The Web is
abundance, while the old media are premised -- in their model of knowledge as well as in their
economics -- on scarcity.
Amateurs aren't driving out the pros, Andrew. The old media are available on line. If some falter, other
credentialed experts will emerge. But the criteria governing our choice of whom to listen to are
expanding from "Those are the only channels I get" and "I read it in a book" to "I've heard this person
respond intelligently when challenged," "People I respect recommend her," and even "A mob finds this
person amusing." This is the new media literacy, suited to the new abundance.
Will we choose wisely? Compared to what? We are never going to be a species of Solons, moved only by
higher thoughts and the finer emotions. But the history of the Web so far says that we are highly
motivated to come up with ways to make sense of a world richer and more interesting than the
constrained resources of the traditional media let on.
So, Andrew, a question for you. You bemoan the loss of "the essential epistemological anchor of truth"
and the "impartiality of the authoritative, accountable expert." It's easy to agree with that when it
comes to facts, the sort of stuff we consult almanacs for. But when it comes to the more important and
harder issues, where we want to understand our world -- science, politics, the arts -- are you quite as
comfortable with the notion that there are identifiable epistemological anchors? Or is your epistemology
in fact rooted in the scarcity that has silently shaped the traditional media?

Mr. Keen: I agree that the Web is us. It's a mirror rather than a medium. When we go online, we are
watching ourselves. So the question is do we want to be looking at ourselves as our best (Cinderella) or
our worst (the giant cockroach)? My point is that what appears to the Web 2.0 crowd to be a Disney
production is actually a Kafka remix.
You are right that people have always chattered endlessly about the silliest things. But the self-publishing Internet is the greatest of great seductions. Web 2.0 tells us that we all have something
interesting to say and that we should broadcast it to the world. As I argue in my book, Web 2.0
transforms us into monkeys. :-) That's the new abundancy, the long tail, if you like. Infinite primates
with infinite messages on infinite channels. The only good news is that broadband is still pathetically
slow. But what happens when fiber-to-the-home becomes a reality for all of us? What happens when the
monkeys have the technology of the Gods at their paw tips? Media will be transformed into ubiquitous
chatter -- into an audio-video version of Twitter.
Yes, the web does represent an abundancy of everything -- "more porn, more love, more ideas, more
distractions." This is fascinating to a philosopher of knowledge like yourself, but for mere mortals who
rely on their media to "understand the world", new digital abundance will lead to intellectual poverty.
The more we know, the less we will know. You see, to use this chaotic media efficaciously, we need to
invent our own taxonomies -- which isn't realistic for the majority of ordinary people (seeking to
understand the world) who think a "taxonomy" is something that drives us to the airport.
Meanwhile, traditional scarcity is getting scarcer. We've always had a scarcity of seriousness, of talent,
of the artist/intellectual able to monetize their expertise. As you know better than most, it's hard work
thinking up, writing, selling and then marketing a good book (both "Cluetrain" and "Everything is
Miscellaneous" are really good, albeit wrong). Traditional media has done a good job in discovering,
polishing and distributing that talent. But once everything is flattened, when books are digitalized, when
libraries become adjuncts of Google, when writers are transformed into sales and marketing reps of
their own brands, then what?
Which brings me back to your question about epistemological anchors. I value people like yourself
who are able to package up interesting arguments in a physical product which has monetary exchange
value. You do a great job helping your readers understand their world, and they do a great job buying
your book, thereby allowing you to pay your mortgage and write more books, thereby helping more
people understand their world. My concern is that this scarcity, the scarcity of the intellectual authority
able to help people understand the world, is indeed endangered -- particularly if the physical book goes
the way of the physical CD and the physical newspaper. So let me end with a question to your question.
Are you convinced that Web 2.0 is of benefit to traditional intellectuals like yourself? Are you confident
that, in a flattened media in which authors give away their books for free and collect their revenue on
the back-end, the David Weinberger 2.0 of the future will flourish (or even survive)?

Mr. Weinberger: When you claim the Web is a "Kafka remix," you can't mean that everything on the Web
is bad, if only because, well, you have your own blog, which is good but wrong. :) So, you must mean
that the preponderance of what's on the Web is bad, as bad as cockroaches. And, as I said, I suspect
you're right. That'd be a problem if we had no way of locating what's of value. But we do. Lots of ways.
More ways every day, as I described earlier.
Rather than re-treading that ground, let's talk about the nature of talent, and why you see monkeys and
cockroaches everywhere you look on the Web.
You and I agree that genuine talent is scarce and needs nurturing. But your picture of talent is formed
by the binary view the traditional media have forced on us. Because it's been so expensive to produce,
market and distribute cultural products (books, records, films), the lucky few who get published get
access to a mass audience, and the rest trail off the map. So, traditional distribution makes it look like
talent is a you-got-it-or-you-don't proposition -- you're an artist or you're a monkey. That doesn't reflect
the scarcity of talent so much as the scarcity of distribution, a result of the high cost of delivering the
first copy of a mass-produced item.
In fact, we have every reason to believe that talent is distributed in a far smoother (but still steep)
curve. My friend Joe is an amazing guitarist, but he's not the best guitarist around. Neither is my
sister-in-law Maria the best singer in the world, but she's good, and you would spend an enjoyable, and
sometimes moving, night listening to her in the local chorus. Talent is not either/or. Recording contracts
are.
With the Web, we can still listen to the world's greatest, but we can find others who touch us even
though their technique isn't perfect.
Note the "we can find." We couldn't if finding required creating our own taxonomies, as you say. Instead,
we rely on (1) Taxonomies created by experts (newspapers that categorize their stories, stores that
categorize their offerings); (2) Computer-assisted ways of locating what's relevant (search engines); and
(3) Recommendations made by people we trust. We're getting better at all of these. It's where some of
the most surprising innovation is occurring.
I certainly do agree with your concerns about how we're going to pay talent. I don't have any answers or
predictions, but I suspect every institution whose value rides on the scarcity of information or the
difficulty of distributing it will face this issue eventually. And those are some institutions we both care a
lot about. There are whole classes of professionals who may find themselves without work. That's a
frightening prospect. (On the other hand, delivering this value on the Web is a business opportunity, so
it would be premature to declare defeat.)
We will lose some talent. We'll gain some that otherwise would have been left behind by the binary
selection process in the real world. Of those, a few will be world-class. Many will make the world only
somewhat better. And some will be screeching, violin-playing monkeys whose efforts we will flee from.
But that raises one other myth that I think runs under your comments. You say "the intellectual
authority able to help people understand the world is indeed endangered." Then you ask if I'm
convinced that the Web benefits intellectuals. Yes, I am. And that's because, while some talent is indeed
solitary, many types of talent prosper in connection with others. That is especially true for the
development of ideas. Knowledge is generally not a game for one. It is and always has been a
collaborative process. And it is a process, not as settled, sure, and knowable by authorities as it would
be comforting to believe. So forget my homey examples of Joe and Maria. Consider how much more we
know about the world because we have bloggers everywhere. They may not be journalists, but they are
sources, and sometimes they are witnesses in the best sense. We know and understand more because
of these voices than we did when we had to rely on a single professional reporting live at 7.
I was an academic a long time ago, Andrew, but I haven't forgotten how isolated I felt in the
philosophical community before the Web. Ideas were scarce back then because space, time and the
limitations of paper made it hard to hear what others were saying and well nigh impossible to talk with
them about it. Today I am in contact with people who come up with ideas I'd never have encountered,
who are sources of wide expertise, who squirrel away in public on tiny topics, who spew a long tail of
speculations with occasional insights that are worth the wait, who take me apart because my logic is
wrong or my biases are showing or my grammar has gone screwy, who support my good ideas and just
let the bad ones pass. Without any doubt, I am in the richest, most stimulating, most fruitful swirl of
thought, knowledge, ideas and feeling ever in my life...far more productive than when I was consigned
to talking only with professionals and credentialed experts. This is fundamental to my experience of the
Web, just as monkeys and cockroaches are to yours.
Andrew, maybe you just ought to find some better blogs to read. :)
Now a question: For academics, scientists and serious intellectuals, do you think the Web is nothing but
a disaster? In fact, since businesses learned long ago that knowledge is social, do you seriously
maintain that the work of business -- I'm not here thinking of ecommerce -- can only be degraded by
being done on the Web?
(As for "David Weinberger 2.0," I appreciate your confidence, but I'm still in beta.)

Mr. Keen: I can be as cocky a cockroach as anyone, thus my blog has gigantic insect footprints all over
it. :-)
I agree wholeheartedly with your comments about the online academic community. Any medium which
brings experts and professional authorities together is healthy. I am thrilled that you've discovered such
a rich intellectual community online. If this is Web 2.0, then I love Web 2.0. I'm a Cluetrainer when it
comes to serious people conversing fruitfully on the Internet. The problem, however, with Web 2.0 is
that most of the conversation seems to be taking place anonymously, conducted -- in a manner of
speaking -- by people who are more interested in vulgar insult than respectful intellectual intercourse.
The comments sections of most major websites are littered with this trash. As is the blogosphere. So,
yes, the Internet is great for experts to discover one another and conduct responsible conversation. It's
the monkey chorus on the democratized web that bothers me.
The issue of talent is the heart of the matter. How do we traditionally constitute/nurture/sell talent and
how is Web 2.0 altering this? My biggest concern with Web 2.0 is the critique of mainstream media that,
implicitly or otherwise, drives its agenda. It's the idea that mainstream media is a racket run by
gatekeepers protecting the interests of a small, privileged group of people. Thus, by flattening media,
by doing away with the gatekeepers, Web 2.0 is righting cultural injustice and offering people like your
friends Joe and Maria an opportunity to monetize their talent. But the problem is that gatekeepers -- the
agents, editors, recording engineers -- these are the very engineers of talent. Web 2.0's
disintermediated media unstitches the ecosystem that has historically nurtured talent. Web 2.0
misunderstands and romanticizes talent. It's not about the individual -- it's about the media ecosystem.
Writers are only as good as their agents and editors. Movie directors are only as good as their studios
and producers.
These professional intermediaries are the arbiters of good taste and critical judgment. If we flatten
media and allow it to be determined exclusively by the market, then your friends Joe and Maria have even
less chance of being rewarded for their talent. Not only will they be expected to produce high quality
music, but -- in the Web 2.0 long tail economy -- they'll be responsible for the distribution of their
content. No, if Joe and Maria want to be professional musicians paid for their work, they need a label to
make an either/or call about their talent. That's the binary logic that informs any market decision -- from
music to any other consumer product. Either they can produce music which has commercial value or
they can't. If they can't, they should keep their day jobs. If they can produce commercially viable music,
Joe and Maria need the management of professionals trained in the development of musical talent.
I respect your attempt to escape from the either/or realities of market economics. But I'm afraid this is
the binary logic of life. The culture business is ugly. It rewards talent and punishes those who don't have
it. The democratization of talent is a contradiction in terms. Even part-time cockroaches like me know
that.

Mr. Weinberger: Yes, let's talk about talent.


The people who make my life on the Web so positive intellectually include a brilliant but crazy college
drop-out, a practicing medical doctor who is interested in information theory on the side, a struggling
working mom who has a keen eye for bulls---, a theologian at a tiny seminary I'd never heard of, and a
guy I know nothing about but who on a mailing list for five years has explained in detail the implications
of FCC rulings. Most of these people would not, could not, or did not make it through the traditional
credentialing and publishing systems in the areas they're writing about. They are not "experts and
professional authorities," but I'd be a fool to ignore this "talent" just because many of them are
"amateurs." Likewise, our culture overall would be foolish to stick within the safe boundaries of the old
credentialing system...
Especially since the old talent system, the fate of which you bemoan, actually doesn't work the way you
say it does, and does not yield the results you claim for it. The mainstream media's business model
does not aim at nurturing talent. It aims at moving units. It therefore does exactly what you complain
the Web does: It panders to the market. If you want to see the "democratization" of talent you fear, just
look at a Top 40 chart. There are bright spots, but you seem to have confused the mainstream media's
handling of artists with apprenticing in Michelangelo's studio.
In fact, your assessment that "agents, editors, recording engineers" are "the very engineers of talent"
betrays just how deeply you've drunk the mainstream Kool-Aid. Talent isn't engineered. Hits are. Even in
the best of circumstances (and judging from the acknowledgments in both our books, you and I have
been in these best of circumstances), agents, editors and engineers come at the end of the nurturing
process. A musician -- especially of the "refined" sort you prefer -- owes more to her teachers, parents
and performing partners than to her agent and engineer. That's where the nurturing of talent occurred.
Academics are an especially bad example for you, Andrew, since making it through the scholarly
publishing gate brings them so little money and generally very little attention from editors.
The question, therefore, is not whether the traditional media's taste is better or worse than the Web's.
The Web doesn't have taste, good or bad. The Web is not an institution, a business, or even a market,
any more than the real world is. It's us. We have lots of different tastes. On the Web we can better fulfill
those tastes (because of the Long Tail you ridicule in your book), rather than simply relying on others to
decide for us what is worth attending to.
But, that's not the whole story. When I say the Web is us, I don't mean that it's an aggregation of
individuals -- a herd of screeching monkeys or a scurry of voiceless cockroaches running from the light.
We're connected, primarily through talk in which we show one another what we find interesting in the
world. That's essential to the Web. The Web is only a web because we're building links that say "Here's
something worth your time, and here's why." It's a little act of selflessness in which a person who has
our attention directs it elsewhere. (That's why your polemical use of the term "monkey" is not only
intentionally obnoxious but essentially false and misleading.)
There is therefore hope here that in the midst of the ever-present low culture, we will together educate
our tastes, seeing more of the world than the traditional media could ever show us, and learn to
appreciate it. Included in this hope is, of course, the fact that the traditional gatekeepers are
themselves online, telling us what is worth attending to and why. Now their influence depends on how
convincing and articulate they are, not on their control over the on-off switch on the broadcast tower or
printing press. That is, the gatekeeping goes from dictating what we can read to telling us what we
ought to read.
Now let me address the other sort of nurturing to which you refer: the sort that money brings. Money is
important.
Allow me to switch to PowerPoint mode, for brevity:
- You overstate the rosiness of the current situation for artists, scholars and other creators. Very few make a living through the traditional media.
- Lots of creative people are making money on the Web, including traditional, edited, gate-kept media.
- It's way too early to declare that artists will not be financially supported on the Web. We are at the beginning of a painful transition. We're not yet done inventing.
- It may well be that the Web results in fewer mega-stars. But it may also become an important addition to the real "business model" of most artists and creators, providing more listeners who will not only download their creations but perhaps come to their performances. The Web is actually additive for most creators.
We will also have more terrible "artworks." So what? We should ignore them just as we skip over most
channels on TV. Except we're far more sophisticated in how we travel the Web than we are when using
the sequential clicking of a TV remote. On the Web we'll continue to invent ways to find what matters to
us.
Of course we will, because "mattering" is the real driver of the Web.

Mr. Keen: So I did what you suggested. I took a look at the New York Times best-seller list. The top six
non-fiction hardback books for the week of June 10 were:
1. "The Assault on Reason" by Al Gore
2. "The Reagan Diaries" by Ronald Reagan
3. "Einstein" by Walter Isaacson
4. "God is not Great" by Christopher Hitchens
5. "Presidential Courage" by Michael Beschloss
6. "A Long Way Gone" by Ishmael Beah
None of these books seem to be "engineered" hits. Gore as #1 and Reagan as #2 collectively disprove
the right/left wing critique of big media as a right/left wing racket. A strong marketing and sales effort
on behalf of a 700-page, $32 biography of Albert Einstein seems to me like a noble achievement on the
part of big media to bring science to the people. Equally noble is their commitment to Beah's book, a
searing narrative about the contemporary African tragedy and the power of personal redemption. As a
wannabe Hitchens myself, I'm a big fan of an anti-populist polemic which goes against the beliefs of the
majority of God-fearing Americans. Gore/Reagan/Isaacson/Hitchens/Beschloss/Beah are all talented
authors who have written original and important books that require the marketing and sales muscle of
mainstream media to be broadly distributed. And even if these hits are "engineered" by big media -- so
what? Indeed, I applaud the engineering of books about critically important subjects in politics, history
and theology. I want my kids reading the awful truth about life in Africa. I want them to get mugged by
Hitchens on the question of God's (non)existence. I want them to attempt to digest a 675-page
biography about Einstein.
You say that mainstream media's only goal is "moving units" which "pander to the market." But surely
those supposedly nefarious fellows who run our publishing houses could come up with an easier way of
moving units than 675-page biographies of a German physicist or a 308-page civics lecture by Al Gore?
No. The truth is that the editors in charge of America's publishing industry value high quality books. And
the reading public obviously values these texts too. The wisdom of the literate crowd is reflected in the
New York Times list. Book readers are smarter than you think. We the audience don't want to read crap.
Then I went to Technorati to look at the six most popular blogs for the same week. This, to borrow your
language, is what "matters" in the world of Web 2.0:
1. Engadget
2. Boing Boing: A Directory of Wonderful Things
3. TechCrunch
4. Gizmodo
5. The Huffington Post
6. Lifehacker, the Productivity and Software Guide
David, you say we have "lots of different tastes," but it seems like the hits on the blogosphere are much less
intellectually diverse than the hits on the New York Times book list. Engadget and Gizmodo are blogs
about new technology gear -- iPods, BlackBerries, iPhones etc. TechCrunch and Lifehacker are geeky
technological blogs for technology geeks. The Huffington Post is, I admit, a valuable read -- although it
seems to me to be becoming more like a traditionally authoritative newspaper than an unedited blog.
Meanwhile, Boing Boing is a surreal and supremely inane compendium of miscellaneous knowledge --
listing stories about kidney donor hoaxes, a pedagogical tract on "How to Kiss," and a game-theory
piece entitled "an economic analysis of leaving the toilet seat down."
I respect your faith in the miscellany of knowledge, but I worry that it's you, in fact, who have sipped the
Kool-Aid. I want my kids reading Reagan and Gore rather than how-to articles about kissing. I fear that
the overall consequence of the democratized blogosphere is akin to leaving the toilet seat down. But
this isn't a game and it isn't theory. Sites like Boing Boing are flushing away valuable culture. Rather
than a directory of wonderful things, Web 2.0 is a miasma of trivia and irrelevance. It doesn't matter.
Mr. Weinberger concludes: Actually, I'd suggested you take a look at the Top 40 songs. Of course you're
within your rights to cite the New York Times best-sellers list instead, but that's indicative of the
problem with your method. Are you seriously maintaining that pop culture off line is represented by six
good books on the New York Times hardcover non-fiction list? Why do you find it so awkward to
acknowledge the obvious point that the gatekeepers of commercial publishing and production -- the
producers of TV shows, magazines, pop music, movies, books -- are usually driven not by high cultural
standards, but by the need to reach a broad audience? Do I need to remind you that "The Secret" is
likely ultimately to outsell all six of those worthy books combined?
We could argue over the value of the six top blogs versus their analogues in the traditional media. I
could point out that those blogs are not the work of amateurs, but are profitable businesses run by
experts. I could even rise to your BoingBoing bait, as if that site needed my defense against your
selective reading. But, all that would miss the real point: The Web is not mass culture, so we can't just
look at the most popular sites to see what's going on. Most of the action is in the long tail of users, sites
with just a handful of links going to them. So, pointing to the "short head" of highly popular sites not
only tells us little, it views the Web through a distorting lens, as if sites were read-only publications
rather than part of a web of conversations.
Andrew, the mud you throw obscures the issues you raise. Porn sites, silly posts, monkeys, cockroaches,
toilet seats. This rhetoric isn't helpful. In fact, in your attempt to be controversial, you're playing into the
hands of political and economic forces that would like the Internet to be nothing more than an extension
of the mass media. If your book succeeds on the best-seller lists, but contributes to the Web becoming
as safe, narrow, controlled and professional as the mainstream media, I believe you would be almost as
unhappy as I would. It's a shame because we need to be taking seriously the issues you raise. But to
talk about them, we need to get past the notion that the Web is all dreck all the time and that it is
nothing but a great "seducer" of taste.
For example, you're right that we're in the middle of a disruption of the professional media "ecosystem,"
as you aptly call it. Some of our professional media are faltering before we have built their online
replacements. It's frightening, especially if you're delighted with the existing mass media. But, the
transition is hardly over. If these institutions have value, then providing that value on line is an
opportunity that may well be addressed by the market (have faith, Andrew!) or by the new economics of
cooperative social production expounded in Yochai Benkler's seminal "The Wealth of Networks" (which is
available, of course, in its entirety for free online). Further, these newly fashioned mechanisms for
delivering old-fashioned value will have their own advantages, as well as the weaknesses you note.
Wikipedia, if nothing else, is more complete and current than printed encyclopedias -- and we can quote
it at length without getting sued. iTunes enables some worthy musicians to find their own small
audiences. Open access scientific journals have made far more research (including peer reviewed
papers) available to scientists than ever before -- a good example of what I think of as the power of
making information miscellaneous. In fact, amateurs and professionals are getting "miscellanized" so
that their influence is proportional not to their status but to the value they contribute...and our
understanding of the professionals is being enhanced by their revealing more of their amateur, personal
side in their blogs.
Most of all, a serious discussion of amateurism has to be able to admit that it may have some benefits.
For example:
(1) Some amateurs are uncredentialed experts from whom we can learn.
(2) Amateurs often bring points of view to the table that the orthodoxy has missed, sometimes even
challenging the authority of institutions whose belief systems have been corrupted by power.
(3) Professional and expert ideas are often refined by being brought into conversation with amateurs.
(4) There can be value in amateur work despite its lack of professionalism: A local blogger's description
of a news story happening around her may lack grammar but provide facts and feelings that add to -- or
reveal -- the truth.
(5) The rise of amateurism creates a new ecology in which personal relationships can add value to the
experience: That a sister-in-law is singing in the local chorus may make the performance thoroughly
enjoyable, and that I've gotten to know a blogger through her blog makes her posts more meaningful to
me.
(6) Collections of amateurs can do things that professionals cannot. Jay Rosen, for example, has
amateur citizens out gathering distributed data beyond the scope of any professional news
organization.
(7) Amateur work helps us get over the alienation built into the mainstream media. The mainstream is
theirs. The Web is ours.
(8) That amateur work is refreshingly human -- flawed and fallible -- can inspire us, and not just seduce
us into braying like chimps.
Yes, Andrew, we are amateurs on the Web, although there's plenty of room for professionals as well. But
we are not replicating the mainstream media. We're building something new. We're doing it together. Its
fundamental elements are not bricks of content but the mortar of links, and links are connections of
meaning and involvement. We're creating an infrastructure of meaning, miscellaneous but dripping with
potential for finding and understanding what matters to us. We're building this for one another. We're
doing it by and large for free, for the love of it, and for the joy of creating with others. That makes us
amateurs. And that's also what makes the Web our culture's hope.
True Enough
By Steven Johnson
Farhad,
Before I go into debate mode here, I wanted to start by saying how much I've enjoyed reading True
Enough. You have literally dozens of stories in the book that you've told wonderfully, but you've also
managed to connect them to illuminating research in psychology and sociology: the whole history of the
"Swift Boat" campaign, the 2004 Ohio election-fraud meme, 9/11 conspiracy theorists. It's an
entertaining and important mix of media theory, cultural criticism, and science journalism.
Of course, one of the central themes of True Enough is that we live in excessively partisan times, and in
that spirit, I'm now going to shift gears and explain why your argument is hopelessly wrong.
I'm kidding, but I think we do disagree on a couple of key points. I find myself agreeing thoroughly with
your assessment of the forces at work in each of your anecdotes. What I have trouble with is the global
conclusions you draw. You describe your thesis near the beginning: "The limitless choice we now enjoy
over the information we get about our world has loosened our grip on what is -- and isn't -- true." At the
end you phrase it this way: "The particular way in which information now moves through society -- on
currents of loosely linked online groups and niche media outfits, pushed along by experts and journalists
of dubious character, and bolstered by documents that are no longer considered proof of reality --
amplifies deception."
Now, it seems to me that there are two ways to set about determining whether this interpretation is, in
fact, true. The first is the media-theory approach, which is to analyze the "particular way that
information now moves," thanks to the Web and other modern media forms, and to try to gauge
whether there is indeed something structural to these new forms that amplifies deception. The other
approach is to look at the problem from a sociological point of view: Is there a general increase in
falsehood, or blindly partisan interpretations of the world, that we can see around us, compared with
what we saw 20 or 50 years ago?
Let me try my hand quickly at both, and perhaps we can get into more detail in the next round. In terms
of the flow of information, there is no question that the Internet has made it vastly easier to share
complete fabrications -- delusional theories, libelous accusations, Photoshopped fantasies -- with other
human beings. (Just think about the spam!) That we agree on. But I think it is equally true that the rise
of the Internet has made it vastly easier to share useful, factual information with other human beings.
Because the media landscape is so much more interconnected -- thanks largely to the innovation of
hypertext (and to Google) -- it has also never been easier to fact-check a given piece of information. We
had plenty of urban myths during my childhood in the 1970s and early 1980s, but we didn't have
Snopes.com to debunk them.
Saying that the Web amplifies deception is, to me, a bit like saying that New York is more dangerous
than Baltimore because it has more murders. Yes, in absolute numbers, there are more untruths on the
Web than we had in the heyday of print or mass media, but there are also more truths out there. We've
seen that big, decentralized systems like open-source software and Wikipedia aren't perfect, but over
time they do trend toward more accuracy and stability. I think that will increasingly be the case as more
and more of our news migrates to the Web.
That's why I think it's important to note that many of your key examples are dependent on old-style,
top-down media distribution. You talk about the American public's continuing belief in a connection
between 9/11 and Saddam Hussein; the Swift Boat Veteran ads that distorted the truth of Kerry's record;
Lou Dobbs ranting on CNN. These are all distortions that speak to the power of the old mass-media
model or the even older political model of the executive branch. (I think it's telling that you only spent a
page or two on the successful fact-checking of the forged CBS draft-dodging memos.) As you say in the
book, the Swift Boat meme didn't take off until the group started running television ads. Americans
don't connect Saddam to 9/11 because of distributed online niche groups; they make that connection
because the vice president of the United States repeatedly went on television to keep the connection
alive. That's as old-school as it gets.
This leads to the sociological question. One way to think about it is to look at conspiracy theories, which
play a prominent role in True Enough. If your premise were right, the new media landscape would have
made our culture more amenable to these theories than ever. I don't exactly know how to go about
proving this, but I think there's a very strong argument that the country is significantly less
conspiratorially minded than it was in the late 1960s and 1970s. Think of the litany from that period:
JFK, Castro, faked moon landings, "Paul Is Dead," Roswell. (For what it's worth, the conspiracy page at
Wikipedia is dominated by these outdated theories, but perhaps that itself is a conspiracy.) Yes, we have
the 9/11 "truth movement" theories, but we also have a number of dogs that didn't bark. Think about
the anthrax attacks of 2001 -- a major act of terrorism against prominent people that has not been
solved, and yet there are almost no well-known crank theories about that in circulation. If that had
happened in the 1970s, Oliver Stone would be making a movie about it right about now.
And then there is the premise that we live in increasingly partisan political times, where our worldviews
have diverged so much that we can't agree on basic truths. This is, of course, conventional wisdom --
people make offhand references to our partisan political culture all the time -- but I think it is a bizarre
form of political amnesia. Think back again to the 1950s, '60s, and '70s. Yes, we have Fox News, but we
no longer have lynch mobs. We no longer have people getting fire-hosed by the authorities because
they want to ride in the front of the bus or war protesters killed on their campuses. We no longer have
radical political groups with significant followings arguing for violent revolution; we haven't had a
politically motivated assassination attempt in decades. We have broad public consensus on the role of
women and minorities in government and the workforce. We no longer have major political figures
denouncing the Communists lurking among us. Yes, the right hates the Clintons, and the left hates
Bush, but the left hated Nixon just as much, and some of them hated LBJ for good measure. There is far
more consensus in the country's political values than there was 30 years ago. We agree on much more
than we did back then.
I admit that one thing has changed: Our political culture looks more partisan on television than it did
back then, in the sense that Bill O'Reilly is more partisan in style and substance than, say, Cronkite was.
But as you know better than anyone, Farhad, just because it's on television, it doesn't mean it's true.
Steven
*
Steven,
As a longtime fan of your work, I'm tickled by your kind words, and I'm honored to have the chance to
debate True Enough with you here. That said, let's get ready to rumble.
You do a nice job summarizing my ideas, but I want to point out that True Enough isn't about the
Internet alone. I've got to say this in order to squash the charge -- which you don't make, but which I fear
others might -- that I'm some kind of Luddite. I've spent a career writing for the Web, I get most of my
news online, and I consider Boing Boing a national treasure.
So my beef is not with the Internet, exactly, but with the entire modern infosphere: blogs, cable news,
talk radio, YouTube, podcasts, on-demand book publishing, etc. In the era of mass media -- the 60-year
span, give or take, between the advent of television and the advent of the Web -- we got all our news
from a handful of major sources. Now we get our news from all sides, from amateurs and professionals
who span Chris Anderson's famously long tail of niche outlets.
I think you and I agree that this shift will profoundly alter society, and that some changes will be for the
good and some will be for the bad. Where we disagree is the bottom line: When it comes to that grand,
gauzy thing called Truth, I think niche media will do more harm than good, at least for the foreseeable
future.
You're right, the Internet is a boon for fact-checking. But how useful is fact-checking if the facts and the
lies shuttle about in entirely separate cultural universes?
In the book, I spend much time on Leon Festinger's theory of "selective exposure" -- the idea that in
order to avoid cognitive dissonance, we all seek out information that jibes with our beliefs and avoid
information that conflicts with them. While the theory is controversial, there's ample evidence that
selective exposure plays a role in how people parse the news today. Survey data show that folks on the
right and folks on the left now swim in very different news pools. Right-wing blogs link to righty sites,
while left-wing blogs link to lefty sites. For example, see Lada Adamic and Natalie Glance's study (PDF)
or consider this experiment by Shanto Iyengar and Richard Morin: If you slap the Fox News logo on a
generic news story -- even a travel or sports story, something completely nonpolitical -- Republicans'
interest in it shoots up, while Democrats' interest plummets. People now choose their news -- and thus
their facts -- through a partisan lens.
Yes, the Swift Boat campaign exploded when it hit TV, but I wouldn't say it depended on "old-style
top-down media distribution." The TV we're talking about is cable news, especially Fox: the very definition of
a niche partisan outlet. (Fox's biggest show, The O'Reilly Factor, attracts about 4 million viewers a
night; that's big for cable, but it's not the mainstream.)
The Swift Boaters initially tried to go the old-media way. In May 2004, they held a press conference at
the National Press Club to announce that John Kerry had lied about his time in Vietnam. Reporters from
every old-media shop in town showed up, but most dismissed the group. Bereft, the vets went to the
Web and talk radio, where they found an audience that lapped up their claims. It was only by winning
some fame in these media that the vets garnered a few big donors and, eventually, interest from cable
TV. Broadcast news networks, the Associated Press, and national newspapers came to the story much
later on. And their role was salutary -- online, in print, and on TV, the old-media outlets fact-checked the
Swift Boaters very well, debunking most of their claims. But did the facts hurt the story? Not really.
On 9/11 and Saddam: Would Dick Cheney have been able to convince the nation of that connection
without a partisan press apparatus -- Limbaugh, Drudge, O'Reilly, the Freepers -- at his back? We can't
know, of course. I think it's telling, though, that a large percentage of Americans continued believing the
lie long after even Cheney and the rest of the administration disavowed it. To me, this suggests that the
story was propelled by forces far stronger than the vice president. It persisted -- and persists -- thanks to
niche partisan outlets and despite the facts of the matter being available to all online.
Of course, you're right that society has found a consensus on many of the most dogged issues of our
past. But True Enough doesn't argue that we are markedly more partisan today than we once were.
Rather, I'm saying that our partisanship is of a different character. The big historical controversies you
mention involved questions of political values -- for example, what should be the proper role of women
and minorities in society? Disagreement over an issue like global warming, though, doesn't concern
values. It's a difference over facts: If you believe the science on global warming, you think we should do
something about it. But if you're among the 20 percent to 40 percent of Americans who subscribe to
different facts on the question, you don't. And on many big issues -- the war, terrorism, several areas of
science, even the state of the economy -- Americans today not only hold different opinions from one
another; they hold different facts.
As for conspiracy theories, I can assure you that an alleged governmental role in 9/11 isn't the only
thing keeping paranoid Americans up at night. Have you heard Robert F. Kennedy Jr.'s theory on
vaccines and autism (championed now by John McCain)? Or Kennedy's theories on the stolen election of
2004? What about the NAFTA superhighway? Or HIV denialists? Really, I could go on.
Farhad
*
Farhad,
To ensure that you and I don't end up swimming in different pools, let me try to spell out quickly where
we agree. First, the modern infosphere is dramatically more diverse in the number and range of
perspectives now available. (Taking your cue, I'm referring to the whole panoply here: the Web, cable,
talk radio, and so on.) Second, ordinary people have far more control over the perspectives they are
exposed to, thanks both to the diversity of media platforms and to the long tail of viewpoints they
support. Third, that infosphere is now far more densely interconnected -- on the Web, of course, but also
on cable. (Bill O'Reilly is just a couple of clicks on the remote away from Keith Olbermann, after all.)
I agree completely that you can use these three developments to build an ideological cocoon for
yourself if you so choose. But you can also use them to expose yourself to an incredible range of ideas
and perspectives -- to challenge your assumptions, fact-check arguments, understand where your
opponents are coming from, and stitch together your own informed worldview out of those multiple
realities. I realize that description sounds ridiculously high-minded. (Even the most urbane Web
polymath goes for a little partisan red meat every now and then.) But let's think of it, for our purposes,
as the caricature on the other side of the spectrum, the opposite of the dittohead who doesn't believe
anything unless he hears it straight from Rush's mouth.
What we're trying to figure out is which pole has a stronger magnetic force in this new world: the
dittohead or the polymath. In such a connected environment, truth should be able to spread more
quickly through the system, assuming people have an interest in truth. But if people are more driven by
selective exposure -- finding online information that confirms what they already believe -- then the
system will let them keep truth at bay, assuming their beliefs are untrue. (By the way, I loved the
sections of your book on the science of selective exposure.)
You invoke the 20 percent to 40 percent of Americans who don't believe the science of global warming
as evidence that the forces of selective exposure are stronger than those of truth-seeking. But the
percentage of Americans who have both heard of and believe in human-caused climate change has
been growing steadily for the last 15 years. Many more Americans now pursue green lifestyles -- in their
choice of cars, in the products they buy, and in the food they eat. So there's no question the science is
making progress and winning converts at a steady rate. But just like the political struggles that
dominated the '50s and '60s -- which were about facts as much as values, contrary to what you claim --
the conversion process takes time. It's frustrating that the change can't happen overnight, but no more
frustrating than it was listening to bigots invoking the pseudosciences of sexism or racism in the '60s.
I suppose the great, untestable question on global warming is this: If we could rewind the clock and
somehow build an international scientific consensus about global warming in, say, 1950, would the
American public have embraced the reality of the threat and the need for change more quickly? We'll
never know, of course. It would be interesting to compare the spread of information in the post-Silent
Spring era, to see whether the environmental science of that period reached a broad public consensus
faster than global-warming science has in recent years. If you -- or any of Slate's readers -- know of
studies along those lines, I'd love to hear about them.
But we do have one clear social experiment that we can look at on the dittohead-vs.-polymath question.
If you and I had been having this debate back in 1990, right as the new infosphere was coming into
being -- talk radio ascendant, online communities starting to take shape -- presumably your prediction
would have been that the forces of selective exposure in this new world would drive people into those
different pools of information, confirming and amplifying their existing beliefs, strengthening their
alliances to their initial tribe, and growing further away from those with different perspectives. My
prediction, on the other hand, would have been that the connective, diversifying properties of this new
world would express themselves in the opposite direction: people breaking free from the party lines and
creating more eclectic political worldviews, stitched together from the diverse experiences that they can
now encounter on the screen.
What actually happened during that period? Through all the swings back and forth between the two
parties, the single most pronounced trend since the early '90s is the steady rise of Americans who
consider themselves independent voters, unaligned with either party. (They have tripled in size during
that period, by some measures.) Yes, the new information paradigm has been a boon to people who
believe only what O'Reilly (or Michael Moore) has to say. But all those independents make me think that
the common ground -- the space that connects the pools -- has become an even more popular place to
be.
Steven
*
Steven,
For two people in a debate over cultural rifts, you and I sure are agreeing on a great deal. I suppose
that's one positive sign.
That's why I hate to turn this, now, into the most tedious sort of fight -- one about interpreting voter
stats. There's voluminous poli-sci research on the recent rise of independent voters, and the picture isn't
as clear-cut as you say. Yes, the share of Americans who identify as independents has grown over the
past couple of decades. At the same time, though, the meaning of independence has shifted: Most
unaligned voters now exhibit strong, pseudo-permanent preferences -- in surveys as well as in voting
behavior -- for one party or another. The number of what you might call "pure" independents -- voters
who pick candidates without regard to party and ideology -- has been steadily declining.
And don't overlook all the other signs of growing political polarization. Americans who do identify with
parties are now much less willing than in the past to vote across party lines. In the 1970s, liberals
frequently voted (PDF) for Republicans and conservatives for Democrats. We don't see that sort of
behavior anymore. Congress has also grown steadily more partisan; party-line votes on all but the most
inconsequential of issues are now the norm. Just look at what's happened to John McCain in the last 10
years -- he was against Bush's tax cuts before he was for them, which pretty much says it all, no?
Can we blame the new infosphere for this new partisanship? It certainly doesn't deserve all the blame.
Gerrymandering, lobbying, campaign-finance rules, 9/11, and Tom DeLay, among other things, have
also likely contributed to polarization. The rise of voters who call themselves "independent"
notwithstanding, we've seen few signs, since 1990, of people reaching for common ground.
I agree with you on global warming. Though a large number of Americans still dismiss the science, it
does look like facts about climate change are slowly washing over the culture. But let's not forget your
question: Would the public of the 1950s -- the mass-media public -- have accepted the facts sooner than
the public of the 2000s, the niche-media public? The question, as you say, is untestable.
But on Rachel Carson: Silent Spring was first serialized in The New Yorker in the summer of 1962, and it
came out as a book that September. It was a Book-of-the-Month Club title, and quickly hit the New York
Times best-seller list. In 1963, CBS Reports, a 60 Minutes-type show, broadcast an hourlong report on
Carson's thesis that the pesticide DDT was causing ecological damage. This was back when one-third of
the nation watched CBS -- we're talking American Idol-type ratings.
The chemical industry mounted a huge counterattack in the media. Carson was called a "hysterical
woman," assailed as an alarmist, and accused of overlooking all the benefits of DDT. The charges didn't
stick. John F. Kennedy's science advisory panel looked into Silent Spring's thesis and supported its
claims. The industry largely backed down, and within a few years the government began to regulate
DDT. In 1972 -- 10 years after Silent Spring's publication, under a Republican administration -- the
pesticide was banned for use in the United States.
Just 10 years! Can you imagine the fate that would await Silent Spring if it were serialized in The New
Yorker today? You can guess it would get some play: NPR and the big newspapers would go after the
story; sites like TreeHugger and Grist and maybe Slate and Salon would discuss it; perhaps the network
news would interview Carson; and maybe cable news would get to it, too.
But picture the fun Fox News and right-wing blogs would have with it. Carson had researched DDT's
effects on the environment for years, but the science was not airtight; there was, as in any emerging
field of study, legitimate disagreement among experts over the scope of the problem and the remedy.
Today, the right would surely distort that disagreement.
In True Enough, I describe the scourge of dubious "expertise" we now see in the media -- people of
questionable credentials (sometimes with undisclosed financial interests) who are called on by TV
producers to discuss matters about which they've got no special knowledge. I'll hazard that such
experts would flood the zone to fight Carson today, just as they do on global warming. The
anti-environmentalists would produce pseudo-scientific research of their own showing how DDT harms only
terrorists, and in fact helps bald eagles live longer, happier lives. This stuff, then, would get passed
around the right, attaining a measure of respect and becoming a kind of parallel truth. How long till
Glenn Beck begins comparing Carson to Hitler?
All speculation, of course. But that seems to me a pretty good template for how objective facts are
churned out through the news these days. If Silent Spring were published today, would it lead to a ban
of DDT? Maybe. But not fast enough, I worry.
I'd been looking forward to this debate, Steven; it's been fun. I probably haven't changed your mind
about the Internet's role in society -- and you haven't changed mine -- but here's hoping that a civil chat
between rivals serves as a model for others online.
Farhad
In-class resources for Thu Oct 22 (class 10)
Brave New World
Digerati video and reading
(discussion leader: x)
Digerati
Howard Rheingold
@hrheingold
BHSEC Q guest author (Fall 2015)
https://en.wikipedia.org/wiki/Howard_Rheingold
Video
We will watch and discuss this in class.
Howard Rheingold TED Talk (19:31; we will watch 6 minutes of this video)
http://www.ted.com/talks/howard_rheingold_on_collaboration?language=en
Reading
We will read and discuss this in class.
Howard Rheingold: Quotes
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
http://www.brainyquote.com/quotes/authors/h/howard_rheingold.html
Openness and participation are antidotes to surveillance and control.
There is never going to be a substitute for face-to-face communication, but we have seen since
the alphabet, to the telephone and now the Internet, that whenever people find a new way to
communicate, they will flock to it.
You can't have an industrial revolution, you can't have democracies, you can't have populations
who can govern themselves until you have literacy. The printing press simply unlocked literacy.
Of course, with agriculture came the first big civilizations, the first cities built of mud and brick,
the first empires. And it was the administers of these empires who began hiring people to keep
track of the wheat and sheep and wine that was owed and the taxes that was owed on them by
making marks; marks on clay in that time.
Democracy is not just voting for your leaders; it's really premised upon ordinary citizens
understanding the issues.
We are moving rapidly into a world in which the spying machinery is built into every object we
encounter.
Howard Rheingold
You can't assume any place you go is private because the means of surveillance are becoming so
affordable and so invisible.
The more material there is, the more need there is for filters. You don't need a printing press
anymore, but you do need people who know how to cultivate sources, double-check information
and put the brand of legitimacy on it.
Mobile communications and pervasive computing technologies, together with social contracts
that were never possible before, are already beginning to change the way people meet, mate,
work, war, buy, sell, govern and create.
Humans are humans because we are able to communicate with each other and to organize to do
things together that we can't do individually.
It's not a global village, but we're in a highly interconnected globe.
You can't pick up the telephone and say, 'Connect me with someone else who has a kid with
leukemia.'
Markets are as old as the crossroads. But capitalism, as we know it, is only a few hundred years
old, enabled by cooperative arrangements and technologies, such as the joint-stock ownership
company, shared liability insurance, double-entry bookkeeping.
As for Twitter, I've found that you have to learn how to make it add value rather than subtract
hours from one's day. Certainly, it affords narcissism and distraction.
I want to be very careful about judging and how much to generalize about the use of media
being pathological. For some people, it's a temptation and a pathology; for others, it's a lifeline.
Mindfulness means being aware of how you're deploying your attention and making decisions
about it, and not letting the tweet or the buzzing of your BlackBerry call your attention.
Although we leave traces of our personal lives with our credit cards and Web browsers today,
tomorrow's mobile devices will broadcast clouds of personal data to invisible monitors all around
us.
Attention is the fundamental instrument we use for learning, thinking, communicating, deciding,
yet neither parents nor schools spend any time helping young people learn how to manage
information streams and control the ways they deploy their attention.
People's social networks do not consist only of people they see face to face. In fact, social
networks have been extending because of artificial media since the printing press and the
telephone.
Personal computers were created by some teenagers in garages because the wisdom of the computer industry was that people didn't want these little toys on their desk.
Some critics argue that a tsunami of hogwash has already rendered the Web useless. I disagree.
We are indeed inundated by online noise pollution, but the problem is soluble.
The Amish communities of Pennsylvania, despite the retro image of horse-drawn buggies and
straw hats, have long been engaged in a productive debate about the consequences of
technology.
Until fairly recently, Amish teachers would reprimand the student who raised his or her hand as
being too individualistic. Calling attention to oneself, or being 'prideful,' is one of the cardinal
Amish worries. Having your name or photo in the papers, even talking to the press, is almost a
sin.
A forecasting game is a kind of simulation, a kind of scenario, a kind of teleconference, a kind of
artifact from the future - and more - that enlists the participants as 'first-person forecasters.'
The idea that your spouse or your parents don't know where you are at all times may be part of
the past. Is that good or bad? Will that make for better marriages or worse marriages? I don't
know.
Kids automatically teach each other how to use technology, but they're not going to teach each
other about the history of democracy, or the importance of taking their voices into the public
sphere to create social change.
Any disease support community is a place of deep bonds and empathy, and there are thousands
if not tens of thousands of them.
It's kind of astonishing that people trust strangers because of words they write on computer
screens.
Some digital natives are extraordinarily savvy.
What person doesn't search online about their disease after they are diagnosed?
Unlike with the majority of library books, when you enter a term into a search engine there is no
guarantee that what you will find is authoritative, accurate or even vaguely true.
Any virtual community that works, works because people put in some time.
By the time you get a job, you know how to behave in a meeting or how to write a simple memo.
When designers replaced the command line interface with the graphical user interface, billions of
people who are not programmers could make use of computer technology.
A phone tree isn't an ancient form of political organizing, but you have to call every person.
Advertising in the past has been predicated on a mass market and a captive audience.
I've spent my life alone in a room with a typewriter.
Like most modern Americans, I assume individuality is not only a fundamental value, but a goal
in life, an art form.
Soon the digital divide will not be between the haves and the have-nots. It will be between the
know-hows and the non-know-hows.
Technology is my native tongue. I'm online six hours a day.
A lot of people use collaborative technologies badly, then abandon them. They aren't 'plug-and-play.' The invisible part is the social skill necessary to use them.
One thing we didn't know in 1996 is that it's very, very difficult, if not impossible, to sustain a
culture with online advertising.
We like technology because we don't have to talk to anybody.
People look at me, and I dress a little unusually and they think, 'Oh you must be from California.' Of course, people in California think, 'Oh you must be from Mars,' so, you know, your next-door neighbour is not necessarily the person that you are going to make a connection with.
The Orwellian vision was about state-sponsored surveillance. Now it's not just the state, it's your
nosy neighbor, your ex-spouse and people who want to spam you.
There is an elementary level of trust that is necessary for community. You have to be able to
trust that your neighbors aren't going to look into your mailbox.
Entire books are being written about the distractions of social media. I don't believe media
compel distraction, but I think it's clear that they afford it.
It used to be that if your automobile broke, the teenager down the street with the wrench could
fix it. Now you have to have sophisticated equipment that can deal with microchips. We're
entering a world in which the complexity of the devices and the system of interconnecting
devices is beyond our capability to easily understand.
Open source production has shown us that world-class software, like Linux and Mozilla, can be
created with neither the bureaucratic structure of the firm nor the incentives of the marketplace
as we've known them.
Technologies evolve in the strangest ways. Computers were created to calculate ballistics
equations, and now we use them to create amusing illusions. Creating amusing illusions is a big
business if you play it right.
Technology no longer consists just of hardware or software or even services, but of communities.
Increasingly, community is a part of technology, a driver of technology, and an emergent effect
of technology.
We already know that spam is a huge downside of online life. If we're going to be spammed on
our telephones wherever we go, I think we're going to reject these devices.
We think of them as mobile phones, but the personal computer, mobile phone and the Internet
are merging into some new medium like the personal computer in the 1980s or the Internet in
the 1990s.
The two parts of technology that lower the threshold for activism and technology is the Internet
and the mobile phone. Anyone who has a cause can now mobilize very quickly.
Flash mobbing may be a fad that passes away, or it may be an indicator of things to come.
Humans have lived for much, much longer than the approximately 10,000 years of settled
agricultural civilization.
I think e-mail petitions are an illusion. It gives people the illusion that they're participating in
some meaningful political action.
Inexpensive phones and pay-as-you go services are already spreading mobile phone technology
to many parts of that world that never had a wired infrastructure.
The Chinese government tried to keep a lid on the SARS crisis, but there were 160 million text
messages in three days sent by Chinese citizens. These are early indications that it's going to be
difficult for people who used to have control over the news to maintain that level of control.
Craigslist is about authenticity. Craig has paid his dues, and people respect him.
I'm somebody who seems to stumble into things 10 or 20 years before the rest of the world
does.
It's more important to me to get an e-mail that says, 'I saw your page and it changed my life,'
than how many hits the page got.
On the Internet, it is assumed people are in business to sell out, not to build something they can
pass along to their grandkids.
People move from place to place and job to job, but they no longer need to lose touch.
The AP has only so many reporters, and CNN only has so many cameras, but we've got a world
full of people with digital cameras and Internet access.
There's a direct relationship between how difficult it is to send a message and how strongly it is
received.
Whenever a technology enables people to organize at a pace that wasn't before possible, new
kinds of politics emerge.
Journalists don't have audiences - they have publics who can respond instantly and globally,
positively or negatively, with a great deal more power than the traditional letters to the editor
could wield.
People's behavior will change with technology. I know very few young people who can't type out
a text message on their phone with one thumb, for instance.
Schoolchildren are not taught how to distinguish accurate information from inaccurate
information online - surely there are ways to design web-browsers to help with this task and
ways to teach young people how to use the powerful online tools available to them.
Communicating online goes back to the Defense Department's Arpanet which started in 1969.
There was something called Usenet that started in 1980, and this gave people an opportunity to
talk about things that people on these more official networks didn't talk about.
I think there are two aspects to smart environments. One is information embedded in places and
things. The other is location awareness, so that devices we carry around know where we are.
When you combine those two, you get a lot of possibilities.
Young voters are crucial. The trend over recent years has been for them to drift away. So
anything that gets young voters interested in the electoral process not only has an immediate
effect, but has an effect for years and years.
ho'oponopono (Hawaiian):
Solving a problem by talking it out. After an invocation of the gods, the aggrieved parties sit
down and discuss the issue until it is set right (pono means righteousness).
ngaobera:
A slight inflammation of the throat produced by screaming too much.
fisselig (German):
Flustered to the point of incompetence. A temporary state of inexactitude and sloppiness that is
elicited by another person's nagging.
Shirky has described the pre-Web era of publishing as working on a "filter, then publish" paradigm, subjecting text to editors and publishers before making it available; now, Shirky observes, the paradigm has flipped to "publish, then filter." In that sense, Shirky adds, there is no such thing as information overload, only filter failure.
One of the most astonishing statistics McGonigal cites in her book is the estimate that gamers have spent 5.93 million years playing World of Warcraft.
When today's infants grow up, they will be amazed that their parents' generation could ever get lost, not be in touch with everyone they know at all times, and get answers out of the air for any question.
If the rule of thumb for attention literacy is to pay attention to your intention, then the heuristic
for crap detection is to make skepticism your default.
Sunday Story #6
Due Sun Oct 25 (before midnight)
Prompt
vimeo-rheingold quotes
Homework for Tue Oct 27 (class 11)
Guest author: Howard Rheingold
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Video
Watch the assigned video on Zaption and post a one-sentence response.
Rheingold - 21st century media literacies (6:09)
http://zapt.io/tb6rssm3
Online discussion
Respond to this prompt on Canvas:
Question for Howard Rheingold
link
Rheingold
In-class resources for Tue Oct 27 (class 11)
Guest author: Howard Rheingold
Digerati video and reading
(discussion leader: x)
Digerati
Matt Drudge
https://en.wikipedia.org/wiki/Matt_Drudge
www.drudgereport.com
Video
We will watch and discuss this in class.
Alex Jones on Matt Drudge (15:55)
https://www.youtube.com/watch?v=vCPNUKQ05mo
Reading
We will read and discuss this in class.
Drudge Manifesto
(Matt Drudge, 2000)
Links
(on Canvas lesson page)
Matt Drudge
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Matt Drudge
Matt Drudge and the Drudge Report
Andrew Shapiro, The Control Revolution (1999)
To begin with, the accuracy and integrity of information may be uncertain when it comes from
sources that don't have editors and fact-checkers, and major reputations at stake. This is mostly
a function of resources, as it takes time and expertise to do careful reporting and checking of
facts. I remember asking Matt Drudge at the Harvard conference whether, with his increasing
prominence, he had any plans to hire a staff to help him put out the Drudge Report. He scoffed
at me and said no, as if I just didn't get it. Now, I think I do get it. The Drudge factor, that is, the extreme disintermediation of our information environment, means that responsibility for
determining truth rests as much with those who consume information as with those who produce
it. This is an archetypal example of the control revolution, for it represents a clear transfer in
power over one of our most important social functions: who distinguishes fact from fiction and
ultimately determines what is true. Increasingly, we bear that burden. Increasingly, responsibility
for determining truth rests as much with those who consume information as with those who
produce it.
Drudge Manifesto
Homework for Thu Oct 29 (class 12)
Texts Without Context
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Texts Without Context
(Michiko Kakutani, NYT, 2010)
Video
Watch the assigned video on Zaption and post a one-sentence response.
Lawrence Lessig on Colbert (6:38)
http://zapt.io/thy9czjp
Online discussion
Respond to this prompt on Canvas:
Kakutani: Texts Without Context - In what ways has the internet affected how (and what) you read?
https://canvas.instructure.com/courses/936923/discussion_topics/4017334
Texts without context
Michiko Kakutani, The New York Times (2010)
1
In his deliberately provocative and deeply nihilistic new book, "Reality Hunger," the onetime novelist David Shields asserts that fiction "has never seemed less central to the culture's sense of itself." He says he's "bored by out-and-out fabrication, by myself and others; bored by invented plots and invented characters" and much more interested in confession and "reality-based art." His own book can be taken as Exhibit A in what he calls "recombinant" or appropriation art.
2
Mr. Shields's book consists of 618 fragments, including hundreds of quotations taken from other writers like Philip Roth, Joan Didion and Saul Bellow, quotations that Mr. Shields, 53, has taken out of context and in some cases, he says, also revised, at least a little, "for the sake of compression, consistency or whim." He only acknowledges the source of these quotations in an appendix, which he says his publisher's lawyers insisted he add.
3
"Who owns the words?" Mr. Shields asks in a passage that is itself an unacknowledged reworking of remarks by the cyberpunk author William Gibson. "Who owns the music and the rest of our culture? We do, all of us, though not all of us know it yet. Reality cannot be copyrighted."
4
Mr. Shields's pasted-together book and defense of appropriation underscore the contentious issues of copyright, intellectual property and plagiarism that have become prominent in a world in which the Internet makes copying and recycling as simple as pressing a couple of buttons. In fact, the dynamics of the Web, as the artist and computer scientist Jaron Lanier observes in another new book, are encouraging authors, journalists, musicians and artists to treat the fruits of their intellects and imaginations "as fragments to be given without pay to the hive mind."
5
It's not just a question of how these "content producers" are supposed to make a living or finance their endeavors, however, or why they ought to allow other people to pick apart their work and filch choice excerpts. Nor is it simply a question of experts and professionals being challenged by an increasingly democratized marketplace. It's also a question, as Mr. Lanier, 49, astutely points out in his new book, "You Are Not a Gadget," of how online collectivism, social networking and popular software designs are changing the way people think and process information, a question of what becomes of originality and imagination in a world that prizes "metaness" and regards the mash-up as more important than the sources who were mashed.
6
Mr. Lanier's book, which makes an impassioned case for "a digital humanism," is only one of many recent volumes to take a hard but judicious look at some of the consequences of new technology and Web 2.0. Among them are several prescient books by Cass Sunstein, 55, which explore the effects of the Internet on public discourse; Farhad Manjoo's "True Enough," which examines how new technologies are promoting the cultural ascendancy of belief over fact; "The Cult of the Amateur," by Andrew Keen, which argues that Web 2.0 is creating a "digital forest of mediocrity" and substituting ill-informed speculation for genuine expertise; and Nicholas Carr's book "The Shallows" (coming in June), which suggests that increased Internet use is rewiring our brains, impairing our ability to think deeply and creatively even as it improves our ability to multitask.
7
Unlike "Digital Barbarism," Mark Helprin's shrill 2009 attack on copyright abolitionists, these books are not the work of Luddites or technophobes. Mr. Lanier is a Silicon Valley veteran and a pioneer in the development of virtual reality; Mr. Manjoo, 31, is Slate's technology columnist; Mr. Keen is a technology entrepreneur; and Mr. Sunstein is a Harvard Law School professor who now heads the White House Office of Information and Regulatory Affairs. Rather, these authors' books are nuanced ruminations on some of the unreckoned consequences of technological change, books that stand as insightful counterweights to early techno-utopian works like Esther Dyson's "Release 2.0" and Nicholas Negroponte's "Being Digital," which took an almost Pollyannaish view of the Web and its capacity to empower users.
8
THESE NEW BOOKS share a concern with how digital media are reshaping our political and social
landscape, molding art and entertainment, even affecting the methodology of scholarship and
research. They examine the consequences of the fragmentation of data that the Web produces,
as news articles, novels and record albums are broken down into bits and bytes; the growing
emphasis on immediacy and real-time responses; the rising tide of data and information that
permeates our lives; and the emphasis that blogging and partisan political Web sites place on
subjectivity.
9
At the same time it's clear that technology and the mechanisms of the Web have been accelerating certain trends already percolating through our culture, including the blurring of news and entertainment, a growing polarization in national politics, a deconstructionist view of literature (which emphasizes a critic's or reader's interpretation of a text, rather than the text's actual content), the prominence of postmodernism in the form of mash-ups and bricolage, and a growing cultural relativism that has been advanced on the left by multiculturalists and radical feminists, who argue that history is an adjunct of identity politics, and on the right by creationists and climate-change denialists, who suggest that science is an instrument of leftist ideologues.
10
Even some outspoken cheerleaders of Internet technology have begun to grapple with some of its more vexing side effects. Steven Johnson, a founder of the online magazine Feed, for instance, wrote in an article in The Wall Street Journal last year that with the development of software for Amazon.com's Kindle and other e-book readers that enable users to jump back and forth from other applications, he fears one of the great joys of book reading, the total immersion in another world, or in the world of the author's ideas, will be compromised. He continued, "We all may read books the way we increasingly read magazines and newspapers: a little bit here, a little bit there."
11
Mr. Johnson added that the book's migration to the digital realm will turn the solitary act of reading, "a direct exchange between author and reader," into "something far more social" and suggested that as online chatter about books grows, "the unity of the book" will disperse into a multitude of pages and paragraphs vying for Google's attention.
12
WORRYING ABOUT the public's growing attention deficit disorder and susceptibility to information overload, of course, is hardly new. It's been 25 years since Neil Postman warned in "Amusing Ourselves to Death" that trivia and the entertainment values promoted by television were creating distractions that threatened to subvert public discourse, and more than a decade since writers like James Gleick ("Faster") and David Shenk ("Data Smog") described a culture addicted to speed, drowning in data and overstimulated to the point where only sensationalism and willful hyperbole grab people's attention.
13
Now, with the ubiquity of instant messaging and e-mail, the growing popularity of Twitter and YouTube, and even newer services like Google Wave, velocity and efficiency have become even more important. Although new media can help build big TV audiences for events like the Super Bowl, it also tends to make people treat those events as fodder for digital chatter. More people are impatient to cut to the chase, and they're increasingly willing to take the imperfect but immediately available product over a more thoughtfully analyzed, carefully created one. Instead of reading an entire news article, watching an entire television show or listening to an entire speech, growing numbers of people are happy to jump to the summary, the video clip, the sound bite: never mind if context and nuance are lost in the process; never mind if it's our emotions, more than our sense of reason, that are engaged; never mind if statements haven't been properly vetted and sourced.
14
People tweet and text one another during plays and movies, forming judgments before seeing the arc of the entire work. Recent books by respected authors like Malcolm Gladwell ("Outliers") and Jane Jacobs ("Dark Age Ahead") rely far more heavily on cherry-picked anecdotes, instead of broader-based evidence and assiduous analysis, than the books that first established their reputations. And online research enables scholars to power-search for nuggets of information that might support their theses, saving them the time of wading through stacks of material that might prove marginal but that might have also prompted them to reconsider or refine their original thinking.
15
"Reading in the traditional open-ended sense is not what most of us, whatever our age and level of computer literacy, do on the Internet," the scholar Susan Jacoby writes in "The Age of American Unreason." "What we are engaged in, like birds of prey looking for their next meal, is a process of swooping around with an eye out for certain kinds of information."
16
TODAY'S TECHNOLOGY has bestowed miracles of access and convenience upon millions of people, and it's also proven to be a vital new means of communication. Twitter has been used by Iranian dissidents; text messaging and social networking Web sites have been used to help coordinate humanitarian aid in Haiti; YouTube has been used by professors to teach math and chemistry. But technology is also turning us into a global water-cooler culture, with millions of people sending each other (via e-mail, text messages, tweets, YouTube links) gossip, rumors and the sort of amusing-entertaining-weird anecdotes and photographs they might once have shared with pals over a coffee break. And in an effort to collect valuable eyeballs and clicks, media outlets are increasingly pandering to that impulse, often at the expense of hard news. "I have the theory that news is now driven not by editors who know anything," the comedian and commentator Bill Maher recently observed. "I think it's driven by people who are slacking off at work and surfing the Internet." He added, "It's like a country run by 'America's Funniest Home Videos.'"
17
MSNBC's new program "The Dylan Ratigan Show," which usually focuses on business and politics, has a "While you were working ..." segment in which viewers are asked to send in "some of the strangest and outrageous stories you've found on the Internet," and the most e-mailed lists on popular news sites tend to feature articles about pets, food, celebrities and self-improvement. For instance, at one point on March 11, the top story on The Washington Post's Web site was "Maintaining a Sex Life," while the top story on Reddit.com, a user-generated news link site, was "(Funny) Sexy Girl? Do Not Trust Profile Pictures!"
18
Given the constant bombardment of trivia and data that we're subjected to in today's mediascape, it's little wonder that noisy, Manichean arguments tend to get more attention than subtle, policy-heavy ones; that funny, snarky or willfully provocative assertions often gain more traction than earnest, measured ones; and that loud, entertaining or controversial personalities tend to get the most ink and airtime. This is why Sarah Palin's every move and pronouncement is followed by television news, talk-show hosts and pundits of every political persuasion. This is why Glenn Beck and Rush Limbaugh on the right and Michael Moore on the left are repeatedly quoted by followers and opponents. This is why a gathering of 600 people for last month's national Tea Party convention in Nashville received a disproportionate amount of coverage from both the mainstream news media and the blogosphere.
19
Digital insiders like Mr. Lanier and Paulina Borsook, the author of the book "Cyberselfish," have noted the easily distracted, adolescent quality of much of cyberculture. Ms. Borsook describes tech-heads as having "an angry adolescent view of all authority as the Pig Parent," writing that even older digerati want to think of themselves as having "an Inner Bike Messenger."
20
For his part Mr. Lanier says that because the Internet is a kind of pseudoworld without the qualities of a physical world, it encourages the Peter Pan fantasy of being an entitled child forever, without the responsibilities of adulthood. While this has the virtues of playfulness and optimism, he argues, it can also devolve into a "Lord of the Flies"-like nastiness, with lots of "bullying, voracious irritability and selfishness," qualities enhanced, he says, by the anonymity, peer pressure and mob rule that thrive online.
21
Digital culture, he writes in "You Are Not a Gadget," is "comprised of wave after wave of juvenilia," with "rooms of M.I.T. Ph.D. engineers not seeking cancer cures or sources of safe drinking water for the underdeveloped world but schemes to send little digital pictures of teddy bears and dragons between adult members of social networks."
22
AT THE SAME time the Internet's nurturing of niche cultures is contributing to what Cass Sunstein calls "cyberbalkanization." Individuals can design feeds and alerts from their favorite Web sites so that they get only the news they want, and with more and more opinion sites and specialized sites, it becomes easier and easier, as Mr. Sunstein observes in his 2009 book "Going to Extremes," for people to avoid general-interest newspapers and magazines and to make choices that reflect their own predispositions.
23
Serendipitous encounters with persons and ideas different from one's own, he writes, tend to
grow less frequent, while views that would ordinarily dissolve, simply because of an absence of
social support, can be found in large numbers on the Internet, even if they are understood to be
exotic, indefensible or bizarre in most communities. He adds that studies of group polarization
show that when like-minded people deliberate, they tend to reinforce one another and become
more extreme in their views.
24
One result of this nicheification of the world is that consensus and common ground grow ever smaller, civic discourse gets a lot less civil, and pluralism, what Isaiah Berlin called the idea "that there are many different ends that men may seek and still be fully rational, fully men, capable of understanding each other and sympathizing and deriving light from worlds, outlooks, very remote from our own," comes to feel increasingly elusive.
25
As Mr. Manjoo observes in "True Enough: Learning to Live in a Post-Fact Society" (2008), the way in which information now moves through society, on currents of loosely linked online groups and niche media outlets, pushed along by experts and journalists of dubious character and bolstered by documents that are no longer considered proof of reality, has fostered deception and propaganda and also created what he calls a "Rashomon world" where "the very idea of objective reality is under attack." Politicians and voters on the right and left not only hold different opinions from one another, but often can't even agree over a shared set of facts, as clashes over climate change, health care and the Iraq war attest.
26
THE WEB'S amplification of subjectivity applies to culture as well as politics, fueling a
phenomenon that has been gaining hold over America for several decades, with pundits
squeezing out reporters on cable news, with authors writing biographies animated by personal
and ideological agendas, with tell-all memoirs, talk-show confessionals, self-dramatizing blogs
and carefully tended Facebook and MySpace pages becoming almost de rigueur.
27
As for the textual analysis known as deconstruction, which became fashionable in American
academia in the 1980s, it enshrined individual readers' subjective responses to a text over the
text itself, thereby suggesting that the very idea of the author (and any sense of original intent)
was dead. In doing so, deconstruction uncannily presaged arguments advanced by digerati like
Kevin Kelly, who in a 2006 article for The New York Times Magazine looked forward to the day
when books would cease to be individual works but would be scanned and digitized into one
great, big continuous text that could be unraveled into single pages or reduced further, into
snippets of a page, which readers (like David Shields, presumably) could then appropriate
and remix, like bits of music, into new works of their own.
28
As John Updike pointed out, Mr. Kelly's vision would in effect mean the end of authorship,
hobbling writers' ability to earn a living from their published works, while at the same time
removing a sense of both recognition and accountability from their creations. In a Web world
where copies of books (and articles and music and other content) are cheap or free, Mr. Kelly has
suggested, authors and artists could make money by selling "performances," "access to the
creator," "personalization," "add-on information" and other aspects of their work that cannot be
copied. But while such schemes may work for artists who happen to be entrepreneurial,
self-promoting and charismatic, Mr. Lanier says he fears that for the vast majority of journalists,
musicians, artists and filmmakers it simply means career oblivion.
29
Other challenges to the autonomy of the artist come from new interactive media and from
constant polls on television and the Web, which ask audience members for feedback on
television shows, movies and music; and from fan bulletin boards, which often function like giant
focus groups. Should the writers of television shows listen to fan feedback or a network's
audience testing? Does the desire to get an article on a "most e-mailed" list consciously or
unconsciously influence how reporters and editors go about their assignments and approaches to
stories? Are literary-minded novelists increasingly taking into account what their readers want or
expect?
30
As reading shifts from the private page to the communal screen, Mr. Carr writes in The
Shallows, authors will increasingly tailor their work to a milieu that the writer Caleb Crain
describes as "groupiness," where people read mainly for the sake of a feeling of belonging
rather than for personal enlightenment or amusement. As social concerns override literary ones,
writers seem fated to eschew virtuosity and experimentation in favor of a bland but immediately
accessible style.
31
For that matter, the very value of artistic imagination and originality, along with the primacy of
the individual, is increasingly being questioned in our copy-mad, postmodern digital world. In a
recent Newsweek cover story pegged to the Tiger Woods scandal, Neal Gabler, the author of
Life: the Movie: How Entertainment Conquered Reality, absurdly asserts that celebrity is "the
great new art form of the 21st century."
32
Celebrity, Mr. Gabler argues, competes with and often supersedes more traditional
entertainments like movies, books, plays and TV shows, and it performs, he says, in its own
roundabout way, many of the functions those old media performed in their heyday: among them,
distracting us, sensitizing us to the human condition, and creating a fund of common experience
around which we can form a national community.
33
However impossible it is to think of Jon & Kate Plus Eight or Jersey Shore as art, reality shows
have taken over wide swaths of television, and memoir writing has become a rite of passage for
actors, politicians and celebrities of every ilk. At the same time our cultural landscape is
brimming over with parodies, homages, variations, pastiches, collages and other forms of
appropriation art, much of it facilitated by new technology that makes remixing and cutting-and-pasting easy enough for a child.
34
It's no longer just hip-hop sampling that rules in youth culture, but also jukebox musicals like
Jersey Boys and Rock of Ages, and works like The League of Extraordinary Gentlemen,
which features characters drawn from a host of classic adventures. Fan fiction and fan edits are
thriving, as are karaoke contests, video games like Guitar Hero, and YouTube mash-ups of music
and movie, television and visual images. These recyclings and post-modern experiments run the
gamut in quality. Some, like Zachary Mason's Lost Books of the Odyssey, are beautifully
rendered works of art in their own right. Some, like J. J. Abrams's 2009 Star Trek film and Amy
Heckerling's 1995 Clueless (based on Jane Austen's Emma), are inspired reinventions of
classics. Some fan-made videos are extremely clever and inventive, and some, like a 3-D video
version of Picasso's Guernica posted on YouTube, are intriguing works that
raise important and unsettling questions about art and appropriation.
All too often, however, the recycling and cut-and-paste esthetic has resulted in tired imitations;
cheap, lazy re-dos; or works of appropriation designed to generate controversy, like Mr.
Shields's Reality Hunger. Lady Gaga is third-generation Madonna; many jukebox or tribute
musicals like Good Vibrations and The Times They Are A-Changin' do an embarrassing
disservice to the artists who inspired them; and the rote remaking of old television shows into
films (from The Brady Bunch to Charlie's Angels to Get Smart), not to mention the
recycling of video games into movies (like Tomb Raider and Resident Evil), often seems as
pointless as it is now predictable.
35
Writing in a 2005 Wired article that new technologies redefine us, William Gibson hailed
audience participation and argued that "an endless, recombinant, and fundamentally social
process generates countless hours of creative product." Indeed, he said, "audience" is as antique
a term as "record," the one archaically passive, the other archaically physical. The record, not the
remix, is the anomaly today. The remix is the very nature of the digital.
36
To Mr. Lanier, however, the prevalence of mash-ups in today's culture is a sign of nostalgic
malaise. "Online culture," he writes, "is dominated by trivial mash-ups of the culture that
existed before the onset of mash-ups, and by fandom responding to the dwindling outposts of
centralized mass media." It is a culture of reaction without action.
37
He points out that much of the chatter online today is actually driven by fan responses to
expression that was originally created within the sphere of old media, which many digerati
mock as old-fashioned and passé, and which is now being destroyed by the Internet. "Comments
about TV shows, major movies, commercial music releases and video games must be responsible
for almost as much bit traffic as porn," Mr. Lanier writes. "There is certainly nothing wrong with
that, but since the Web is killing the old media, we face a situation in which culture is effectively
eating its own seed stock."

The "Matthew Effect"


Mark Bauerlein, The Dumbest Generation (2008)
Reading has a cumulative, developmental nature, a cognitive benefit that says that the
more you read, the more you can read. Reading researchers call it the "Matthew
Effect," in which those who acquire reading skills in childhood read and learn later in
life at a faster pace than those who do not. They have a larger vocabulary, which means
that they don't stumble with more difficult texts, and they recognize better the pacing
of stories and the form of arguments, an aptitude that doesn't develop as effectively
through other media. It's like exercising. Go to the gym three times a week and the
sessions are invigorating. Go to the gym three times a month and they're painful. As the
occasions of reading diminish, reading becomes a harder task. A sinister corollary to the
cognitive benefit applies: the more you don't read, the more you can't read.
The Changing Nature of How We Read
John Freeman, The Tyranny of E-mail (2009)
Eye-tracking studies have shown that people increasingly tend to leapfrog over long blocks of
text. We need bullet points, bold text, short sentences, explanatory subheads, and speedy text.
People skim and scan rather than rummage down into the belly of the beast. Online readers are
"selfish, lazy, and ruthless," said Jakob Nielsen, a usability engineer. Young people, in particular,
are developing the ability to get the gist of an entire area of study with just a moment of
interaction with it. With a channel surfer's skill, they are able to experience a book, movie, or
even a scientific process almost intuitively. For them, hearing a few lines of T. S. Eliot, seeing one
geometric proof, or looking at a picture of an African mask leaves them with a real, albeit
oversimplified, impression of the world from which it comes. This works especially well for areas
of art and study that are fractal or holographic in nature, where one tiny piece reflects the
essence of the whole. By recognizing that our engagements through and with the digital world
tend to reduce the complexity of our real world, we lessen the risk of equating these
oversimplified impressions with real knowledge and experience. The digital information gatherer
tends to have the opposite approach to knowledge from his text-based ancestors, who saw
research as an excuse to sit and read old books. Instead, net research is more about engaging
with data in order to dismiss it and move on, like a magazine one flips through not to read, but
to make sure there's nothing that has to be read. Reading becomes a process of elimination
rather than deep engagement. Life becomes about knowing how not to know what one doesn't
have to know.
Grazing, "Deep Dives," and A Feedback Loop
John Palfrey, Born Digital: Understanding the First Generation of Digital Natives (2008)
Digital Natives gather information through a multistep process that involves grazing, a "deep
dive," and a feedback loop. They are perfecting the art of grazing through the huge amount of
information that comes their way on a daily basis. Imagine an eighteen-year-old college
freshman interested in the Middle East. (Yes, many Digital Natives are interested in public affairs
in regions other than their own.) Her boyfriend comes from an Arabic-speaking family, and she
is hoping to travel to Egypt next summer. When she opens her browser, Google is her home
page. It features headlines from sources that she has preselected, on topics of her choosing. She
might even have plugged keywords into Google or Technorati (a similar service that primarily
tracks blogs) so that those services could send her alerts when relevant stories appear. She
grazes all day through the news feeds that she sees on her Facebook profile, posted by friends or
others. She might see headlines about the region by grazing through news from major news
outlets online (CNN, MSNBC, the New York Times, Al-Jazeera, and so forth). She'll also probably
have a few favorite specialized websites or discussion boards (for instance, Mideastyouth.com),
which she'll glance at in the course of the day. Chat rooms and e-mail listservs might serve a
similar function.

In-class resources for Thu Oct 29 (class 12)


Texts Without Context
Digerati video and reading
(discussion leader: x)
Digerati
David Shields
@_davidshields
https://en.wikipedia.org/wiki/David_Shields
Video
We will watch and discuss this in class.
David Shields on Colbert Report (5:04)
http://thecolbertreport.cc.com/videos/ohefue/david-shields
Reading
We will read and discuss this in class.
Reality Hunger
(David Shields, 2010)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)

Chapter H from Reality Hunger


David Shields, Reality Hunger: A Manifesto (2010)
238
The contemporary vogue of not tucking in your shirttail (which I follow): a purposeful confusion
of the realms.
239
Living as we perforce do in a manufactured and artificial world, we yearn for the "real,"
semblances of the real. We want to pose something nonfictional against all the fabrication:
autobiographical frissons or framed or filmed or caught moments that, in their seeming
unrehearsedness, possess at least the possibility of breaking through the clutter. More invention,
more fabrication aren't going to do this. I doubt very much that I'm the only person who's finding
it more and more difficult to want to read or write novels.
240
The mimetic function in art hasn't so much declined as mutated. The tools of metaphor have
expanded. As the culture becomes more saturated by different media, artists can use larger and
larger chunks of the culture to communicate. Warhol's Marilyn Monroe silk screens and his
Double Elvis work as metaphors because their images are so common in the culture that they
can be used as shorthand, as other generations would have used, say, the sea. Marilyn and Elvis
are just as much a part of the natural world as the ocean and a Greek god are. Anything that
exists in the culture is fair game to assimilate into a new work, and having preexisting media of
some kind in the new piece is thrilling in a way that "fiction" can't be.
241
The body gets used to a drug and needs a stronger dose in order to experience the thrill. An
illusion of reality (the idea that something really happened) is providing us with that thrill right
now. We're riveted by the (seeming) rawness of something that appears to be direct from the
source, or at least less worked over than a polished mass-media production.
242
Our culture is obsessed with real events because we experience hardly any.
243
We're overwhelmed right now by calamitous information. The real overwhelms the fictional, is
incomparably more compelling than an invented drama.
244
I'm finding it harder to just "write." The seeking and sculpting of found text or sound have
become my primary "artistic" function. Actually generating that text or music seems increasingly
difficult. Lately I'll sit down with a blank pad and feel like I really have to dig down deep to get
my own voice to come out over the "sample choir." It's a very strange feeling, like a conductor
trying to sing over the orchestra, and is, I believe, a fairly new one for artists.
245
The culture disseminates greater and greater access to the technology that creates various
forms of media. "Ordinary" people's cult of personal celebrity is nurtured by these new modes of
communication and presentation and representation. We're all secretly practicing for when we,
too, will join the ranks of the celebrated. There used to be a monopoly on the resources of
exposure. The rising sophistication of the nonexpert in combination with the sensory overload of
the culture makes reality-based and self-reflexive art appealing now. There are little cracks in the
wall, and all of us "regular" people are pushing through like water or, perhaps, weeds.
246
Kathy Griffin, for example, now acts out her own reality show, My Life on the D-List, free from
the constraints of a network time slot or a staged setting like a boardroom or desert island.
247
We are now, officially, lost.

Sunday Story #7
Due Sun Nov 1 (before midnight)
Prompt

vimeo-screenshot cheating

Homework for Thu Nov 5 (class 13)


Is technology shifting our moral compass?
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: Why are more Americans cheating than ever before?
Video
Watch the assigned video on Zaption and post a one-sentence response.
Is technology shifting our moral compass? (2:46)
http://zapt.io/tz3zemd3
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Is there a relationship between making an illegal copy of movies and/or
music and cheating in school?
link

Why Are More Americans Cheating Than Ever Before?


David Callahan, The Cheating Culture: Why More Americans Are Doing Wrong to Get Ahead
(2004)
New Pressures. In today's competitive economy, where success and job security can't be taken
for granted, it's increasingly tempting to leave your ethics at home every morning. Students are
cheating more now that getting a good education is a matter of economic life and death.
Lawyers are overbilling as they've been pushed to bring in more money for the firm and as it's
gotten harder to make partner. Doctors are accepting bribes from drug-makers, as HMOs have
squeezed their incomes. The list goes on. You can even see this problem among cabdrivers in
some cities. As cabdrivers have gone from salaried workers with steady incomes to "free agents"
who rent their taxis and have to hustle to make a living, they've been feeling new pressures to
pick up and drop off as many fares as possible every day. And big surprise: They're speeding and
running more red lights.
Bigger Rewards for Winning. As the prizes for the winners have increased, people have
become more willing to do whatever it takes to be a winner. A CEO will inflate earnings reports to
please Wall Street, and increase the value of his stock options by $50 million. An A student will
cheat to get the A+ that she believes, correctly, could make the difference between Harvard and
a lifetime of big opportunities, or NYU and fewer opportunities. A steady .295 hitter will take
steroids to build the muscles needed to be a slugger, and make $12 million a year instead of a
mere $3 million. A journalist will fabricate sources in his quest to write as many hit pieces as
possible, so that the day arrives sooner rather than later when he can command six-figure book
deals and get lucrative lecture gigs. Twenty-five years ago, many of the huge rewards being
dangled in front of professionals didn't exist in a society with less wealth and a stronger sense of
fairness. But in the '80s and '90s we came to live in a society where lots of people were striking
it rich left and right, and cutting corners made it easier to do so.
Temptation. Temptations to cheat have increased as safeguards against wrongdoing have
grown weaker over two decades of deregulation and attacks on government. Many of the recent
instances of greed and investor betrayal on Wall Street, for example, could have been prevented
by reforms intended to keep accountants honest or to ensure the independence of stock
analysts, or to stop corporate boards from being packed with cronies, or to keep companies from
handing out so many stock options. Reformers tried to enact such measures for years, only to be
blocked by powerful special interests and antigovernment zealots.
Trickle-down Corruption. What happens when you're an ordinary middle-class person
struggling to make ends meet even as you face relentless pressures to emulate the good life you
see every day on TV and in magazines? What happens when you think the system is stacked
against people like you and you stop believing that the rules are fair? You just might make up
your own moral code. Maybe you'll cheat more often on your taxes, anxious to get a leg up
financially and also sure that the tax codes wrongly favor the rich. Maybe you'll misuse your
expense account at work to afford a few little luxuries that are out of reach on your salary, and
you'll justify this on the grounds that the people running your company are taking home huge
paychecks while you're making chump change. Maybe you'll lie to the auto insurance company
about a claim or about having a teenage driver in the house, convinced that the insurer has
jacked up your rates in order to increase their profits; then again, maybe you have nothing
against insurance companies, but the payments on that flashy new SUV you just had to have are
killing you and you're desperate for any kind of relief.
In theory, there is limitless opportunity in America for anyone willing to work hard, and it seemed
during the boom of the '90s that everyone could get rich. The reality is that a lot of families
actually lost ground during the past two decades. Middle-class Americans are both insecure and
cynical these days (a dangerous combination) and many feel besieged by material expectations
that are impossible to attain. It shouldn't come as a surprise that more people are leveling the
playing field however they see fit.

Netgeners on stealing music


Don Tapscott, Grown Up Digital: How the Net Generation is Changing Your World (2010)
We asked Net Geners: "Do you steal music?"
Tony, 25, Systems Analyst: Yes, I download music from the Internet without payment or borrow
tunes from friends; however, I do purchase music using iTunes on occasion. This does constitute
stealing because you are taking something you do not have the right to. I'm completely
detached from the "victim" (in this case, massive corporations in the music industry). Does that
make it right? No. That is why nowadays I make the utmost effort to pay for all digital media that
I feel is of high quality and worthy of payment.
Morris, 23, Marketing Manager: Yes. I'm a thief. And so is everyone else I know. I do believe
however that the definition of music ownership (and the transfer of ownership) is outdated. It
just doesn't fit for our generation. I guess when we come to power we'll redefine what theft is.
Hopefully we'll also come up with a new model so songwriters, artists, and others that actually
create some value get properly compensated.
Graham, 24, Management Consultant: The manner in which the industry generates revenue
from customers needs to better incorporate value derived from concert tours, merchandise, and
placement in mediums such as ads, ringtones, television, movies, or video games. The channels
through which people discover, obtain, appreciate, and consume music have shifted from the
past; yet the music industry has been slow to react and adapt.
Carolina, 27, Consultant: I don't feel that it constitutes stealing to download music without
payment or to borrow tunes from friends. If anything, I believe that this promotes new types of
music that I wouldn't have otherwise been exposed to. If I am introduced to an artist that I really
enjoy I will go out and buy the CD or download the album. I feel extremely lucky to have grown
up in a time when Napster was first available to flood my computer with free music.
Alex, 22, Student: I don't have moral certainty about this issue. I pay for music on iTunes but I
go onto LimeWire to download remixes and other things I can't find on iTunes. In the end,
though, price matters to me. I can't afford to download 100 to 200 songs a month from iTunes'
music store.
Alan, 23, Risk Analyst: I am completely comfortable stealing music. I believe this stems
primarily from my early experiences with Napster, and the complete disconnect between the joy
I felt downloading (and listening) to music, and any sense (or perceived existence) of downside
risk. The rules may be clearer now, but my view of music downloading gestated when there was
no transparent and consistent approach to intellectual-property laws and enforcement.
Morgan, 23, Video Games Developer: No, I do not download directly from the Internet
without paying, mainly because I got sick of dealing with bad downloads and viruses embedded
in the programs. I do however "borrow" music from friends. I do not think it is stealing because if
they got it why can't they share it with others; same deal with letting a friend watch a video you
rented, reading a book you bought, or eating half your lunch.
Joanna, 24, Publicist: No. There has to be some form of payment for the music. Whether it
means that you buy a concert ticket to the artist's show, pick up a T-shirt, etc., it doesn't really
matter so long as something is being given back so that the creative process can continue. Music
is many artists' livelihood and if they aren't monetizing from that livelihood in one way or
another then we are robbing them of their trade and ourselves of some potentially kickass art.
Graham, 24, Management Consultant: Yes. As for why, I'll start with the observation that a
160GB iPod, sadly, does not fill itself. I think that downloading without payment, and "borrowing"
from friends, has become such second nature that in the minds of many it is likely viewed as the
legal equivalent of exceeding the speed limit or crossing against a light on an empty street.

From "Don't Cheat" to "Don't Get Caught"


Kathleen Foss, Student Cheating and Plagiarism in the Internet Era (2000)
We know students are cheating more often today; their cheating techniques are increasingly
sophisticated, and many express guilt or remorse only if they are caught. Why do they cheat?
The bottom line seems to be: it's easy, especially with new technologies; fewer than 10% are
caught; and most of those who are caught get off without serious penalty. The byword appears
to have changed from "Don't cheat" to "Don't get caught."
How Students Cheat
Kathleen Foss, Student Cheating and Plagiarism in the Internet Era (2000)
Students in a class can now create an "underground" Internet Web site to share homework or
other assignments, answer questions for one another, and post copies of old tests with answers.
When properly managed according to guidelines set by the teacher, such a site can be a
valuable study aid. Its clandestine use by students to avoid doing their own homework and
assignments is cheating.
Cheater's Paradise at <http://www.jaberwocky.com/cheat/index2.html> and similar free Web sites
on the Internet tell students how to cheat and offer them opportunities to brag online about their
successes.
Chat rooms exist on the Internet for almost any subject imaginable and many students are
willing to trade papers (Bushweller, Digital, par. 7). Complete papers posted to the chat room by
participants can be copied, or long strings from the discussions can be worked into an "original"
paper.
Other sites exist where "students can submit math homework problems to a resident math whiz
and online message boards where students with very specific needs can help each other". These
sites can fill a positive need when they provide legitimate assistance to students. They unfairly
improve students' grades when work actually has been completed or precorrected by someone
else.
Students use free translation programs on the Internet to do their homework in foreign language
classes, writing an essay or story in English and then having the software translate it into the
language of their class.
The digital revolution is creating a generation of cut-and-paste burglars
Andrew Keen, The Cult of the Amateur (2008)
The Judeo-Christian ethic of respecting others' property that has been central to our society
since the country's founding is being tossed into the delete file of our desktop computers. The
pasting, remixing, mashing, borrowing, copying (the stealing) of intellectual property has
become the single most pervasive activity on the Internet. And it is reshaping and distorting our
values and our very culture. The breadth of today's mass kleptocracy is mind-boggling. I'm not
referring only to the $20 billion pilfered and pickpocketed, day by day, from the music industry or
the $2.3 billion and growing from the movie industry. Sadly, the illegal downloading of music and
movies has become so commonplace, so ordinary, that even the most law-abiding among us,
like Brianna LaHara, now do it without thinking. "How are we supposed to know it's illegal?" asks
a bookkeeper in Redwood City, California, as he copied a playlist of songs to give out to his
friends as a party favor.
The problem is not just pirated movies and music. It's become a broader quandary over who
owns what in an age when anyone, with the click of a mouse, can cut and paste content and
make it their own. Web 2.0 technology is confusing the very concept of ownership, creating a
generation of plagiarists and copyright thieves with little respect for intellectual property. In
addition to stealing music or movies, they are stealing articles, photographs, letters, research,
videos, jingles, characters, and just about anything else that can be digitized and copied
electronically. Our kids are downloading and using this stolen property to cheat their way
through school and university, passing off the words and work of others as their own in papers,
projects, and theses.
In a June 2005 study by the Center for Academic Integrity (CAI) of 50,000 students, many
respondents didn't think that Internet plagiarism was a serious issue. This disturbing finding
gets at a grave problem in terms of Internet and culture: the digital revolution is creating a
generation of cut-and-paste burglars who view all content on the Internet as common property.

Rip, Mix and Burn Culture


John Palfrey, Born Digital: Understanding the First Generation of Digital Natives (2008)
Mash-ups, fan fiction, and sampling: each is a way of creating art based upon the works of
others. The law labels these new art forms "derivative works," meaning that they are new works
derived from the copyrighted creativity of people who came before. What they have in common
is that they build on existing creative works, like songs, videos, and text, to form a new creation.
This rip, mix, and burn culture (with a hat-tip to Apple for the slogan) is at the core of the
unfolding creative revolution in cyberspace.
These new creative forms are inherently in tension with existing copyright laws, and it is hardly
surprising that they have garnered the attention of legal departments in big media companies.
YouTube has been sued by Viacom for alleged copyright infringement by YouTube's users, many
of whom have posted segments of television programs online without permission. Sometimes,
these postings are straight rip-offs of the original files. Other times, they are creative
rearrangements of songs, texts, pictures, and movies. These practices, creative and noncreative
alike, are already generating litigation, and we can expect much more litigation before the legal
issues surrounding these derivative works become clear. For the time being, this means that the
way Digital Natives are interacting with digital media leaves them at risk for ongoing copyright
liability.
There are qualitative issues as well, of course. Many parents and teachers worry that the
Internet, with its rip, mix, and burn culture, only fosters those forms of creation that are based
on the practices of mixing and mashing, while neglecting other, more original modes of
creativity. There can be little doubt that a large portion of the user-created content is based on
previous work; in that sense, some of these derivative works aren't particularly creative. But
that critique ignores the extent to which creators of all sorts inevitably build on the shoulders of
others.
A national problem of internet-fueled plagiarism
Bridget Murray, American Psychological Association (2002)
There is a national problem of Internet-fueled plagiarism, says professor emeritus Stephen Davis,
PhD, who conducts research on academic dishonesty. Plagiarism rates are high, and they appear
to be rising. Studies peg rates of academic dishonesty at 40 to 60 percent at larger universities.
And roughly 70 percent of professors handle at least one plagiarism case a year, according to
the Center for Academic Integrity.
Additional consequence of student cheating
Brian Hansen, Congressional Quarterly Press (2003)
There are non-economic repercussions as well, says Barrie, of TurnItIn.com. "A lot of students
bust their derrières to get into the best university or medical school or law school, but some get
out-competed by students who cheat," he says. "I have zero sympathy for that. Students should
be held accountable for what they do."
Moreover, if plagiarism were allowed to go unchecked, the impact on society could be
catastrophic, according to Lawrence M. Hinman, director of the Values Institute at the University
of San Diego. Hinman says trust is fundamental to the social, political and economic fabric of any successful society. "Without trust in public and business institutions outside the family, an economy stops developing after a certain point," he says.
What rules should we seek to enforce (and why)?
Lawrence Lessig, Remix: Making Art and Commerce Thrive in the Hybrid Economy (2008)
I stand by my position that piracy is wrong. However, I ask whether we want to make this
mistake again. Should the next ten years be another decade-long war against our kids? Should
we spend more of our resources hiring lawyers and technologists to build better weapons to
wage war against those practicing RW culture? Have we learned nothing from the total failure of
policy that has defined copyright policy over the last decade? I believe this for the same reason
the content industry is so keen to enforce copyright. As the RIAA's Mitch Bainwol and Cary Sherman explained: "It's not just the loss of current sales that concerns us, but the habits formed in college that will stay with these students for a lifetime. This is a teachable moment, an opportunity to educate these particular students about the importance of music in their lives and the importance of respecting and valuing music as intellectual property." Exactly right. So what rules should we work so hard to enforce? The argument in favor of reforming our legal attitude toward remixing is a thousand times stronger than in the context of p2p file sharing: this is a matter of literacy. We should encourage the spread of literacy here, at least so long as it doesn't
stifle other forms of creativity. There is no plausible argument that allowing kids to remix music is
going to hurt anyone. Until someone can show that it will, the law should simply get out of the
way. We need to decriminalize creativity before we further criminalize a generation of our kids.
"The Remix is the Very Nature of the Digital"
Andrew Keen, The Cult of the Amateur (2008)
Silicon Valley visionary and cyberpunk author William Gibson wrote in the July 2005 issue of
Wired magazine: "Our culture no longer bothers to use words like appropriation or borrowing.
Today's audience isn't listening at all, it's participating. Indeed, audience is as antique a term as
record, the one archaically passive, the other archaically physical. The record, not the remix, is
the anomaly today. The remix is the very nature of the
digital."

In-class resources for Thu Nov 5 (class 13)


Is technology shifting our moral compass?
Digerati video and reading
(discussion leader: x)
Digerati
Evgeny Morozov
@evgenymorozov
https://en.wikipedia.org/wiki/Evgeny_Morozov
Video
We will watch and discuss this in class.
Evgeny Morozov RSA: The Internet in Society (10:54)
https://www.thersa.org/discover/videos/rsa-animate/2011/03/rsa-animate---the-internet-in-society/
Reading
We will read and discuss this in class.
Excerpts: Evgeny Morozov
The rise of data and the death of politics
(Evgeny Morozov, The Guardian, 2014)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Technology and cheating
Evgeny Morozov

The rise of data and the death of politics


Tweets can topple governments
Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (2011)
Daniel Kimmage, a senior analyst with Radio Free Europe / Radio Liberty, argues that unfettered access to a free Internet is "... a very practical means of countering Al Qaeda... As users increasingly make themselves heard, the ensuing chaos ... may shake the online edifice of Al Qaeda's totalitarian ideology."
Riccardo Luna, the editor of the Italian edition, proposed that the Internet is "a first weapon of mass construction, which we can deploy to destroy hate and conflict and to propagate peace and democracy." Chris Anderson, the editor of the original American edition, opined that while "a Twitter account may be no match for an AK-47 ... in the long term the keyboard is mightier than the sword." David Rowan, the editor of the British edition, argued that the Internet gave all of us "the chance to take back the power from governments and multinationals. It made the world a totally transparent place. And how can a totally transparent world fail to be a more democratic world as well?"
In his 2003 book Breaking the Real Axis of Evil: How to Oust the World's Last Dictators by 2025, his guide to overthrowing forty-five of the world's authoritarian leaders, a book that makes Dick Cheney look like a dove, author Mark Palmer lauded the emancipatory power of the Internet, calling it "a force multiplier for democracy and an expense multiplier for dictators." For him, the Internet is an excellent way to foster civil unrest that can eventually result in a revolution: "Internet skills are readily taught, and should be, by the outside democracies. Few undertakings are more cost effective than training the trainers for Internet organizing. The Web is thus a powerful tool for regime change; pro-democracy activists in authoritarian states should be taught how to blog and tweet in more or less the same fashion that they are taught to practice civil disobedience and street protest."
The Internet as a Tool for Dissent
Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (2011)
It may help to examine the ways in which the Internet has helped dissidents to conceal
antigovernment activities. First, sensitive data can now be encrypted on the cheap, adding an
extra level of protection to conversations between dissidents. Even though decryption is
possible, it can eat a lot of government resources. This is particularly true when it comes to voice
communications. While it was relatively easy to bug a phone line, this is not such an easy option
with voice-over-the-Internet technology like Skype. (The inability to eavesdrop on Skype
conversations bothers Western governments, too: In early 2009 the U.S. National Security
Agency was reported to have offered a sizeable cash bounty to anyone who could help them
break Skype's encrypted communications; to date no winners have been announced.)
Second, there is so much data being produced online that authorities cannot possibly process
and analyze all of it. Comparable estimates for the developing world are lacking, but according
to a 2009 study by researchers at the University of California at San Diego, by 2008 the
information consumption of an average American reached thirty-four gigabytes of data per day,
an increase of 350 percent compared to 1980. The secret police have no choice but to
discriminate; otherwise, they may develop a severe case of attention deficit disorder, getting
bogged down in reading millions of blogs and Twitter updates and failing to see the big picture.
Third, technologies like Tor now make it possible to better protect one's privacy while surfing the Internet. A popular tool that was initially funded by the U.S. Navy but eventually became a successful independent project, Tor allows users to hide what it is they are browsing by first connecting to a random proxy node on the volunteer Tor network and then using that node's Internet connection to connect to the desired website.
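The layered idea behind Tor described above can be sketched as a toy "onion": the sender wraps a message once per relay, and each relay peels exactly one layer with its own key. This is only an illustration of the routing concept; real Tor uses proper circuit-level cryptography, not the XOR keystream used here, and the hop names are hypothetical.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a repeatable pseudo-random byte stream from a key (toy only)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out.extend(hashlib.sha256(key + counter.to_bytes(4, "big")).digest())
        counter += 1
    return bytes(out[:length])

def xor_layer(data: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so the same call adds or peels a layer."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def build_onion(message: bytes, hop_keys: list[bytes]) -> bytes:
    """The sender wraps the message once per hop; the entry node's layer is outermost."""
    onion = message
    for key in reversed(hop_keys):
        onion = xor_layer(onion, key)
    return onion

def route(onion: bytes, hop_keys: list[bytes]) -> bytes:
    """Each relay in turn peels one layer; only the exit node sees the plaintext."""
    for key in hop_keys:
        onion = xor_layer(onion, key)
    return onion

keys = [b"entry-node", b"middle-node", b"exit-node"]  # hypothetical hop keys
sealed = build_onion(b"request for blocked site", keys)
print(route(sealed, keys))  # prints b'request for blocked site'
```

The point of the sketch is the one the excerpt makes: no single relay holds enough information to connect the sender to the destination, which is what makes surveillance expensive.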
Shhh...
Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (2011)

A dangerous self-negating prophecy is at work here: The more Western policymakers talk up the
threat that bloggers pose to authoritarian regimes, the more likely those regimes are to limit the
maneuver space where those bloggers operate. In some countries, such politicization may be for
the better, as blogging would take on a more explicit political role, with bloggers enjoying the
status of journalists or human rights defenders. But in many other countries such politicization
may only stifle the nascent Internet movement, which could have been far more successful if its advocacy were limited to pursuing social rather than political ends.
(Authoritarian governments like China have started to) aggressively engage with new media
themselves, paying bloggers to spread propaganda and troll social networking sites looking for
new information on those in the opposition.
Activism or "Slacktivism"?
Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (2011)
A good way to tell whether a digital campaign is serious or slacktivist is to look at what it
aspires to achieve. Campaigns of the latter kind seem to be premised on the assumption that,
given enough tweets, the world's problems are solvable; in the language of computer geeks, "given enough eyeballs, all bugs are shallow." This is precisely what propels so many of these
campaigns into gathering signatures, adding new members to their Facebook pages, and asking
everyone involved to link to the campaign on blogs and Twitter. This works for some issues,
especially those that are geography bound (e.g., performing group community service at a local
soup kitchen, campaigning against a resolution passed by a local town council, etc.). But with
global issues, whether it's genocide in Darfur or climate change, there are diminishing returns to awareness raising. At some point one must convert awareness into action, and this is where tools like Twitter and Facebook prove much less successful. Not surprisingly, many of these Facebook groups find themselves in a "waiting for Godot" predicament: Now that the group has been formed, what comes next? In most cases, what comes next is spam. Most of these campaigns (remember, many of them, like the anti-FARC campaign in Colombia, pop up spontaneously without any carefully planned course of action) do not have clear goals beyond awareness raising. Thus, what they settle on is fund-raising. But it's quite obvious that not every problem
can be solved with an injection of funds. If the plight of sub-Saharan Africa or even Afghanistan is
anything to judge by, money can only breed more trouble unless endemic political and social
problems are sorted out first.
"The Ringleman Effect"
Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (2011)
In 1882 Ringelmann conducted an experiment in which he asked four individuals to pull on a
rope, first alone and then in groups, and then compared the results. The rope was attached to a
strain gauge so it was possible to measure the pull force. To Ringelmann's surprise, the total pull
force of the group pull was consistently less than the sum of the individual pull forces, even as
he adjusted the number of individuals participating in the experiment. What has become known
as the Ringelmann Effect is thus the opposite of synergy.
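The effect described above can be sketched as a toy model: each member's effort shrinks as the group grows, so the group total falls short of the sum of solo pulls. The 7 percent per-person decline and the 63 kg solo force are illustrative assumptions, not Ringelmann's measured values.

```python
def group_pull(individual_forces, loafing_rate=0.07):
    """Toy Ringelmann model: per-person effort declines linearly with group size.

    Each extra member reduces everyone's efficiency by `loafing_rate`
    (clamped at zero); a lone puller works at full efficiency.
    """
    n = len(individual_forces)
    efficiency = max(0.0, 1.0 - loafing_rate * (n - 1))
    return sum(f * efficiency for f in individual_forces)

solo_total = sum([63, 63, 63, 63])       # four people each pulling alone
together = group_pull([63, 63, 63, 63])  # the same four pulling as one group
# together < solo_total: the group's pull is less than the sum of its parts
```

The model is deliberately crude, but it captures the qualitative finding: the gap between `solo_total` and `together` widens as more people join, which is the "opposite of synergy" the excerpt names.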
In the century that has passed since Ringelmann's original experiment, plenty of other tests
have proven that we usually put much less effort into a task when other people are also doing it
alongside us. In fact, calling it the Ringelmann Effect is only adding theoretical luster to what we
already knew intuitively. We don't have to make fools of ourselves by singing "Happy Birthday" at the top of our lungs; others will do the job just fine. Nor do we always clap our hands as loudly as we could, much to the disappointment of performers. The logic is clear: When everyone in the group performs the same mundane tasks, it's impossible to evaluate individual contributions, and people inevitably begin slacking off (it's for this reason that another name for this phenomenon is "social loafing"). Increasing the number of participants diminishes the relative
social pressure on each and often results in inferior outputs.
Hearing of Ringelmann's experiments today, one can't help noticing the parallels to much of today's Facebook activism. With the power of Facebook and Twitter at their fingertips, many
activists may choose to tackle a problem collectively when tackling it individually would make
more strategic sense. But just as the madness of crowds gives rise to the wisdom of crowds only under certain, carefully delineated social conditions, social loafing leads to synergy only
once certain conditions are met.
China and its "firewall"
Eli Pariser, The Filter Bubble: What the Internet is Hiding From You (2011)
In practice, the firewall is not so hard to circumvent. Corporate virtual private networks, Internet connections encrypted to prevent espionage, operate with impunity. Proxies and firewall
workarounds like Tor connect in-country Chinese dissidents with even the most hard-core
antigovernment Web sites. But to focus exclusively on the firewall's inability to perfectly block information is to miss the point. China's objective isn't so much to blot out unsavory information as to alter the physics around it: to create friction for problematic information and to route public attention to progovernment forums. While it can't block all of the people from all of the news all of the time, it doesn't need to.
"What the government cares about," Atlantic journalist James Fallows writes, "is making the quest for information just enough of a nuisance that people generally won't bother." The strategy, says Xiao Qiang of the University of California at Berkeley, is about "social control, human surveillance, peer pressure, and self-censorship." Because there's no official list of blocked keywords or forbidden topics published by the government, businesses and individuals censor themselves to avoid a visit from the police. Which sites are available changes daily. And while some bloggers suggest that the system's unreliability is a result of faulty technology ("the Internet will override attempts to control it!"), for the government this is a feature, not a bug. James Mulvenon, the head of the Center for Intelligence Research and Analysis, puts it this way: "There's a randomness to their enforcement, and that creates a sense that they're looking at everything."

Sunday Story #8
Due Sun Nov 8 (before midnight)
Prompt

medium-magazine

Homework for Tue Nov 10 (class 14)


Digital Imperialism I
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
The Tech & Design Issue
(NYT Magazine, June 7, 2015)
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Show and Tell: Magazine
link

In-class resources for Tue Nov 10 (class 14)


Digital Imperialism I
Digerati video and reading
(discussion leader: x)
Digerati
Astra Taylor
@astradisastra
https://en.wikipedia.org/wiki/Astra_Taylor
Video
We will watch and discuss this in class.
Astra Taylor: Democracy Now (4:22)
http://www.democracynow.org/2014/4/25/utopian_potential_of_the_internet_astra
Clay Shirky: TED (18:33; watch part of this video)
http://zapt.io/tw7mw33y
Reading
We will read and discuss this in class.
The People's Platform: Taking Back Power and Culture in the Digital Age
(Astra Taylor, 2014)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Digital imperialism
Political slacktivism and the web
Social media's effect on politics
How technology fuels political activism
How governments use technology to repress political activism
Clay Shirky

Homework for Thu Nov 12 (class 15)


Digital Imperialism II
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Articles: Hate speech and the web
Video
Watch the assigned video on Zaption and post a one-sentence response.
NYT: Inside Charlie Hebdo (3:03)
http://zapt.io/trhdn2rp
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Would you have approved of the publication of the Muhammad cartoons?
(resolution in class)
link

Senator Edward Markey (D-Mass) proposed a bill in Congress:


"The Hate Crime Reporting Act of 2014"
Below is a copy of a press release announcing the proposed legislation:
Sen. Markey and Rep. Jeffries Introduce Legislation To Examine and Prevent
the Promotion of Hate Crimes and Hate Speech in Media
Wednesday, April 16, 2014
Boston (April 16, 2014): Senator Edward J. Markey (D-Mass.), a member of the Commerce,
Science and Transportation Committee, introduced legislation to examine the prevalence of hate
crime and hate speech on the Internet, television, and radio to better address such crimes. The
Hate Crime Reporting Act of 2014 (S.2219) would create an updated comprehensive report
examining the role of the Internet and other telecommunications in encouraging hate crimes
based on gender, race, religion, ethnicity, or sexual orientation and create recommendations to
address such crimes.
In 1992, then-Rep. Markey, through the Telecommunications Authorization Act, directed the
National Telecommunications and Information Administration to examine the role of
telecommunications in encouraging hate crimes. Senator Markeys legislation will provide a
comprehensive updated report on the current prevalence of hate crimes and hate speech in
telecommunications, as the last report was conducted and submitted to Congress over two
decades ago, in December 1993. Congressman Hakeem Jeffries (D-N.Y.) introduced a companion
bill in the House of Representatives, H.R. 3878.
"We have recently seen in Kansas the deadly destruction and loss of life that hate speech can fuel in the United States, which is why it is critical to ensure the Internet, television and radio are not encouraging hate crimes or hate speech that is not outside the protection of the First Amendment," said Senator Markey. "Over 20 years have passed since I first directed the NTIA to review the role that telecommunications play in encouraging hate crimes. My legislation would require the agency to update this critical report for the 21st century."
A copy of the legislation can be found HERE.
"The Internet has proven to be a tremendous platform for innovation, creativity and entrepreneurship. However, at times it has also been used as a place where vulnerable persons or groups can be targeted," said Rep. Jeffries. "I commend Senator Markey for his longstanding leadership with respect to combating Hate Crimes in America. He understands that in the digital era it is important to comprehensively evaluate the scope of criminal and hateful activity on the Internet that occurs outside of the zone of First Amendment protection. With the introduction of Senator Markey's bill, we have taken a substantial step toward addressing this issue."
"I thank Senator Markey for his career-long commitment to ensuring that we have the data necessary to confront and combat hate speech in the media that targets our most vulnerable communities," said President & CEO of the National Hispanic Media Coalition Alex Nogales. "NHMC has long-recognized that an update to the National Telecommunications and Information Administration's 1993 report, The Role of Telecommunications in Hate Crimes, is long overdue and desperately needed given the incredible evolution of our communications systems over the past 21 years as well as the ever-increasing numbers of hate crimes targeting Latinos and others. As the author of the original piece of legislation directing the 1993 report, there is nobody better than Senator Markey to join Congressman Hakeem Jeffries and others in calling on the NTIA to study this pressing issue once again."

Below, is an editorial criticizing Markey's proposed legislation.


Where do you come out on this issue?
Should "The Hate Crime Reporting Act of 2014" be enacted into law?
Markeys mission creep
Boston Herald (Opinion Page)
April 24, 2014
U.S. Sen. Ed Markey wants to empower an obscure federal agency to begin scouring the Internet, TV and radio for speech it finds threatening, a plan met with jeers from defenders of the First Amendment. Here we add one more incredulous voice to the chorus.
The Hate Crime Reporting Act of 2014 presents a frankly chilling proposition. The spookily-named National Telecommunications and Information Administration (what, you've never heard of it?) would be required to submit a report to Congress on the use of telecommunications to "advocate and encourage violent acts and the commission of crimes of hate."
Using its own judgment to determine what qualifies as impermissible speech, the new government hall monitors would then recommend steps for Congress to take that are "appropriate and necessary" to address such use of telecommunications. Now, those recs must be "consistent with the First Amendment," the bill says and Markey insists.
But prosecutors already have the authority to prosecute threats. And for the life of us we can't fathom any further government limit on Internet postings or talk radio callers that could be structured to protect an American's right to free expression. Neither can the experts.
"This proposed legislation is worse than merely silly. It is dangerous," civil liberties lawyer Harvey A. Silverglate said. "It is not up to Sen. Markey, nor to the federal government, to define for a free people what speech is, and is not, acceptable."
Either silly, dangerous or both: a Markey specialty. Inexplicably unchallenged in his bid for a full Senate term, the state's junior senator clearly needs something to occupy his time. Perhaps he could crack a briefing book on the crisis in Ukraine rather than looking for his own extra-constitutional methods of punishing speech he finds unacceptable.

As Violence Spreads in Arab World, Google Blocks Access to Inflammatory Video


NYT (September 12, 2012)
Google said it decided to block the video in response to violence that killed four American
diplomatic personnel in Libya. The company said its decision was unusual, made because of the "exceptional circumstances." Its policy is to remove content only if it is hate speech, violating its
terms of service, or if it is responding to valid court orders or government requests. And it said it
had determined that under its own guidelines, the video was not hate speech.
Millions of people across the Muslim world, though, viewed the video as one of the most
inflammatory pieces of content to circulate on the Internet. From Afghanistan to Libya, the
authorities have been scrambling to contain an outpouring of popular outrage over the video and
calling on the United States to take measures against its producers.
Google's action raises fundamental questions about the control that Internet companies have
over online expression. Should the companies themselves decide what standards govern what is
seen on the Internet? How consistently should these policies be applied?
"Google is the world's gatekeeper for information so if Google wants to define the First Amendment to exclude this sort of material then there's not a lot the rest of the world can do about it," said Peter Spiro, a constitutional and international law professor at Temple University in Philadelphia. "It makes this episode an even more significant one if Google broadens the block." He added, though, that provisionally, he thought Google made the right call. "Anything that helps calm the situation, I think, is for the better."
Under YouTube's terms of service, hate speech is speech against individuals, not against groups. Because the video mocks Islam but not Muslim people, it has been allowed to stay on the site in most of the world, the company said Thursday. "This video, which is widely available on the Web, is clearly within our guidelines and so will stay on YouTube," it said. "However, given the very difficult situation in Libya and Egypt we have temporarily restricted access in both countries."
Though the video is still visible in other Arab countries where violence has flared, YouTube is
closely monitoring the situation, according to a person briefed on YouTube's decision-making who was not authorized to speak publicly. The Afghan government has asked YouTube to remove the video, and some Google services were blocked there Thursday. "Google is walking a precarious line," said Kevin Bankston, director of the free expression project at the Center for Democracy and
Technology, a nonprofit in Washington that advocates for digital civil liberties.
On the one hand, he said, blocking the video "sends the message that if you violently object to speech you disagree with, you can get it censored." At the same time, he said, the decision to block in those two countries specifically is "kind of hard to second guess, considering the severity of the violence in those two areas."
All Web companies that allow people to post content online (Facebook and Twitter as well as Google) have grappled with issues involving content. The questions are complicated by the
fact that the Internet has no geographical boundaries, so companies must navigate a morass of
laws and cultural mores. Web companies receive dozens of requests a month to remove content.
Google alone received more than 1,965 requests from government agencies last year to remove
at least 20,311 pieces of content, it said.
Requests for content removal from United States governments and courts doubled over the
course of last year to 279 requests to remove 6,949 items, according to Google. Members of
Congress have publicly requested that YouTube take down jihadist videos they say incite
terrorism, and in some cases YouTube has agreed.

Google has continually fallen back on its guidelines to remove only content that breaks laws or
its terms of service, at the request of users, governments or courts, which is why blocking the
anti-Islam video was exceptional. Some wonder what precedent this might set, especially for government authorities keen to stanch expression they think will inflame their populace.

Free Speech in the Age of YouTube
NYT News Analysis (September 22, 2012)
COMPANIES are usually accountable to no one but their shareholders.
Internet companies are a different breed. Because they traffic in speech, rather than, say, corn syrup or warplanes, they make decisions every day about what kind of expression is allowed
where. And occasionally they come under pressure to explain how they decide, on whose laws
and values they rely, and how they distinguish between toxic speech that must be taken down
and that which can remain.
The storm over an incendiary anti-Islamic video posted on YouTube has stirred fresh debate on
these issues. Google, which owns YouTube, restricted access to the video in Egypt and Libya,
after the killing of a United States ambassador and three other Americans. Then, it pulled the
plug on the video in five other countries, where the content violated local laws.
Some countries blocked YouTube altogether, though that didn't stop the bloodshed: in Pakistan, where elections are to be scheduled soon, riots on Friday left a death toll of 19.
The company pointed to its internal edicts to explain why it rebuffed calls to take down the video
altogether. It did not meet its definition of hate speech, YouTube said, and so it allowed the video
to stay up on the Web. It didn't say very much more.
That explanation revealed not only the challenges that confront companies like Google but also
how opaque they can be in explaining their verdicts on what can be said on their platforms.
Google, Facebook and Twitter receive hundreds of thousands of complaints about content every
week.
"We are just awakening to the need for some scrutiny or oversight or public attention to the decisions of the most powerful private speech controllers," said Tim Wu, a Columbia University law professor who briefly advised the Obama administration on consumer protection regulations online.
Google was right, Mr. Wu believes, to selectively restrict access to the crude anti-Islam video in
light of the extraordinary violence that broke out. But he said the public deserved to know more
about how private firms made those decisions in the first place, every day, all over the world.
After all, he added, "they are setting case law, just as courts do in sovereign countries."
Mr. Wu offered some unsolicited advice: Why not set up an oversight board of regional experts or
serious YouTube users from around the world to make the especially tough decisions? Google has
not responded to his proposal, which he outlined in a blog post for The New Republic.
Certainly, the scale and nature of YouTube makes this a daunting task. Any analysis requires
combing through over a billion videos and overlaying that against the laws and mores of
different countries. It's unclear whether expert panels would allow for unpopular minority opinion anyway. The company said in a statement on Friday that, like newspapers, it, too, made nuanced judgments about content: "It's why user-generated content sites typically have clear community guidelines and remove videos or posts that break them."
Behind closed doors, Internet companies routinely make tough decisions on content.
Apple and Google earlier this year yanked a mobile application produced by Hezbollah. In 2010,
YouTube removed links to speeches by an American-born cleric, Anwar al-Awlaki, in which he
advocated terrorist violence; at the time, the company said it proscribed posts that could incite
violent acts.

Susan Benesch, who studies hate speech that incites violence, said it would be wise to have
many more explanations like this, not least to promote debate. "They certainly don't have to," said Ms. Benesch, director of the Dangerous Speech Project at the World Policy Institute. "But we can encourage them to because of the enormous power they have."
The companies point out that they obey the laws of every country in which they do business.
And their employees and algorithms vet content that may violate their user guidelines, which are
public.
YouTube prohibits hate speech, which it defines as that which attacks or demeans a group
based on its race, religion and so on; Facebook's hate speech ban likewise covers content that
attacks people on the basis of identity. Google and Facebook prohibit hate speech; Twitter does
not explicitly ban it. And anyway, legal scholars say, it is exceedingly difficult to devise a
universal definition of hate speech.
Shibley Telhami, a political scientist at the University of Maryland, said he hoped the violence
over the video would encourage a nuanced conversation about how to safeguard free expression
with other values, like public safety. "It's really about at what point does speech become action; that's a boundary that becomes difficult to draw, and it's a slippery slope," Mr. Telhami said.
He cautioned that some countries, like Russia, which threatened to block YouTube altogether,
would be thrilled to have any excuse to squelch speech. "Does Russia really care about this film?" Mr. Telhami asked.
International law does not protect speech that is designed to cause violence. Several people
have been convicted in international courts for incitement to genocide in Rwanda.
One of the challenges of the digital age, as the YouTube case shows, is that speech articulated in
one part of the world can spark mayhem in another. Can the companies that run those speech
platforms predict what words and images might set off carnage elsewhere? Whoever builds that
algorithm may end up saving lives.

Internet videos will insult your religion. Ignore them.


William Saletan Slate (slate.com, September, 2012)
Dear Muslims, Christians, Hindus, and Jews,
You're living in the age of the Internet. Your religion will be mocked, and the mockery will find its way to you. Get over it.
If you don't, what's happening this week will happen again and again. A couple of idiots with a video camera and an Internet connection will trigger riots across the globe. They'll bait you into
killing one another.
Stop it. Stop following their script.
Today, fury, violence, and bloodshed are consuming the Muslim world. Why? Because a
bank-fraud artist in California offered people $75 a day to come to his house and act out scenes
that ostensibly had nothing to do with Islam. Then he replaced the audio, putting words in the
actors' mouths, and stitched together the scenes to make an absurdly bad movie ridiculing the
Prophet Mohammed. He put out flyers to promote the movie. Nobody--literally nobody--came to
watch it.
He posted a 14-minute video excerpt of the movie on YouTube, but hardly anyone noticed. Then,
a week ago, an anti-Muslim activist in Virginia reposted the video with an Arabic translation and
sent the link to activists and journalists in Egypt. An Egyptian TV show aired part of the video. An
Egyptian politician denounced it. Clerics sounded the alarm. Through Facebook and Twitter,
protesters were mobilized to descend on the U.S. embassy in Cairo. The uprising spread. The
U.S. ambassador to Libya has been killed, and violence has engulfed other countries.
When the protests broke out, the guy who made the movie claimed to be an Israeli Jew funded
by other Jews. That turned out to be a lie. Now he says he's a Coptic Christian, even though Coptic
Christian leaders in Egypt and the United States despise the movie and want nothing to do with
him. Another guy who helped make the movie claims to be a Buddhist. The movie was made in
the United States, yet Sudanese mobs have attacked British and German embassies. Some
Egyptians targeted the Dutch embassy, mistakenly thinking the Netherlands was behind the
movie. Everyone's looking for a group to blame and attack.
The men behind the movie said it would expose Islam as a violent religion. Now they're pointing
to the riots as proof. "Muslims are pre-programmed to rage and kill," says the movie's promoter.
"Islam is a cancer," says the director. According to the distributor, "The violence that it caused in
Egypt is further evidence of how violent the religion and people are and it is evidence that
everything in the film is factual."
Congratulations, rioters. You followed the script perfectly. You did the propagandists' work for
them.
And the provocations won't end here. Laws and censors won't protect you from them. Liberal
democracies allow freedom of expression. Our leaders and people condemn garbage like this
video, but we don't censor it. Even if we did, the diffusion of media technology makes
suppression impossible. The director of this movie was forbidden, under his bank-fraud probation
rules, from using computers or the Internet without approval. That didn't stop him. Nor did it stop
the Arabic-language distributor from reposting the video and disseminating it abroad.
Online propaganda is speech. But it's also part of the global rise of lethal empowerment. It's
easier than ever to kill people. In Muslim countries, mass murderers favor bombs. In the United
States, they prefer guns. In Japan, they've tried sarin nerve gas. The Oklahoma City bomber
used fertilizer. The Sept. 11 hijackers used box cutters and passenger planes. Then came the
letters filled with anthrax. Derision is that much harder to control. The spread of digital
technology and Internet bandwidth makes it possible to reach every corner of the globe almost
instantly with homemade video defaming any faith tradition. It can become an incendiary
weapon. But it has a weakness: It depends on you. You're the detonator. If you don't cooperate,
the bomb doesn't explode.
This isn't just a Muslim problem, though that's been the pattern lately. On YouTube, you can find
videos insulting every religion on the planet: Jews, Christians, Hindus, Catholics, Mormons,
Buddhists, and more. Some clips are ironic. Others are simply disgusting. Many were posted to
bait one group into fighting another. The baiters are indiscriminate.
The promoter of the Mohammed movie founded a group that also protests at Mormon temples.
The hatred and bloodshed will go on until you stop taking the bait. Mockery of your prophet on a
computer with an Internet address somewhere in the world can no longer be your master. Nor
can the puppet clerics who tell you to respond with violence.
Lay down your stones and your anger. Go home and pray. God is too great to be troubled by the
insults of fools. Follow Him.

Google ordered to take down YouTube anti-Muslim video


San Jose Mercury News (2014)
Google must take down a controversial anti-Muslim video on YouTube that sparked protests
across the Muslim world because keeping it on the website violates the rights of an actress who
sued after she was duped into appearing in the film, a divided federal appeals court ruled
Wednesday.
In a 2-1 decision, the 9th U.S. Circuit Court of Appeals rejected Google's arguments that being
forced to take down the video, "Innocence of Muslims," would be a prior restraint that would
violate the company's First Amendment protections.
Actress Cindy Lee Garcia proved the need to remove the video from YouTube, the appeals court
concluded, in part because of ongoing death threats since it sparked violent protests after being
first aired by Egyptian television in 2012.
"This is a troubling case," Chief Judge Alex Kozinski wrote. "Garcia was duped into providing an
artistic performance that was used in a way she never could have foreseen."
Garcia sued after Google repeatedly rebuffed her pleas to take it down from YouTube. The actress
had been cast in a minor role in a film called "Desert Warrior," and paid $500 by director Mark
Basseley Youssef, but the movie never materialized, according to court papers.
The actress discovered her scene had instead been used in the anti-Muslim film, and her voice
dubbed over with an insult to the prophet Mohammed. The video generated worldwide attention,
including an ongoing political debate over whether it played a part in inciting the fatal attacks on
the U.S. embassy in Benghazi, Libya.
In her suit, Garcia maintained that YouTube's unrivaled popularity gave the film a broad audience
and that she had a right to get it removed because she had been misled by the director and
retained copyright protections to her artistic work.
Google argued that taking the video down from YouTube would be futile because it is now in
widespread circulation and that Garcia contributed to her own notoriety by filing the lawsuit. But
the 9th Circuit disagreed and called the latter argument "preposterous."
In a secret order filed last week, the 9th Circuit first notified Google that it must take down the
video from YouTube "or any other platforms under (its) control." That order was unsealed with
Wednesday's ruling.
Judge N. Randy Smith dissented, finding that Garcia did not have a clear protection against the
use of her acting work and that an injunction against Google goes too far. A Los Angeles federal
judge had previously sided with Google, refusing Garcia's bid for an injunction.
A Google spokesman said the company "strongly disagrees with this ruling and will fight it."
Google can ask the 9th Circuit to rehear the case with an 11-judge panel.
The ruling has sparked a debate on legal blogs and social media, with some free speech scholars
expressing concern that the 9th Circuit extended copyright claims too far at the expense of the
First Amendment.
"The idea that copyright is a tool that's going to be used to censor speech we don't like ... that's
very dangerous," said Julie Ahrens, director of copyright and fair use at Stanford University's
Center for Internet and Society. "It is a pretty stunning decision."
But M. Cris Armenta, Garcia's lawyer, called the ruling a "David versus Goliath victory."
In a statement, Garcia said she was grateful for the ruling. "I am a strong believer and supporter
of the First Amendment and have the right not to be associated with this hateful speech against
my will," she said.

In-class resources for Thu Nov 12 (class 15)


Digital Imperialism II
Digerati video and reading
(discussion leader: x)
Digerati
Eli Pariser
@elipariser
https://en.wikipedia.org/wiki/Eli_Pariser
Video
We will watch and discuss this in class.
Eli Pariser on Colbert Report (6:05)
http://thecolbertreport.cc.com/videos/d573ty/eli-pariser
Reading
We will read and discuss this in class.
The Filter Bubble-Facebook study-Medium story
(Eli Pariser, 2011)

Links
The Filter Bubble
The Filter Bubble: How the personalized web is changing what we read and how we think (Eli Pariser, 2011)
The trouble with the echo chamber online (NYT, 2011)
Your own facts (Evgeny Morozov review of "The Filter Bubble") (NYT, 2011)
Maybe the web is not as polarized as we thought (Slate, 2012)
Five ways out of filter bubbles (Nieman Journalism Lab, 2012)
Facebook study disputes theory of political polarization among users (NYT, 2015)
Facebook published a big new study on the Filter Bubble: Here's what it says (Eli Pariser, Medium, 2015)
Fun facts from the new Facebook Filter Bubble study (Eli Pariser, Medium, 2015)

Homework video and reading


(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Should technology companies censor hate speech?
Should government legislate against hate speech?
The Filter Bubble

The End of the Echo Chamber?


A study of 250 million Facebook users reveals the Web isn't as polarized as we
thought.
Farhad Manjoo, slate.com, 2012
Today, Facebook is publishing a study that disproves some hoary conventional wisdom about the
Web. According to this new research, the online echo chamber doesn't exist.
This is of particular interest to me. In 2008, I wrote True Enough, a book that argued that digital
technology is splitting society into discrete, ideologically like-minded tribes that read, watch, or
listen only to news that confirms their own beliefs. I'm not the only one who's worried about this.
Eli Pariser, the former executive director of MoveOn.org, argued in his recent book The Filter
Bubble that Web personalization algorithms like Facebook's News Feed force us to consume a
dangerously narrow range of news. The echo chamber was also central to Cass Sunstein's thesis,
in his book Republic.com, that the Web may be incompatible with democracy itself. If we're all
just echoing our friends' ideas about the world, is society doomed to become ever more
polarized and solipsistic?
It turns out we're not doomed. The new Facebook study is one of the largest and most rigorous
investigations into how people receive and react to news. It was led by Eytan Bakshy, who began
the work in 2010 when he was finishing his Ph.D. in information studies at the University of
Michigan. He is now a researcher on Facebook's data team, which conducts academic-type
studies into how users behave on the teeming network.
Bakshy's study involves a simple experiment. Normally, when one of your friends shares a link
on Facebook, the site uses an algorithm known as EdgeRank to determine whether or not the link
is displayed in your feed. In Bakshy's experiment, conducted over seven weeks in the late
summer of 2010, a small fraction of such shared links were randomly censored--that is, if a
friend shared a link that EdgeRank determined you should see, it was sometimes not displayed
in your feed. Randomly blocking links allowed Bakshy to create two different populations on
Facebook. In one group, someone would see a link posted by a friend and decide to either share
or ignore it. People in the second group would not receive the link--but if they'd seen it
somewhere else beyond Facebook, these people might decide to share that same link of their
own accord.
By comparing the two groups, Bakshy could answer some important questions about how we
navigate news online. Are people more likely to share information because their friends pass it
along? And if we are more likely to share stories we see others post, what kinds of friends get us
to reshare more often--close friends, or people we don't interact with very often? Finally, the
experiment allowed Bakshy to see how novel information--that is, information that you
wouldn't have shared if you hadn't seen it on Facebook--travels through the network. This is
important to our understanding of echo chambers. If an algorithm like EdgeRank favors
information that you'd have seen anyway, it would make Facebook an echo chamber of your own
beliefs. But if EdgeRank pushes novel information through the network, Facebook becomes a
beneficial source of news rather than just a reflection of your own small world.
That's exactly what Bakshy found. His paper is heavy on math and network theory, but here's a
short summary of his results. First, he found that the closer you are with a friend on Facebook--
the more times you comment on one another's posts, the more times you appear in photos
together, etc.--the greater your likelihood of sharing that person's links. At first blush, that
sounds like a confirmation of the echo chamber: We're more likely to echo our closest friends.
But here's Bakshy's most crucial finding: Although we're more likely to share information from
our close friends, we still share stuff from our weak ties--and the links from those weak ties are
the most novel links on the network. Those links from our weak ties, that is, are most likely to
point to information that you would not have shared if you hadn't seen it on Facebook. The links
from your close ties, meanwhile, more likely contain information you would have seen elsewhere
if a friend hadn't posted it. "These weak ties are indispensable to your network," Bakshy says.
"They have access to different websites that you're not necessarily visiting."

The fact that weak ties introduce us to novel information wouldn't matter if we only had a few
weak ties on Facebook. But it turns out that most of our relationships on Facebook are pretty
weak, according to Bakshy's study. Even if you consider the most lax definition of a strong tie--
someone from whom you've received a single message or comment--most people still have a lot
more weak ties than strong ones. And this means that, when considered in aggregate, our weak
ties--with their access to novel information--are the most influential people in our networks.
Even though we're more likely to share any one thing posted by a close friend, we have so many
more mere acquaintances posting stuff that our close friends are all but drowned out.
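The arithmetic behind this "drowned out" effect can be sketched with a few made-up numbers. Everything below (tie counts, per-link share probabilities) is an illustrative assumption, not a figure from Bakshy's paper; the point is only that many weak ties can outweigh a few strong ties in aggregate even when each strong tie is individually more influential.

```python
# Illustrative sketch: aggregate influence of weak ties vs. strong ties.
# All numbers are invented for illustration.

strong_ties = 10        # close friends
weak_ties = 500         # mere acquaintances
p_share_strong = 0.10   # chance you reshare any one link from a close friend
p_share_weak = 0.01     # chance you reshare any one link from a weak tie
links_per_tie = 1       # assume each tie posts one link

# Expected number of reshares coming from each kind of tie
expected_from_strong = strong_ties * links_per_tie * p_share_strong
expected_from_weak = weak_ties * links_per_tie * p_share_weak

print(expected_from_strong)  # 1.0 expected reshare from close friends
print(expected_from_weak)    # 5.0 expected reshares from weak ties
```

Even with a tenfold lower per-link share rate, the weak ties dominate in expectation simply because there are so many more of them.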
In this way, Bakshy's findings complicate the echo chamber theory. If most of the people we
encounter online are weak ties rather than close friends, and if they're all feeding us links that
we wouldn't have seen elsewhere, this suggests that Facebook (and the Web generally) isn't
simply confirming our view of the world. Social networks--even if they're dominated by
personalization algorithms like EdgeRank--could be breaking you out of your filter bubble rather
than reinforcing it.
Bakshy's work shares some features with previous communications studies on networks, and it
confirms some long-held ideas in sociology. (For instance, the idea that weak ties can be
important was first floated in a seminal 1973 study by Mark Granovetter.) It also confirms a few
other recent studies questioning the echo chamber, including the economists Matthew Gentzkow
and Jesse Shapiro's look at online news segregation.
But there are two reasons why Bakshy's research should be considered a landmark. First, the
study is experimental and not merely observational. Bakshy wasn't just watching how people
react to news shared by their friends on Facebook. Instead, he was able to actively game the
News Feed to create two different worlds in which some people get a certain piece of news and
other, statistically identical, people do not get that news. In this way, his study is like a clinical
trial: There's a treatment group that's subjected to a certain stimulus and a control group that is
not, and Bakshy calculated the differences between the two. This allows him to draw causal
relationships between seeing a link and acting on it: If you see a link and reshare it while some
other user does not see the link and does not share it, this means that the Facebook feed was
responsible for the sharing.
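The clinical-trial logic described above can be sketched as a toy simulation. The probabilities and sample size below are invented for illustration (the real study used Facebook's live feed, not a simulation); the sketch only shows the design: randomly withhold some links, then compare sharing rates between the shown and withheld groups to estimate the feed's causal effect.

```python
import random

random.seed(0)

# Toy version of the randomized design: for each (user, link) pair,
# randomly withhold the link from the feed ("control") or show it
# ("treatment"), then compare sharing rates between the two groups.
P_SHARE_IF_SHOWN = 0.05      # chance of resharing a link seen in the feed
P_SHARE_IF_WITHHELD = 0.005  # chance of sharing it anyway (seen elsewhere)

def run_trial(n_pairs=100_000, withhold_fraction=0.5):
    shown_shares = shown = withheld_shares = withheld = 0
    for _ in range(n_pairs):
        if random.random() < withhold_fraction:
            withheld += 1
            withheld_shares += random.random() < P_SHARE_IF_WITHHELD
        else:
            shown += 1
            shown_shares += random.random() < P_SHARE_IF_SHOWN
    return shown_shares / shown, withheld_shares / withheld

rate_shown, rate_withheld = run_trial()
# The gap between the two rates estimates the causal effect of the feed.
print(rate_shown, rate_withheld)
```

Because assignment to the two groups is random, any systematic difference between the two rates can be attributed to the feed itself rather than to differences between the users.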
The other crucial thing about this study is that it is almost unthinkably enormous. At the time of
the experiment, there were 500 million active users on Facebook. Bakshy's experiment included
253 million of them and more than 75 million shared URLs, meaning that in total, the study
observed nearly 1.2 billion instances in which someone was or was not presented with a certain
link. This scale is unheard of in academic sociological studies, which usually involve hundreds or,
at most, thousands of people communicating in ways that are far less trackable.
At the same time, there's an obvious problem with Bakshy's study: It could only occur with the
express consent of Facebook, and in the end it produced a result that is clearly very positive for
the social network. The fact that Facebook's P.R. team contacted me about the study and allowed
me to interview Bakshy suggests the company is very pleased with the result. If Bakshy's
experiment had come to the opposite conclusion--that, say, the News Feed does seem to echo
our own ideas--I suspect they wouldn't be publicizing it at all. (Bakshy told me that he has a
good amount of freedom at the company to research whatever he wants to look into about the
social network, and that no one tells him what to investigate and what to leave alone. The study
is being submitted to peer-reviewed academic journals.)
Also, so as not to completely tank the ongoing sales of my brilliant book, I'd argue that Bakshy's
study doesn't indemnify the modern media against other charges that it's distorting our politics.
For one thing, while it shows that our weak ties give us access to stories that we wouldn't
otherwise have seen, it doesn't address whether those stories differ ideologically from our own
general worldview. If you're a liberal but you don't have time to follow political news very closely,
then your weak ties may just be showing you lefty blog links that you agree with--even though,
under Bakshy's study, those links would have qualified as novel information. (Bakshy's study
covered all links, not just links to news stories; he is currently working on a follow-up that is more
narrowly focused on political content.)

What's more, even if social networks aren't pushing us toward news that confirms our beliefs,
there's still the question of how we interpret that news. Even if we're all being exposed to a
diverse range of stories, we can still decide whose spin we want--and then we go to the Drudge
Report or the Huffington Post to get our own views confirmed.
Still, I have to say I'm gratified by Bakshy's study. The echo chamber is one of many ideas about
the Web that we've come to accept in the absence of any firm evidence. The troves of data that
companies like Facebook are now collecting will help add some empirical backing to our
understanding of how we behave online. If some long-held beliefs get overturned in the process,
then all the better.

What John Dewey Knew


Eli Pariser, The Filter Bubble: What the Internet is Hiding From You (2011)
Everything which bars freedom and fullness of communication sets up barriers that divide
human beings into sets and cliques, into antagonistic sects and factions, and thereby
undermines the democratic way of life.
John Dewey
Jane Jacobs and filtering
Cass Sunstein, Republic.com 2.0 (2008)
Let me now disclose a central inspiration for this book, one that might seem far afield: The Death
and Life of Great American Cities, by Jane Jacobs. Among many other things, Jacobs offers an
elaborate tribute to the sheer diversity of cities--to public spaces in which visitors encounter a
range of people and practices that they could have barely imagined and that they could not
possibly have chosen in advance. As Jacobs describes great cities, they teem and pulsate with
life. "It is possible to be on excellent sidewalk terms with people who are very different from
oneself and even, as time passes, on familiar public terms with them. Such relationships can,
and do, endure for many years, for decades. The tolerance, the room for great differences
among neighbors--differences that often go far deeper than differences in color--which are
possible and normal in intensely urban life, are possible and normal only when streets of great
cities have built-in equipment allowing strangers to dwell in peace together. Lowly,
unpurposeful and random as they may appear, sidewalk contacts are the small change from
which a city's wealth of public life may grow."
Jacobs's book is about architecture, not communications. But with extraordinary vividness,
Jacobs helps show, through an examination of city architecture, why we should be concerned
about a situation in which people are able to create communications universes of their own
liking. Jacobs's sidewalk contacts need not occur only on sidewalks. The idea of architecture
should be taken broadly, not narrowly. And acknowledging the benefits that Jacobs finds on
sidewalks, we might seek to find those benefits in many other places. At its best, I believe, a
system of communications can be, for many of us, a close cousin or counterpart to a great urban
center. For a healthy democracy, shared public spaces, virtual or not, are a lot better than echo
chambers.

Enclave extremism
Cass Sunstein, Republic.com 2.0 (2008)
Let us explore an experiment conducted in Colorado in 2005, designed to cast light on the
consequences of self-sorting. About 60 Americans were brought together and assembled into a
number of groups, each consisting of five or six people. Members of each group were asked to
deliberate on three of the most controversial issues of the day: Should states allow same-sex
couples to enter into civil unions? Should employers engage in affirmative action by giving a
preference to members of traditionally disadvantaged groups? Should the United States sign an
international treaty to combat global warming?
As the experiment was designed, the groups consisted of "liberal" and "conservative" enclaves--
the former from Boulder, the latter from Colorado Springs. It is widely known that Boulder
tends to be liberal, and Colorado Springs tends to be conservative. Participants were screened to
ensure that they generally conformed to those stereotypes. People were asked to state their
opinions anonymously both before and after 15 minutes of group discussion. What was the effect
of that discussion?
In almost every case, people held more-extreme positions after they spoke with like-minded
others. Discussion made civil unions more popular among liberals and less popular among
conservatives. Liberals favored an international treaty to control global warming before
discussion; they favored it far more strongly after discussion. Conservatives were neutral on that
treaty before discussion, but they strongly opposed it after discussion. Liberals, mildly favorable
toward affirmative action before discussion, became strongly favorable toward affirmative action

after discussion. Firmly negative about affirmative action before discussion, conservatives
became fiercely negative about affirmative action after discussion.
The creation of enclaves of like-minded people had a second effect: It made both liberal groups
and conservative groups significantly more homogeneous and thus squelched diversity. Before
people started to talk, many groups displayed a fair amount of internal disagreement on the
three issues. The disagreements were greatly reduced as a result of a mere 15-minute
discussion. In their anonymous statements, group members showed far more consensus after
discussion than before. The discussion greatly widened the rift between liberals and
conservatives on all three issues. The Internet makes it exceedingly easy for people to replicate
the Colorado experiment online, whether or not that is what they are trying to do. Those who
think that affirmative action is a good idea can, and often do, read reams of material that
support their view; they can, and often do, exclude any and all material that argues the other
way. Those who dislike carbon taxes can find arguments to that effect. Many liberals jump from
one liberal blog to another, and many conservatives restrict their reading to points of view that
they find congenial. In short, those who want to find support for what they already think, and to
insulate themselves from disturbing topics and contrary points of view, can do that far more
easily than they can if they skim through a decent newspaper.
A key consequence of this kind of self-sorting is what we might call enclave extremism. When
people end up in enclaves of like-minded people, they usually move toward a more extreme
point in the direction to which the group's members were originally inclined. Enclave extremism
is a special case of the broader phenomenon of group polarization, which extends well beyond
politics and occurs as groups adopt a more extreme version of whatever view is antecedently
favored by their members.
Why do enclaves produce polarization?
Why do enclaves, on the Internet and elsewhere, produce political polarization? The first
explanation emphasizes the role of information. Suppose that people who tend to oppose nuclear
power are exposed to the views of those who agree with them. It stands to reason that such
people will find a disproportionately large number of arguments against nuclear power and a
disproportionately small number of arguments in favor of nuclear power. If people are paying
attention to one another, the exchange of information should move people further in opposition
to nuclear power. This very process was specifically observed in the Colorado experiment, and in
our increasingly enclaved world, it is happening every minute of every day.
The second explanation, involving social comparison, begins with the reasonable
suggestion that people want to be perceived favorably by other group members. Once they hear
what others believe, they often adjust their positions in the direction of the dominant position.
Suppose, for example, that people in an Internet discussion group tend to be sharply opposed to
the idea of civil unions for same-sex couples, and that they also want to seem to be sharply
opposed to such unions. If they are speaking with people who are also sharply opposed to these
things, they are likely to shift in the direction of even sharper opposition as a result of learning
what others think.
The final explanation is the most subtle, and probably the most important. The starting point
here is that on many issues, most of us are really not sure what we think. Our lack of certainty
inclines us toward the middle. Outside of enclaves, moderation is the usual path. Now imagine
that people find themselves in enclaves in which they exclusively hear from others who think as
they do. As a result, their confidence typically grows, and they become more extreme in their
beliefs. Corroboration, in short, reduces tentativeness, and an increase in confidence produces
extremism. Enclave extremism is particularly likely to occur on the Internet because people can
so easily find niches of like-minded types and discover that their own tentative view is shared
by others.
Cass Sunstein, Chronicle of Higher Education (2007)

Eli Pariser Medium response--Sunday Story #9


Due Sun Nov 15 (before midnight)
Prompt

GAFA questions - provide structure and/or groupwork

Homework for Tue Nov 17 (class 16)


Who will dominate the digital economy?
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: GAFA business: past, present and future
Video
Watch the assigned videos on Zaption and post a one-sentence response to each.
Warren Buffett on technology replacing jobs (3:26)
http://zapt.io/tf28bc8w
Peter Thiel on GAFA (3:16)
http://zapt.io/t8t4jb26
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Who will dominate in the digital economy? (18 GAFA questions)
Break into Canvas groups

Questions:

What is Amazon Web Services and why is this business important to Amazon?

Will Amazon ever become profitable?

Which company (Google/Apple/Amazon/Facebook) is best positioned to use data acquired from


its users to personalize advertising? Why?

Which company (Google/Apple/Amazon/Facebook) is best positioned to use data acquired from


its users to predict what you'd like to buy next?

Which company (Google/Apple/Amazon/Facebook) is best positioned to succeed in the mobile


world (daily life that is enabled by, and comes to depend on, smartphones, tablets, and other small,
mobile, easy-to-use computers)?

Explain: Each of the Fab Four (Google/Apple/Amazon/Facebook) believes that it can somehow
define the future of television. The honey pot? Not only that $70 billion in domestic ad revenue but also
$74 billion in cable-subscriber fees.

Which of the Fab Four is best positioned to dominate the distribution of television and movies?
Why?

One industry stands directly between the Fab Four (Google/Apple/Amazon/Facebook) and global
domination. It's an industry that frustrates you every day, one that consistently ranks at the bottom of
consumer satisfaction surveys, that poster child for stifling innovation and creativity: your phone carrier.
And your cable or DSL firm. For Amazon, Apple, Facebook, and Google, the world's wireless and
broadband companies are a blessing and a curse. By investing in the infrastructure that powers the
Internet, they've made the four firms' services possible. But the telcos and cable companies are also
gatekeepers to customers, and Amazon, Apple, Google, and Facebook would love to cut them out of the
equation. In the long run, they actually stand a shot at doing so. Research and then explain the
significance of this passage.

One of the technologies that Google released in the past few years anonymously tracks where its
Android phones go. So if you have a phone that is powered by Google's operating system, when you're
driving down the road, the phone might send back data to Google about how fast you're going, where
you are and various other statistics about your drive. And then Google can collect all of that information
from all the Android phones, and it can create a very accurate representation of traffic patterns in a city.
Research and then explain the significance of this passage.
Data powers new inventions: Google's voice-recognition system, its traffic maps, and its spell-checker
are all based on large-scale, anonymous customer tracking. Research and then explain the significance
of this passage.
Facebook is not a bystander in the competition to create the best possible digital shopping experience
for consumers--another battle for which those platforms are being built. To win this one means taking
turf from Amazon. Facebook Gifts is a new service in America which mines what the company knows
about its users, their tastes and their friendships to encourage them to buy and send each other gifts at
appropriate times, such as birthdays. Will Facebook Gifts ever become a meaningfully sized business?
Read this story and explain its significance:

http://qz.com/83243/amazon-apple-google-facebook-are-all-trying-to-turn-into-the-same-ubercompany/

Which company (Apple/Amazon/Facebook) is most likely to challenge Google's dominance in web


search? (Read the following passage.)

The other outfit standing between you and the Fab Four (Google/Apple/Amazon/Facebook) is one
that barely registers: your credit-card company. When you buy something through iTunes, the Android
Market, Amazon, or Facebook, the credit-card company gets a small cut of your payment. To these
giants, the cut represents a terrible inefficiency: why surrender all that cash to an interloper? And not
just any interloper, but an inefficient, unfriendly one that rarely innovates for its consumers. These
credit-card giants seem ripe for the picking. Here's how that scenario would play out. The first step is
getting consumers used to the idea of paying by phone. The second step is to encourage consumers to
link their bank accounts directly to their devices, thus eliminating the credit-card middleman. Research
and then explain the significance of this passage.
Review the organizational charts graphic in this story and tell us what you think it means.
http://www.ritholtz.com/blog/2013/07/organizational-charts-of-amazon-apple-facebook-microsoft/
Platforms are the weapons with which the warring factions seek to rule their own lands and conquer
new ones. Patents are the weapons with which they try straightforwardly to hurt their rivals. Although
some lawsuits have been launched by trolls who accumulate patents without actually making stuff, a
number have been launched by one giant, or a company acting as its cat's-paw, against one of the
others. Apple has been lobbing lawsuits around in the smartphone arena as if armed with a trebuchet.
Google snapped up Motorola Mobility in large part to get its hands on the firm's thousands of patents
issued and pending, thus bulking up its own defences and accumulating ammunition to fling at the
fortresses of the competition. Research and then explain the significance of this passage (clue: the point
here is how patents create power for these companies vis-à-vis each other).
Google is experimenting with a service that would let folk find goods online, order them and have
them delivered within a day for a modest fee. This seems similar to Amazon's hugely successful Prime
service, which costs $79 a year to join in America. Rather than try to replicate the e-commerce giant's
extensive network of warehouses, Google is looking for partnerships with shipping companies and
retailers instead. But if it is serious about taking on Amazon, it may ultimately have to buy a logistics
firm. At $69 billion, UPS has a market value less than a third of Google's; it is valued at less than twice
the search giant's cash pile. Research and then explain the significance of this passage.

Review the infographic in this story and share what you find compelling in it.
http://venturebeat.com/2013/06/25/every-day-tracking/
Write and respond to a lesson-relevant question of your choosing.

iTunes and the Economics of Music Distribution


Scott Cendrowski, fortune.cnn.com (2011)

The tech boom of the 1990s was thought to spell the death of plenty of brick-and-mortar
companies, but they coexisted with their e-rivals for years. It looks like those days are now
coming to an end.
We all knew that the 1990s tech boom would change the world. But then a funny thing
happened: For years brick-and-mortar companies happily coexisted with their e-rivals. Borders,
for instance, actually increased sales from 2000 to 2005 as it dueled Amazon (AMZN). Now those
days seem to be ending. Digital companies are so big, and growing so fast, that they're
obliterating old businesses. Consider these four examples: The U.S. Postal Service says it will be
insolvent by the end of 2011 without a bailout. Blockbuster and Borders have filed for
bankruptcy. And music stores keep closing.
Texts vs. Mail. The U.S. Postal Service is on track to lose $6 billion this year, as e-mails and
texting reduce mail volumes faster than postage fees can rise.
Netflix vs. Blockbuster. Blockbuster hit 4,000 stores in two decades. Then, in 1997, Reed
Hastings got charged a $40 late fee on Apollo 13 and founded Netflix (NFLX). The rest is history.
Amazon vs. Borders. Amazon almost single-handedly bankrupted the No. 2 bookseller in a
decade. Barnes & Noble (BKS) is fighting back with its Nook.
iTunes vs. CDs. iTunes made its debut in 2003, with devastating effects on music retailers.
Tower Records went bust in 2004. Musicland folded in 2006. FYE has shriveled.
Coming Soon: The End of Movie Theatres?
Andrew Keen, The Cult of the Amateur (2008)
The Internet is beginning to undermine the viability of the movie theater. ClickStar, an Intel-funded
start-up founded by actor Morgan Freeman and launched in December 2006, is debuting
some independent films on the Internet the same day they are released in the theaters. Such
practices, which go against long-held Hollywood strategy, will compound the crisis facing movie
theaters. When a movie is available on the Internet as soon as it has been released, why go to
the extra inconvenience and cost of seeing it in a local theater? For many technophiles
accustomed to watching all media on their computers already, the big screen viewing experience
of the multiplex will hardly be missed.
Walled gardens
Tim Berners-Lee, Scientific American (2010)
In contrast, not using open standards creates closed worlds. Apple's iTunes system, for example,
identifies songs and videos using URIs that are open. But instead of http: the addresses begin
with itunes:, which is proprietary. You can access an itunes: link only using Apple's
proprietary iTunes program. You can't make a link to any information in the iTunes world (a song
or information about a band). You can't send that link to someone else to see. You are no longer
on the Web. The iTunes world is centralized and walled off. You are trapped in a single store,
rather than being on the open marketplace. For all the store's wonderful features, its evolution is
limited to what one company thinks up. Other companies are also creating closed worlds. The
tendency for magazines, for example, to produce smartphone apps rather than Web apps is
disturbing, because that material is off the Web. You can't bookmark it or e-mail a link to a page
within it. You can't tweet it. It is better to build a Web app that will also run on smartphone
browsers, and the techniques for doing so are getting better all the time.
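Berners-Lee's point turns on the URI scheme, the part of an address before the colon. A minimal Python sketch (both addresses below are invented for illustration, not real links) shows that an open http: link and a proprietary itunes: link differ only in that scheme; what changes is which programs are able to follow them:

```python
from urllib.parse import urlparse

# Hypothetical example addresses, made up for illustration.
links = [
    "http://example.com/bands/some-band",  # open Web scheme
    "itunes://music.example/song/12345",   # proprietary scheme
]

for link in links:
    scheme = urlparse(link).scheme
    # Any browser can follow http:/https: links; only Apple's own
    # iTunes program registers itself as a handler for itunes: links.
    kind = "open standard" if scheme in ("http", "https") else "proprietary handler"
    print(f"{scheme}: {kind}")
```

The rest of each link's anatomy is identical, which is why a closed scheme walls content off from the Web's shared machinery (bookmarks, e-mail, search) rather than adding anything to it.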
The web as we know it is being threatened
Tim Berners-Lee, Scientific American (2010)
The Web as we know it, however, is being threatened in different ways. Some of its most
successful inhabitants have begun to chip away at its principles. Large social-networking sites
are walling off information posted by their users from the rest of the Web. Wireless Internet
providers are being tempted to slow traffic to sites with which they have not made deals.
Governments, totalitarian and democratic alike, are monitoring people's online habits,
endangering important human rights. Several threats to the Web's universality have arisen
recently. Cable television companies that sell Internet connectivity are considering whether to
limit their Internet users to downloading only the company's mix of entertainment. Social-networking
sites present a different kind of problem. Facebook, LinkedIn, Friendster and others
typically provide value by capturing information as you enter it: your birthday, your e-mail
address, your likes, and links indicating who is friends with whom and who is in which
photograph. The sites assemble these bits of data into brilliant databases and reuse the
information to provide value-added services, but only within their sites. Once you enter your data
into one of these services, you cannot easily use them on another site. Each site is a silo, walled
off from the others. Yes, your site's pages are on the Web, but your data are not.

In-class resources for Tue Nov 17 (class 16)


Who will dominate the digital economy?
Digerati video and reading
(discussion leader: x)
Digerati
Alexis Ohanian
@alexisohanian
https://en.wikipedia.org/wiki/Alexis_Ohanian
Video
We will watch and discuss this in class.
Alexis Ohanian on Colbert Report (6:50)
http://thecolbertreport.cc.com/videos/wrbvsm/alexis-ohanian
Reading
We will read and discuss this in class.
Without Their Permission: How the 21st Century Will Be Made, Not Managed
(Alexis Ohanian, 2013)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Google growth
Apple growth
Facebook growth
Amazon growth
GAFA competition
The internet of things (IoT)

Without Their Permission: How the 21st Century Will Be Made, Not Managed

Homework for Thu Nov 19 (class 17)


The dangers of GAFA
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Excerpts: The dangers of GAFA
Video
Watch the assigned video on Zaption and post a one-sentence response.
The Google Master Plan (3:12)
http://zapt.io/tbrek9s7
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to the prompt on Canvas:
Is Google evil? (expand to GAFA)
Stay in groups?

Lock-in
Eli Pariser, The Filter Bubble: What the Internet is Hiding From You (2011)
Which brings us to lock-in. Lock-in is the point at which users are so invested in their technology
that even if competitors might offer better services, it's not worth making the switch. If you're a
Facebook member, think about what it'd take to get you to switch to another social networking
site, even if the site had vastly greater features. It'd probably take a lot: re-creating your whole
profile, uploading all of those pictures, and laboriously entering your friends' names would be
extremely tedious. You're pretty locked in. Likewise, Gmail, Google Play, Google Drive, and a host
of other products are part of an orchestrated campaign for Google lock-in. The fight between
Google and Facebook hinges on which can achieve lock-in for the most users.
Google is big
Scott Cleland, Search & Destroy: Why You Can't Trust Google, Inc. (2011)
Google has more than one billion users. Google dominates both search and search advertising.
Google handles more than two billion Internet searches per day. Google's tentacles extend to
every major type of content, every major hardware platform, every nook and cranny of the Web,
and every corner of the globe. Google has indexed over one trillion Web pages. If you spent one
minute scanning each page indexed by Google, then you would need more than 38,000 years to
scan them all. Gmail's data repository is equivalent to about 1.74 billion music CDs. If you
printed the information Google processes each day, then you would need to cut down 1.2 million
trees.
A list of information that Google collects
Scott Cleland, Search & Destroy: Why You Can't Trust Google, Inc. (2011)

Your interests, desires and needs (e.g., Search)
Your search history (e.g., Web History)
The websites you visit (e.g., Chrome)
The videos that you watch (e.g., YouTube)
The news, commentary, and books that you read (e.g., Google Books)
The topics that you discuss (e.g., Google Groups)
The content that you produce (e.g., Gmail)
Your, your family's, and your friends' faces (e.g., Picasa)
The sound of your voice and the people you call (e.g., Google Talk)
Your medical history and prescriptions (e.g., Google Health)
Your purchases (e.g., Google Maps)
Your locations of interest (e.g., Google Street View)
Your personal information (e.g., Checkout)
Your home, workplace, and hangouts (e.g., Google Latitude)
Your activity plans (e.g., Google Calendar)
The data stored on your computer (e.g., Google Desktop with Search across computers enabled)
The TV programs you watch (e.g., Google TV)
This is by no means a complete list; there are hundreds of Google products. Google also offers
tools that enable application developers to gather information and invests in companies
developing new sources of information.
For example, Google has invested in 23andMe, a company that is helping individuals understand
their own genetic information using recent advances in DNA analysis technologies and web-based
interactive tools. Personal genomes could be the ultimate tool for targeted advertising,
personalization, and even medical fortune telling.
YouTube, the tracking tool
Scott Cleland, Search & Destroy: Why You Can't Trust Google, Inc. (2011)
YouTube is one of Google's most powerful tracking tools. It was acquired by Google in 2006 for $1.65
billion (an astounding figure given that YouTube did not have a viable business model at the
time). YouTube is the leading video-sharing website. YouTube makes it easy for blogs and other
websites to embed YouTube videos in their Web pages. As we've seen, Google is informed every
time your browser loads a Web page with embedded YouTube videos. Google not only keeps a
log of the YouTube videos you watch, it associates the log with your true identity.
Hmm...
Scott Cleland, Search & Destroy: Why You Can't Trust Google, Inc. (2011)
Google has patented a system for monitoring the way you move the on-screen pointer with your
PC's mouse. It's also been reported that Google is developing a method for listening to
background sounds picked up by your PC's microphone.
Gmail Privacy Concerns
Scott Cleland, Search & Destroy: Why You Can't Trust Google, Inc. (2011)
Gmail scans all of your email: both the email you compose and send to others and the email
that others compose and send to you. By subscribing to Gmail, you are giving Google permission
to scan email you receive from people who are not Gmail subscribers, who have not given
Google permission to scan and store their email, and who might be horrified if they knew that
their messages were being scanned and permanently stored by a third party. Less well known is
the risk posed by a Gmail feature called auto-save. Many desktop programs, including email
programs, automatically save draft documents as you compose them. This comes in handy if the
program crashes or your PC loses power before you save your most recent work. While desktop
applications typically save drafts to your PC, auto-save sends draft messages over the Internet
and saves them on Google's servers. If you compose a message out of anger, and later decide to
replace it with a calmer message, Google may retain a copy of the embarrassing draft.
Google and the NSA
Scott Cleland, Search & Destroy: Why You Can't Trust Google, Inc. (2011)
More recently, Google began collaborating with the National Security Agency (NSA) for the
ostensible purpose of thwarting cyberattacks. The NSA is chartered to gather intelligence from
foreign communications. However, this often involves monitoring communications between
people in foreign countries and people in the U.S. While cyberattacks on the U.S. information
infrastructure are a legitimate national security concern, collaboration between the world's
biggest commercial data mining operation and the NSA presents myriad opportunities for abuse.
What happens when Google changes its mind?
Siva Vaidhyanathan, The Googlization of Everything (2011)
The main risk of the privatization of book content is simple: libraries and universities last, but
companies wither and fail. Should we entrust our heritage and collective knowledge to a
company that has been around for less than fifteen years? What will happen if stockholders
decide that Google Books is a money loser or too much of a liability? What if they decide that the
infrastructure costs of keeping all those files on all those servers are not justifiable?
Is Google too big to fail?
Stephen Gandel, TIME magazine (2011)
Another question: Is Google too big to fail? The government would probably weigh the impact on
Web business and the economy in general if Google were to be broken up.
Google's Monopoly
Stephen Gandel, TIME magazine (2011)
It is clear that Google has a tight hold on the Internet. And that has been clearer than ever in the
past year. The New York Times has run a number of stories showing how companies have been
able to boost their sales by gaming Google's search algorithms. When Google changed its
algorithm, those sites fell off the search page, basically hiding them from the world. The stories
also showed what lengths companies will go to not to cross Google, and what they will do to try
to make up. If that's not a sign of monopoly, I don't know what is. And that's just the search side
of the business. Google's power on the advertising side is even more troublesome. If you are a
small company and Google won't take your advertisement, you basically can't market your
wares on the Web, at least not to any sizable audience.
Nothing is free in this world
Siva Vaidhyanathan, The Googlization of Everything (and why we should worry) (2011)
One of the great attractions of Google is that it appears to offer so many powerful services for
free, that is, for no remuneration. But there is an implicit nonmonetary transaction between
Google and its users. Google gives us Web search, e-mail, Blogger platforms, and YouTube
videos. In return, Google gets information about our habits and predilections so that it can more
efficiently target advertisements at us. Google's core business is consumer profiling. It generates
dossiers on many of us. It stores cookies in our Web browsers to track our clicks and
curiosities. Yet we have no idea how substantial or accurate these digital portraits are. This book
generates a fuller picture of what is at stake in this apparently costless transaction and a new
account of surveillance that goes beyond the now-trite Panopticon model.
Thoughts on asymmetrical power
Eli Pariser, The Filter Bubble: What the Internet is Hiding From You (2011)
One of the defining traits of the new personal information environment is that it's asymmetrical.
As Jonathan Zittrain argues in The Future of the Internet: And How to Stop It, nowadays, "an
individual must increasingly give information about himself to large and relatively faceless
institutions, for handling and use by strangers, unknown, unseen, and all too frequently,
unresponsive."
In a small town or an apartment building with paper-thin walls, what I know about you is roughly
the same as what you know about me. That's a basis for a social contract, in which we'll
deliberately ignore some of what we know. The new privacyless world does away with that
contract. I can know a lot about you without your knowing I know. "There's an implicit bargain in
our behavior," search expert John Battelle told me, "that we haven't done the math on."
If Sir Francis Bacon is right that knowledge is power, privacy proponent Viktor Mayer-Schonberger
writes that what we're witnessing now is nothing less than a "redistribution of
information power from the powerless to the powerful." It'd be one thing if we all knew
everything about each other. It's another when centralized entities know a lot more about us
than we know about each other, and sometimes, more than we know about ourselves. If
knowledge is power, then asymmetries in knowledge are asymmetries in power.
Google's famous "Don't be evil" motto is presumably intended to allay some of these concerns. I
once explained to a Google search engineer that while I didn't think the company was currently
evil, it seemed to have at its fingertips everything it needed to do evil if it wished. He smiled
broadly. "Right," he said. "We're not evil. We try really hard not to be evil. But if we wanted to,
man, could we ever!"

In-class resources for Thu Nov 19 (class 17)


The dangers of GAFA
Digerati video and reading
(discussion leader: x)
Digerati
Larry Page and Sergey Brin
http://www.google.com/about/company/facts/management/
Video
We will watch and discuss this in class.
Larry Page TED Talk interview with Charlie Rose (23:00-edit)
https://www.ted.com/talks/larry_page_where_s_google_going_next?language=en
Reading
We will read and discuss this in class.
The Google Chronicles: 7 Facts on Founders Larry Page & Sergey Brin
(Sara Kettler, biography.com, 2014)
http://www.biography.com/news/google-founders-history-facts
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Google evil
Apple evil
Facebook evil
Amazon evil
Uber evil
GAFA evil
How technology is changing the way we work

The Google Chronicles: 7 Facts on Founders Larry Page & Sergey Brin

Sunday Story #10


Due Sun Nov 22 (before midnight)
Prompt
Medium gender race tbd

Homework for Tue Nov 24 (class 18)


Women and Technology
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Lean In: Women, Work, and the Will to Lead
(Sheryl Sandberg, 2013)
Excerpts: Technology, Gender and Race
Video
Watch the assigned videos on Zaption and post a one-sentence response to each.
Sheryl Sandberg (52:00); watch at least 8 minutes of this video
http://zapt.io/t9kdkkme
Sheryl Sandberg shares grief (2:49)
http://zapt.io/t2mrkvyx
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet

Lean In: Women, Work and the Will to Lead - excerpts


The imminent death of the internet troll
The Atlantic (2014)
In a Pew Research Center survey of 2,849 Internet users, one out of every four women between
18 and 24 years old reports having been stalked or sexually harassed online. Two out of
five people said they'd been victims of some form of online harassment. And nearly three-quarters
of respondents said they'd witnessed harassment online. Put simply, online life, from the
earliest days of dialing into a BBS to new services like Yik Yak, has been marked by significant
hatred and harassment of fellow users. And if you happen to be a female user of these services,
it can be much, much worse.
Pew attempted to measure two broad but overlapping types of harassment: less severe
harassment like name-calling and attempts to embarrass users, versus more severe forms like
physical threats, harassment over a long and sustained period of time, stalking, and sexual
harassment. The findings evoke the axiom, widely attributed to Margaret Atwood: "Men are
afraid that women will laugh at them. Women are afraid that men will kill them." Men, on the
whole, report higher rates of less severe types of harassment (with the exception of physical
threats), while women are more likely to be the focus of the two most frightening forms of it:
sexual harassment and stalking.
[Chart: Men and Women Experience Different Varieties of Online Harassment (%)]

We are in the early days of online harassment being taken as a serious problem, and not simply
a quirk of online life. The likely solution will be a combination of things. The expansion of laws
like the one currently on the books in California, which expands what constitutes online
harassment, could help put the pressure on harassers. The upcoming Supreme Court case, Elonis
v. United States, looks to test the limits of free speech versus threatening comments on
Facebook. But there are limits to legal action. "Law can only do so much," says Citron. "It's a
blunt instrument." Which leaves societal pressure. Sexual harassment in the workplace has been
greatly reduced not just because employers are suddenly liable; there's also a huge social
stigma against those who sexually harass their co-workers. It's difficult to see, in 2014, how
exactly to stop an anonymous person sitting behind a keyboard from making the lives of others
miserable if they so choose. Can a combination of legal action, market pressure, and societal
taboo work together to curb harassment? Too many people do too much online for things to stay
the way they are.

Coarseness and vulgarity, by default

Christine Rosen, The New Atlantis (2007)


A kind of coarseness and vulgarity is commonplace on social networking sites for a reason: it's
an easy way to set oneself apart. Pharaohs and kings once celebrated themselves by erecting
towering statues or, like the emperor Augustus, placing their own visages on coins. But now, as
the insightful technology observer Jaron Lanier has written, "Since there are only a few
archetypes, ideals, or icons to strive for in comparison to the vastness of instances of everything
online, quirks and idiosyncrasies stand out better than grandeur in this new domain." I imagine
Augustus's MySpace page would have pictured him picking his nose.

In-class resources for Tue Nov 24 (class 18)


Lean In: Women, Work and the Will to Lead
Digerati video and reading
(discussion leader: x)
Digerati
Sheryl Sandberg
@sherylsandberg
https://en.wikipedia.org/wiki/Sheryl_Sandberg
Video
We will watch and discuss this in class.
Sheryl Sandberg on The Daily Show (6:32)
http://thedailyshow.cc.com/videos/f1mqki/sheryl-sandberg
Reading
We will read and discuss this in class.
Lean In: Women, Work, and the Will to Lead
(Sheryl Sandberg, 2013)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Technology, gender and race

No Sunday Story for Sun Nov 29


Happy Thanksgiving!

Homework for Tue Dec 1 (class 19)


Her
No homework for Tue Dec 1: Happy Thanksgiving!

In-class resources for Tue Dec 1 (class 19)


Her
Links:
(on Canvas lesson page)
her
Read the article: The philosophy of her by Susan Schneider.
Schneider writes:
Her raises a question that has long preoccupied philosophers: might we humans one day be
able to upload our own minds to computers, untethered from a body that's inevitably going to die?
Please respond to this prompt for online discussion:
If this technology were to exist in your lifetime, would you choose to create a
digital copy of yourself?
Read the article: The other Singularity by Christine Rosen.
Please respond online to this question which Rosen asks in her article:
Could a perfectly-calibrated machine intelligence make you happier than another
person?

Homework for Thu Dec 3 (class 20)


Why we expect more from technology and less from each other
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Excerpts: Why we expect more from technology and less from each other
Video
Watch the assigned video on Zaption and post a one-sentence response.
Scarlett Johansson on The Daily Show (7:00)
http://zapt.io/tvder2sj
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
How does social media affect romance? (relate to her?)

How is social media affecting our relationships with family?


Postfamilial families
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
In the evening, when sensibilities such as these come together, they are likely to form what have
been called postfamilial families. Their members are alone together, each in their own rooms,
each on a networked computer or mobile device. We go online because we are busy but end up
spending more time with technology and less with each other. We defend connectivity as a way
to be close, even as we effectively hide from each other. At the limit, we will settle for the
inanimate, if that's what it takes.
Does Facebook make us withdraw from our friends and family?
John Freeman, The Tyranny of Email (2009)
The most glaring discovery of the Stanford University study was not that people burned up two
hours a day on the Internet but that those two hours came out of time they would normally
spend with family and friends. Once that withdrawal has begun and technology has been
identified as a way to connect, it's a hard cycle to break. We blog, broadcast our vacations on
YouTube, obsessively update the newsfeeds of our Facebook pages ("Today, Brian is feeling
happy"), as if an experience, an emotion, a task completed hasn't actually happened unless it
has been recorded and shared with others. Facebook is the biggest, broadest highway on which
this outward projection occurs. Why write a postcard about your trip to France to one friend when
you can simply share the message with all your friends?
If we spend our evening online trading short messages over Facebook with friends thousands of
miles away rather than going to our local pub or park with a friend, we are effectively
withdrawing from the people we could turn to for solace, humor, and friendship, not to mention
the places we could go to do this. We trade the complicated reality of friendship for its vacuum-packed
idea. We exchange the real sensual pleasure of sharing a meal or going for a walk
(activities that sustain the tangible commons) for the disembodied excitement of being talked to
or heard online. Sitting at an outdoor café and having a conversation, browsing for books with a
friend in a bookstore, we cannot help but confront the physical world and help maintain its
upkeep; chatting online, we can go hours without remembering we are looking into a screen.

Parents staring at screens


Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
One high school senior recalls a time when his father used to sit next to him on the couch,
reading. He read for pleasure and didn't mind being interrupted. But when his father, a doctor,
switched from books to his BlackBerry, things became less clear: "He could be playing a game or
looking at a patient record, and you would never know.... He is in that same BlackBerry zone." It
takes work to bring his father out of that zone. When he emerges, he needs time to refocus. "You
might ask him a question and he'll say, 'Yeah, one second.' And then he'll finish typing his
e-mail or whatever, he'll log off whatever, and he'll say, 'Yeah, I'm sorry, what did you say?'" It is
commonplace to hear children, from the age of eight through the teen years, describe the
frustration of trying to get the attention of their multitasking parents. Now, these same children
are insecure about having each other's attention. At night, as they sit at computer screens, any
messages sent or received share mind space with shopping, uploading photos, updating
Facebook, watching videos, playing games, and doing homework. One high school senior
describes evening conversation at his machine: "When I'm IMing, I can be talking to three
different people at the same time and listening to music and also looking at a website." During
the day, prime time for phone texting, communications happen as teenagers are on their way
from one thing to another. Teenagers talk about what they are losing when they text: how
someone stands, the tone of their voice, the expression on their face, "the things your eyes and
ears tell you," as one eighteen-year-old puts it.

Has social media made friendship less demanding?


On Facebook Friends
Jesse Rice, The Church of Facebook: How the Hyperconnected are Redefining Reality (2009)
A Facebook friendship really isn't too demanding of a relationship. Our Facebook connections
typically require little thought or action on our part. We don't have to work hard at them, or offer
much of ourselves in return. We don't have to take responsibility for anyone. We get to enjoy
glimpses into our friends' lives both old and new without all that messy "getting to know you"
business. And perhaps most importantly to us, we get to reveal and withhold whatever we feel
like. We are in control. We do not answer to anything other than our own temporal wishes.
Online friendships can be fleeting
John Palfrey, Born Digital: Understanding the First Generation of Digital Natives (2008)
Online friendships are based on many of the same things as traditional friendships (shared
interests, frequent interaction) but they nonetheless have a very different tenor: They are often
fleeting; they are easy to enter into and easy to leave, without so much as a goodbye; and they
are also perhaps enduring in ways we have yet to understand.
The Bureaucratization of Friendship
Christine Rosen, The New Atlantis (2007)
The structure of social networking sites also encourages the bureaucratization of friendship.
Each site has its own terminology, but among the words that users employ most often is
"managing." The Pew survey mentioned earlier found that teens say social networking sites
help them "manage" their friendships. There is something Orwellian about the management-speak
on social networking sites: "Change My Top Friends," "View All of My Friends," and, for
those times when our inner Stalins sense the need for a virtual purge, "Edit Friends." With a few
mouse clicks one can elevate or downgrade (or entirely eliminate) a relationship.
"Friendships" with minimal effort
Christine Rosen, The New Atlantis (2007)
Perhaps we should praise social networking websites for streamlining friendship the way e-mail
streamlined correspondence. In the nineteenth century, Emerson observed that friendship
requires "more time than poor busy men can usually command." Now, technology has given us
the freedom to tap into our network of friends when it is convenient for us. "It's a way of
maintaining a friendship without having to make any effort whatsoever," as a recent graduate of
Harvard explained to The New Yorker.
Where's Gina?
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
The popular urban high school senior has over five hundred followers on Twitter (most of them
real people) reading her every post to find out where the action is tonight. I'm trailing her, along
with a youth culture trendspotter, to see what she does on a typical Friday night: how she makes
her decisions, and how she communicates them to her ever-growing posse of followers. Gina is a
trendsetter, a social leader, and a creature of the moment, in more ways than one.
She's at a club on the Upper East Side, but seems oblivious to the boys and the music. Instead of
engaging with those around her, she's scrolling through text messages on her phone, from
friends at other parties, bars, and clubs throughout New York. She needs to know if the event
she's at is the event to be at, or whether something better is happening at that very moment,
somewhere else. Sure enough, a blip on the display catches her interest, and in what seems like
seconds we're in a cab headed for the East Village.
We arrive at a seemingly identical party, but it's the one that Gina has decided is the place to
be tonight. Instead of turning the phone off and enjoying herself, however, she turns her phone
around, activates the camera, and proceeds to take pictures of herself and her friends, instantly
uploading them to her Facebook page for the world to see. She does this for about an hour, until
a message comes through one of her networks and she's off to the next location for the cycle to
begin all over again.
Gina is the girl who is everywhere at once, yet ultimately nowhere at all. She is already
violating the first command by maintaining an "always on" relationship to her devices and
networks. This has in turn fostered her manic, compulsive need to keep tabs on everything
everyone else is doing at all times. It has not only removed her from linear time, however, but
also from physical place. She relates to her friends through the network, while practically
ignoring whomever she is with at the moment. She relates to the places and people she is
actually with only insofar as they are suitable for transmission to others in remote locations. The
most social girl in her class doesn't really socialize in the real world at all.
No such thing as full attention
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
Teenagers know that when they communicate by instant message, they compete with many
other windows on a computer screen. They know how little attention they are getting because
they know how little they give to the instant messages they receive. One sophomore girl at
Branscomb High School compares instant messaging to being on "cruise control" or "automatic
pilot": "Your attention is elsewhere." A Branscomb senior says, "Even if I give my full attention to
the person I am IMing . . . they are not giving full attention to me." The first thing he does when
he makes a call is to gauge whether the person on the other end is "there just for me." This is
one advantage of a call. When you text or instant-message, you have no way to tell how much
else is going on for the person writing you. He or she could also be on the phone, doing
homework, watching TV, or in the midst of other online conversations.
On stalking and showers
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
So, stalking is a transgression that does not transgress. A seventeen-year-old junior at the
Fillmore School describes it as "the worst. Normal, but still creepy." Normal because it's not
against the rules to look at people's wall-to-wall conversations [on Facebook]. Creepy because
"it's like listening to a conversation that you are not in, and after stalking I feel like I need to take
a shower." Just starting college, Dawn, eighteen, says she is obsessed with the interesting
people who are her new classmates: "I spend all night reading people's walls. I track their
parties. I check out their girlfriends." She, too, says, "My time on Facebook makes me feel dirty."
So stalking may not be breaking any rules, but it has given young people a way to invade each
other's privacy that can make them feel like spies and pornographers.
Goodbye empathy
Brian Chen, Always On (2011)
In another recent study, based on surveys measuring empathy among almost fourteen thousand
college students over the last thirty years, University of Michigan researchers found that today's
college students are significantly less empathetic than college students of the 1980s and 1990s.
The researchers suggested that perhaps connecting with friends online makes shutting out
real-world issues easier. "The ease of having friends online might make people more likely to just
tune out when they don't feel like responding to others' problems, a behavior that could carry
over offline," said Edward O'Brien, a University of Michigan graduate student who helped with
the study.

How does social media affect romantic relationships?


What is intimacy without privacy?
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
Today's adolescents have no less need than those of previous generations to learn empathic
skills, to think about their values and identity, and to manage and express feelings. They need
time to discover themselves, time to think. But technology, put in the service of always-on
communication and telegraphic speed and brevity, has changed the rules of engagement with all
of this. When is downtime, when is stillness? The text-driven world of rapid response does not
make self-reflection impossible but does little to cultivate it. When interchanges are reformatted
for the small screen and reduced to the emotional shorthand of emoticons, there are necessary
simplifications. And what of adolescents' need for secrets, for marking out what is theirs alone? I
wonder about this as I watch cell phones passed around high school cafeterias. Photos and
messages are being shared and compared. I cannot help but identify with the people who sent
the messages to these wandering phones. Do they all assume that their words and photographs
are on public display? Perhaps. Traditionally, the development of intimacy required privacy.
Intimacy without privacy reinvents what intimacy means.
Does virtual intimacy degrade other experiences?
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
As we instant-message, e-mail, text, and Twitter, technology redraws the boundaries between
intimacy and solitude. We talk of "getting rid" of our e-mails, as though these notes are so much
excess baggage. Teenagers avoid making telephone calls, fearful that they "reveal too much."
They would rather text than talk. Adults, too, choose keyboards over the human voice. It is more
efficient, they say. "Things that happen in real time take too much time." Tethered to technology,
we are shaken when that world "unplugged" does not signify, does not satisfy. After an evening
of avatar-to-avatar talk in a networked game, we feel, at one moment, in possession of a full
social life and, in the next, curiously isolated, in tenuous complicity with strangers. We build a
following on Facebook or MySpace and wonder to what degree our followers are friends. We
recreate ourselves as online personae and give ourselves new bodies, homes, jobs, and
romances. Yet, suddenly, in the half-light of virtual community, we may feel utterly alone. As we
distribute ourselves, we may abandon ourselves. Sometimes people experience no sense of
having communicated after hours of connection. And they report feelings of closeness when they
are paying little attention. In all of this, there is a nagging question: Does virtual intimacy
degrade our experience of the other kind and, indeed, of all encounters, of any kind?
Hating the phone
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
(2011)
In corporations, among friends, and within academic departments, people readily admit that
they would rather leave a voicemail or send an e-mail than talk face-to-face. Some who say "I
live my life on my BlackBerry" are forthright about avoiding the real-time commitment of a
phone call. The new technologies allow us to dial down human contact, to titrate its nature and
extent. I recently overheard a conversation in a restaurant between two women. "No one
answers the phone in our house anymore," the first woman proclaimed with some consternation.
"It used to be that the kids would race to pick up the phone. Now they are up in their rooms,
knowing no one is going to call them, and texting and going on Facebook or whatever instead."
Parents with teenage children will be nodding at this very familiar story in recognition and
perhaps a sense of wonderment that this has happened, and so quickly. And teenagers will
simply be saying, "Well, what's your point?" A thirteen-year-old tells me she hates the phone
and never listens to voicemail. Texting offers just the right amount of access, just the right
amount of control. She is a modern Goldilocks: for her, texting puts people not too close, not too
far, but at just the right distance. The world is now full of modern Goldilockses, people who take
comfort in being in touch with a lot of people whom they also keep at bay. A twenty-one-year-old
college student reflects on the new balance: "I don't use my phone for calls any more. I don't
have the time to just go on and on. I like texting, Twitter, looking at someone's Facebook wall. I
learn what I need to know."
Technologies live in complex ecologies. The meaning of any one depends on what others are
available. The telephone was once a way to touch base or ask a simple question. But once you
have access to e-mail, instant messaging, and texting, things change. Although we still use the
phone to keep up with those closest to us, we use it less outside this circle. Not only do people
say that a phone call asks too much, they worry it will be received as demanding too much.
Randolph, a forty-six-year-old architect with two jobs, two young children, and a twelve-year-old
son from a former marriage, makes both points. He avoids the telephone because he feels
"tapped out. . . . It promises more than I'm willing to deliver." If he keeps his communications to
text and e-mail, he believes he can "keep it together." He explains, "Now that there is e-mail,
people expect that a call will be more complicated. Not about facts. A fuller thing. People expect
it to take time, or else you wouldn't have called."
Me and my machine
William Powers, Hamlets Blackberry: Building a Good Life in the Digital Age (2010)
Educator and writer Lowell Monke shared with his students a troubling study that showed that
many young people prefer to interact with machines rather than directly with human beings. The
next day, one of the students sent him an e-mail explaining why this might be: "I do feel deeply
disturbed when I can run errand after errand, and complete one task after another with the help
of bank clerks, cashiers, postal employees, and hairstylists without ANY eye contact at all! After
a wicked morning of that, I am ready to conduct all business online." In a society in which adults
so commonly treat each other mechanically, Monke writes, perhaps we shouldn't be surprised
that our youth are more attracted to machines. We believe in our screens so much, we've
placed them at the center of our lives, so why shouldn't they?
Giving up face-to-face
Christine Rosen, The New Atlantis (2007)
We should also take note of the trend toward giving up face-to-face for virtual contact, and, in
some cases, a preference for the latter. Today, many of our cultural, social, and political
interactions take place through eminently convenient technological surrogates. Why go to the
bank if you can use the ATM? Why browse in a bookstore when you can simply peruse the
personalized selections Amazon.com has made for you? These virtual networks greatly expand
our opportunities to meet others, but they might also result in our valuing less the capacity for
genuine connection. As the young woman writing in the Times admitted, "I consistently trade
actual human contact for the more reliable high of smiles on MySpace, winks on Match.com, and
pokes on Facebook." That she finds these online relationships more reliable is telling: it shows a
desire to avoid the vulnerability and uncertainty that true friendship entails. Real intimacy
requires risk: the risk of disapproval, of heartache, of being thought a fool. Social networking
websites may make relationships more reliable, but whether those relationships can be humanly
satisfying remains to be seen.

In-class resources for Thu Dec 3 (class 20)


Why we expect more from technology and less from each other
Digerati video and reading
(discussion leader: x)
Digerati
Sherry Turkle
@sturkle
https://en.wikipedia.org/wiki/Sherry_Turkle
Video
We will watch and discuss this in class.
Sherry Turkle on Colbert Report (6:45)
http://thecolbertreport.cc.com/videos/kd5rmr/sherry-turkle
Reading
We will read and discuss this in class.
Reclaiming Conversation: The Power of Talk in the Digital Age
(Sherry Turkle, 2015)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
What is social media's effect on romance?
What is social media's effect on friendship?
Is Facebook making us unhappy?
Is Facebook turning us into narcissists?
Is Facebook doomed?
Why we love Snapchat
Hold the press. The whole "social media is hurting us" thing is overblown.
Sherry Turkle

Reclaiming Conversation: The Power of Talk in the Digital Age


(Sherry Turkle, 2015)
Sunday Story #11


Due Sun Dec 6 (before midnight)
Prompt
medium 20 links

Homework for Tue Dec 8 (class 21)


The App Generation
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: Digital Natives
Excerpts: The App Generation (Katie Davis and Howard Gardner, 2015)
Video
Watch the assigned video on Zaption and post a one-sentence response.
Jonathan Safran Foer (21:00) watch at least half of this video
http://zapt.io/tpu3tzaj
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Davis: App Generation excerpts
link

Net Generation Norms


Don Tapscott, Grown Up Digital (2010)
They want freedom in everything they do, from freedom of choice to freedom of
expression. We all love freedom, but not like this generation. Choice is like oxygen to them.
While older generations feel overwhelmed by the proliferation of sales channels, product
types, and brands, the Net Gen takes it for granted. Net Geners leverage technology to cut
through the clutter and find the marketing message that fits their needs. They also expect to
choose where and when they work. Net Geners seek the freedom to change jobs, freedom to
take their own path, and to express themselves.
They love to customize, personalize. When I was a kid, I never got to customize The
Mickey Mouse Club. Today's youth can change the media world around them: their desktop,
Web site, ring tone, handle, screen saver, news sources, and entertainment. They have grown
up getting what media they want, when they want it, and being able to change it. Millions
around the world don't just access the Web, they are creating it by creating online content.
They are the new scrutinizers. When I was young, a picture was a picture. No more.
Transparency, namely stakeholder access to pertinent information about companies and their
offerings, just seems natural to the Net Gen. While older generations marvel at the consumer
research available on the Internet, the Net Gen expects it. As they grow older, their online
engagement increases. Businesses targeting the Net Gen should expect and welcome intense
scrutiny of its products, promotional efforts, and corporate practices.
They look for corporate integrity and openness when deciding what to buy and
where to work. The Internet, and other information and communication technologies, strip
away the barriers between companies and their various constituencies, including consumers,
activists, and shareholders. Whether consumers are exposing a flawed viral marketing
campaign or researching a future employer, Net Geners make sure company values align with
their own.
The Net Gen wants entertainment and play in their work, education, and social life.
This generation brings a playful mentality to work. From their experience in the latest video
game, they know that there's always more than one way to achieve a goal. This outside-the-box
thinking results from 82 percent of American children aged 2 to 17 having regular access to
video games. It's a fast-growing industry: in the United States, video game sales were $8.4
billion in 2005, with worldwide sales expected to hit $46.5 billion by 2010.
They are the collaboration and relationship generation. Today, youth collaborate on
Facebook; play multiuser video games; text each other incessantly; and share files for school,
work, or just for fun. As evidenced by sites such as Yub.com, they also engage in
relationship-oriented purchasing. Nine out of ten young people we interviewed said that if a best
friend recommends a product, they are likely to buy it.
The Net Gen has a need for speed, and not just in video games. Real-time chats with a
database of global contacts have made rapid communication the new norm for the Net
Generation. In a world where speed characterizes the flow of information among vast
networks of people, communication with friends, colleagues, and superiors takes place faster
than ever. And marketers and employers should realize that Net Geners expect the same
quick communication from others; every instant message should draw an instant response.
They are the innovators. When I was young, the pace of innovation was glacial. Today it's
on hyperdrive. A twentysomething in the workforce wants the new BlackBerry, Palm, or iPhone
not because the old one is no longer cool, but because the new one does so much more. They
seek innovative companies as employers and are constantly looking for innovative ways to
collaborate, entertain themselves, learn, and work.

The Dark Side


Don Tapscott, Grown Up Digital (2010)
There are concerns and criticisms of this generation that are voiced by everyone from parents to
frustrated employers. Many academics, journalists, and pundits present skeptical, negative,
even cynical views of the Net Generation. They're dumber than we were at their age. You
hear different variations of this popular theme. "They don't know anything," writes Mark
Bauerlein in The Dumbest Generation: Net Geners are a portrait of "vigorous, indiscriminate
ignorance." From novelist Robert Bly: "Today we are lying to ourselves about the renaissance
the computer will bring. It will bring nothing. What it means is that the neocortex is finally
eating itself." They don't read and are poor communicators. All this time online is reflected in
the schools and universities where they perform badly on tests.
They're screenagers, Net addicted, losing their social skills, and they have no time for
sports or healthy activities. Time spent online could have been devoted to sports and
face-to-face conversation; the result is a generation of awkward, fat people. And when they get
addicted to video games, some say, the results can be worse. Mothers Against Videogame
Addiction and Violence (MAVAV) describes video games as "the world's fastest growing
addiction and the most reckless endangerment of children today, comparable to drug and
alcohol abuse."
They have no shame. "It is pretty routine these days for girls to post provocative pictures
of themselves online," warns M. Gigi Durham, the thoughtful author of The Lolita Effect.
Young people, unaware that it may come back to haunt them, merrily give out all sorts of
personal information online, whether it's to a college recruiter, a future employer, or to a
manipulative marketer, cyberbully, or predator.
Because their parents have coddled them, they are adrift in the world and afraid
to choose a path. That's why so many of them move home after college. They really can't
cope with the independence. According to William Damon, author of The Path to Purpose,
"Youth are so afraid of commitment that many of them may never marry, and they're so
uncertain about picking a career that they may wind up living at home forever."
They steal. They violate intellectual property rights, downloading music, swapping songs,
and sharing anything they can on peer-to-peer networks with no respect for the rights of
the creators or owners. "When you go online and download songs without permission, you
are stealing," the Recording Industry Association of America says on its Web site. The ease
with which the Net Gen uses the Internet has also made them masters of plagiarism.
They have no work ethic and will be bad employees. William Damon, in The Path to
Purpose, says that students today are drifting aimlessly, with no clue as to what they want
to do or become in the future. They are "slackers" who have a sense of entitlement, and as
they enter the workforce they are placing all kinds of unrealistic demands on employers for
everything from sophisticated technology to new approaches to management.
This is the latest narcissistic "me" generation. "They are far more narcissistic than
students were 25 years ago," says Jean Twenge, the professor who reviewed college
students' responses to the Narcissistic Personality Inventory between the early 1980s and
2006. "Current technology fuels an increase in narcissism," she said.
They don't give a damn. They have no values and they don't care about anyone else.
Their only interests are popular culture, celebrities, and their friends. They don't read
newspapers or watch television news. They get their news from The Daily Show with Jon
Stewart on Comedy Central. They don't vote and are not involved in civil society.

Peer Absorption
Mark Bauerlein, The Dumbest Generation (2008)
The enhanced connectivity, and the indulgence of teachers and journalists, feed yet
another adolescent vice that technophiles never mention: peer absorption. Educators
speak about the importance of role models and the career pressures facing kids, but in
truth, adolescents care a lot more about what other adolescents think than what their
elders think. Their egos are fragile, their beliefs in transition, their values uncertain.
They inhabit a rigorous world of consumerism and conformity, of rebellious poses and
withering group judgments. Boys struggle to acquire the courage and strength of
manhood, girls the poise and strength of womanhood. They tease one another
mercilessly, and a rejection can crush them. Life is a pinball game of polarized demands:
a part-time job that requires punctuality and diligence, pals who urge them to cut up
in class; a midterm forcing them to stay home and study, a friend who wants to catch a
horror flick. For many of them, good standing with classmates is the only way to secure
a safe identity, and so they spend hours on the channels of adolescent fare searching
out the latest in clothes, slang, music, sports, celebrities, school gossip, and one
another. Technology has made it fabulously easier.
Groupthink and the peer-to-group phenomenon
Tom Zeller, The New York Times (2010)
"What's hard to measure is the impact of groupthink, of group mentality, and the tendency of
what we might call the democratization of social interaction and how that changes this
generation's relationship with almost everything they come in contact with. With the technology,
the Internet, in terms of being able to facilitate the social networking, there's just so much
ability to quickly transfer information." He calls it the peer-to-group phenomenon: a digital-age
manifestation of the grapevine. "When someone wants to share it, forward it, record it, take a
picture of it, whatever the case may be, that puts it into a form of currency." "You've got a group
of kids who are unbelievably, incredibly loyal to each other," Dr. Levine said. "They are very
bound to ethics and values. But in a funny sort of way, it prevents some of them from developing
as individuals." Along with finding technological dexterity in this group, and a highly developed
ability to work in team settings, Dr. Levine said he had encountered concerns that some young
people lacked the ability to think and plan for the long term, that they withered without
immediate feedback, and that the machinery of groupthink had bred a generation flush with loyal
comrades but potentially weak on leaders.
The culture war: How new media keeps corrupting our children
Tom Standage, WIRED (2006)
US senator Charles Schumer says some videogames aimed at kids "desensitize them to death
and destruction." But dire pronouncements about new forms of entertainment are old hat. It
goes like this: Young people embrace an activity. Adults condemn it. The kids grow up, no better
or worse than their elders, and the moral panic subsides. Then the whole cycle starts over.
Here's how the establishment has greeted past scourges.
Videogames: "The disturbing material in Grand Theft Auto and other games like it is stealing the
innocence of our children and it's making the difficult job of being a parent even harder ... I
believe that the ability of our children to access pornographic and outrageously violent material
on video games rated for adults is spiraling out of control." - US senator Hillary Rodham Clinton,
2005
Rock and Roll: "The effect of rock and roll on young people is to turn them into devil
worshippers; to stimulate self-expression through sex; to provoke lawlessness; impair nervous
stability and destroy the sanctity of marriage. It is an evil influence on the youth of our country."
- Minister Albert Carter, 1956

Novels: "The free access which many young people have to romances, novels, and plays has
poisoned the mind and corrupted the morals of many a promising youth; and prevented others
from improving their minds in useful knowledge. Parents take care to feed their children with
wholesome diet; and yet how unconcerned about the provision for the mind, whether they are
furnished with salutary food, or with trash, chaff, or poison?" - Reverend Enos Hitchcock, Memoirs
of the Bloomsgrove Family, 1790
Movies: "This new form of entertainment has gone far to blast maidenhood ... Depraved adults
with candies and pennies beguile children with the inevitable result. The Society has prosecuted
many for leading girls astray through these picture shows, but GOD alone knows how many are
leading dissolute lives begun at the 'moving pictures.'" - The Annual Report of the New York
Society for the Prevention of Cruelty to Children, 1909
The Telephone: "Does the telephone make men more active or more lazy? Does [it] break up
home life and the old practice of visiting friends?" - Survey conducted by the Knights of
Columbus Adult Education Committee, San Francisco Bay Area, 1926
Comic Books: "Many adults think that the crimes described in comic books are so far removed
from the child's life that for children they are merely something imaginative or fantastic. But we
have found this to be a great error. Comic books and life are connected. A bank robbery is easily
translated into the rifling of a candy store. Delinquencies formerly restricted to adults are
increasingly committed by young people and children ... All child drug addicts, and all children
drawn into the narcotics traffic as messengers, with whom we have had contact, were inveterate
comic-book readers ... This kind of thing is not good mental nourishment for children!" - Fredric
Wertham, Seduction of the Innocent, 1954

1
With respect to identity formation: Apps can short-circuit identity formation, pushing you into
being someone else's avatar (that of your parents, your friends, or one formulated by some app
producer), or, by foregrounding various options, they can allow you to approach identity
formation more deliberately, holistically, thoughtfully. You may end up with a stronger and more
powerful identity, or you may succumb to a prepackaged identity or to endless role diffusion.
With respect to intimacy: Apps can facilitate superficial ties, discourage face-to-face
confrontations and interactions, suggest that all human relations can be classified if not
predetermined in advance, or they can expose you to a much wider world, provide novel ways of
relating to people, while not preventing you from shutting off the devices as warranted, and that
puts you in charge of the apps rather than vice versa. You may end up with deeper and
longer-lasting relations to others, or with a superficial stance better described as cool, isolated, or
transactional.
With respect to imagination: Apps can make you lazy, discourage the development of new skills,
limit you to mimicry or tiny trivial tweaks, or they can open up whole new worlds for
imagining, creating, producing, remixing, even forging new identities and enabling rich forms of
intimacy.
2
The rival brand of psychology, which came into prominence during Howard's own professional
lifetime, is called cognitivism or constructivism. On this view, skills and knowledge are
constructed on the basis of the individual's own active explorations of the environment. Rewards
supplied by others are fine, but the most important activities are ones that are intrinsically
rewarding, based on one's own discovered pleasures as one explores the world. Imitations and
modeling are downplayed, as are tests, less kindly termed "drill and kill." In sharp contrast,
constructivists call for rich and inviting problems and puzzles, which will engage curiosity and
catalyze extensive exploration, with, at most, the "guide on the side," rather than the "sage on
the stage." On the constructivist view, the best way to educate is to provide inviting materials
and get out of the way.
As for the probability of these various alternatives, heated debate already exists in the writings
of the digerati. On the one side we find unabashed enthusiasts of the digital world. In the view of
experts like danah boyd, Cathy Davidson, Henry Jenkins, Clay Shirky, and David Weinberger,
the digital media hold the promise of ushering in an age of unparalleled democratic participation,
mastery of diverse skills and areas of knowledge, and creative expression in various media,
singularly or orchestrally. As they see it, for perhaps the first time in human history, it is
possible for each of us to have access to the full range of information and opinions, to inform
ourselves, to make judicious decisions about our own lives, to form links with others who want
to achieve similar goals, be they political, economic, or cultural, and to benefit from the
enhanced intelligence and wisdom enabled by a vast multi-networked system. On this
perspective, a world replete with apps is a world in which endless options arise, with at least the
majority tilted in positive, world-building, personally fulfilling directions. It's a constructivist's
dream.
3
Others are less sanguine. Nicholas Carr claims that, with their speed and brevity, the digital media encourage superficial thinking, thereby thwarting the sustained reading and reflection enabled broadly by the Gutenberg era. Raising the stakes, Mark Bauerlein invokes the inflammatory epithet "the dumbest generation." Cass Sunstein fears that the digital media encourage us to consort with like-minded persons; far from exposing us to a range of opinions and broadening our horizons, the media enable, or, more perniciously, dictate, the creation of intellectual and artistic silos or echo chambers. Sherry Turkle worries about an increasing sense of isolation and the demise of open, exploratory conversations, while Jaron Lanier laments threats to our poetic, musical, and artistic souls. On this perspective, an app-filled world brings about dependence on the particulars of each currently popular app, and a general expectation that one's future, indeed the future itself, will be dictated by the technological options of the time. It's a constructivist's nightmare.
4
The situation could not be more different from that which obtains today. Howard has taught students intermittently in the 1960s and 1970s and regularly ever since. With every passing decade, it appears to Howard that students look increasingly to their teachers, and more broadly to their supervisors and their mentors, for the correct way, for what is wanted, for the route to an "A," to approval, to a positive letter of recommendation, smoothing the way to the next step on the ladder of success. There's more. Many students convey the impression that the authority figures know just what they want from their charges; that they could be straightforward and say what is wanted; and that they are being irresponsible, delinquent, unfair, and even unethical in withholding the recipe, the road map. The light-hearted version of this attitude is the all-too-familiar question, "Will this be on the exam?" The nuts-and-bolts version is, "Just tell us what you want and we will give it to you."
5
Back to our story about the generations, but with an unexpected twist. In mid-twentieth-century America, generations were routinely spoken of in terms of their defining political experiences or powerful cultural forces. Only in recent memory has the characterization of a generation taken on a distinctly technological flavor. In his studies of successive waves of college students, Arthur Levine (with colleagues) has discerned a revealing trend. Students in the latter decades of the twentieth century characterized themselves in terms of their common experiences vis-à-vis the Kennedy assassination, the Vietnam War, the Watergate burglary and investigation, the shuttle disaster, the attack on the Twin Towers in September 2001. But once the opening years of the twenty-first century had passed, political events increasingly took a back seat. Instead, young people spoke about the common experiences of their generation in terms of the Internet, the web, handheld devices, and smartphones, along with the social and cultural connections that they enabled, most prominently the social networking platform Facebook.
6
THE APPS ARRAYED ON a person's smartphone or tablet represent a fingerprint of sorts: only instead of a unique pattern of ridges, it's the combination of interests, habits, and social connections that identifies that person. A news app might be sandwiched between a fantasy sports app and a piano keyboard app, revealing multiple facets of one's identity. Because many of these apps provide access to various online communities, each facet allows the owner to find ready communion with similarly oriented people. Though the range of self-expression is great online, it's not unrestricted. For instance, expressions are limited to 140 characters on Twitter, whereas digitally manipulated photos are the coin of the realm on Instagram. The app identity, then, is multifaceted, highly personalized, outward-facing, and constrained by the programming decisions of the app designer. Just how are youth's identities shaped and expressed in the age of the app? Are they truly different or just superficially so?
7
Our focus group participants believe that the identities of today's App Generation are more externally oriented than the identities of predigital youth. For the affluent youth, their focus largely rests on presenting a polished, packaged self that will meet the approval of college admissions officers and prospective employers. They appear to regard themselves increasingly as objects that have quantifiable value to others: an SAT score, a GPA, a collection of varsity letters, trophies, community service certifications, or other awards. One religious leader echoed the sentiments of the other participants in his focus group when he said that, for many young people, "Who am I?" means "What am I going to produce?"

Accompanying this sensibility is a calculated effort to maximize one's value in order to achieve academic and professional success. One participant in a focus group said that when youth are asked what their hopes are, they give "pragmatic, achievable answers" situated in the present or near future, such as "a good job" or "a good relationship," more often than was the case with youth from earlier generations. During our conversation with the therapists, a participant declared that many of today's young people suffer from a "planning delusion": a (mistaken) faith that if they make careful, practical plans, they will face no future challenges or obstacles to success.
8
The pragmatic, careerist focus of today's college students occurs within the context of a broader societal trend toward individualism and away from a more community-minded, institutional orientation. In his landmark book Bowling Alone, political scientist Robert Putnam shows that Americans' participation in various civic institutions, such as bowling leagues, labor unions, and church organizations, has declined steadily across cohorts born after World War II. As these community ties loosen, they're replaced by a "moral freedom" that allows individuals to define for themselves the meaning of a virtuous life and doesn't require them to sacrifice their personal needs and desires in the process.
9
One psychologist expressed concern about young persons' constant self-projection and self-tracking online, which she says leaves them with little time for private contemplation or identity construction. She worries that, as a result, the prominence of their internal sense of self (in Riesman's term, "inner-directedness") is dwindling, perhaps to the point of nonexistence.
This lament about the lack of time for quiet reflection has become a common theme among academics and the popular press. Researchers have identified a number of benefits that accrue when a brain is at rest (relatively speaking) and focused inward. The downtime appears to play a restorative role, promoting feelings of well-being and, ultimately, helping individuals to focus their attention more effectively when it's needed. Daydreaming, wandering, and wondering have positive facets. Introspection may be particularly important for young people who are actively figuring out who and what they want to be. Without time and space to ponder alternative ways of being in the world, without breaking away from an app-determined life path, young persons risk prematurely foreclosing their identities, making it less likely that they will achieve a fully realized and personally fulfilling sense of self.
10
The question of the Internet's impact on self-focus has also become a popular focus among social scientists, who've generally observed a positive connection between narcissism and online behavior. For instance, one study found that people with high narcissism scores were more likely to post self-promoting content and engage in high levels of social activity on Facebook. Another found that college students with high narcissism scores were more likely to tweet about themselves. The authors of this study caution that while youth's online behavior may appear narcissistic to an outsider's eye, it's important to keep in mind that their primary motivation for going online may well be not to promote themselves but rather to maintain and nurture their social ties. (We'll examine the social dimension of youth's online lives in the next chapter.) Still, it's worth noting that about 30 to 40 percent of ordinary conversation consists of people talking about themselves, whereas around 80 percent of social media updates are self-focused. Also important is the fact that we can't determine in which direction the arrow of causality points. Does Internet use cause narcissism, or do narcissistic people use the Internet in distinctive ways?
11
Given the self-focus of narcissists, one might assume that they're self-assured and unaffected by the goings-on of others. This turns out not to be the case. As Sherry Turkle explains in her book Alone Together, "In the psychoanalytic tradition, one speaks about narcissism not to indicate people who love themselves, but a personality so fragile that it needs constant support." Instead of self-assuredness, then, narcissists tend more toward a fragile self that needs propping up by external reassurances. Jean Twenge's research bears this out. Along with rising levels of narcissism among youth, she finds increasing moodiness, restlessness, worry, sadness, and feelings of isolation. In sharp contrast to Riesman's inner-directed persons, today's young people are also more likely to feel that their lives are controlled by external social forces rather than growing out of an internal locus of control. Consistent with Twenge's findings, researchers at the University of California at Los Angeles found that the percentage of first-year college students who said that they frequently felt "overwhelmed by all I had to do" during their senior year of high school increased from 18 percent in 1985 to 30 percent in 2010.
12
Several of our participants identified a similar incongruity between youth's external polish and their internal insecurities. The camp directors we interviewed told us that campers today demonstrate more self-confidence in what they say they can do but are less willing to test their abilities through action. They attributed this shift to youth's growing distaste for taking any tangible risk that could end in failure: failure that once might have been witnessed by a few peers and then forgotten but today might become part of one's permanent digital footprint.
The themes of growing anxiety and aversion to risk surfaced in other focus groups. One therapist reflected that youth seem reluctant to engage in certain endeavors for fear of feeling anxious or depressed if they don't go as planned. Indeed, many participants agreed that young people's identities are defined by insecurity and disequilibrium. The religious leaders remarked that youth today are generally more fearful about their future. "Even the most confident Harvard grad," shared one participant, "is ... scared to death." The therapists observed that, to cope with this fear, many young persons display a notable lack of affect and an apparent goal to "feel nothing." Citing a word all too familiar to parents of today's teenagers, one participant called today's youth the "whatever" generation.
13
Turkle uses the metaphor of a tether to suggest that youth's constant connections to their digital devices and the people accessible through them weaken their ability to develop an autonomous sense of self. These technologies encourage youth to look outside themselves for reassurance, in matters both mundane and existential. Indeed, their thoughts and feelings don't seem real until confirmed by others. This argument is supported by empirical evidence showing that college students who use their digital devices to maintain frequent contact with their parents tend to be less autonomous. In the spirit of Turkle, some scholars have invoked the concept of psychasthenia in an effort to explain how people's online presence can weaken their sense of self to the point of full renunciation.
14
Still, many people, including youth, are optimistic about the Internet's power to expand our horizons and enrich our lives. In his book Here Comes Everybody, Clay Shirky suggests that the bowling leagues, lodges, and Rotary clubs of the fifties and sixties have not simply vanished; rather, they have been replaced by a far greater number of online communities representing a wider range of interests. No matter how obscure one's interest, it can find expression and validation online, whether it be down the street or halfway around the globe. For young people, this access to "digital alter egos" means that their identities as fangirls, gamers, chess players, or knitters don't have to be set aside to fit into a narrow peer culture.
15
What are teens saying through their apps, and to whom? As it turns out, a considerable portion of teens' computer-mediated communication is dedicated to making (and sometimes breaking) on-the-fly arrangements to meet up with their friends in person. In one of our studies, we asked teens what they would miss most about not having a cell phone. Sixteen-year-old Justin answered, "Just being able to make plans on the go, and stuff, because me and my friends, we don't really plan things. We just go out." The app mentality supports the belief that just as information, goods, and services are always and immediately accessible, so too are people. Scholars in the mobile communication field have dubbed such in-the-moment planning "microcoordination" and observe that it can slide into "hypercoordination" when teens start to feel left out of their social circles if separated from their mobile devices for any period of time.
16
In our focus groups, we learned that many of today's youth consider it less intrusive to send a text rather than call someone, and it's not uncommon for them to end relationships through text message or Facebook rather than in person. Similar to this is the phenomenon of text cancellations, which many of us apparently now rely on to break plans with others at the last minute. Turkle contends that this sort of arm's-length way of conducting relationships ultimately empties them of true intimacy. She warns: "There is the risk that we come to see others as objects to be accessed, and only for the parts we find useful, comforting, or amusing." This emptying out of intimacy is likely what one focus group participant had in mind when she observed tellingly: "Kids are more and more connected, but less and less really connected."
17
There may be another way in which new media technologies remove the vulnerability from our interpersonal relationships and distance us from each other. In a provocative and much debated op-ed, "How to Live without Irony," scholar Christy Wampole observes a strong ironic sensibility among today's generation of youth. In her rendition, young people wear Justin Bieber T-shirts ironically, watch Glee ironically, and give each other birthday gifts ironically. By bathing their actions and interactions in a wash of sarcasm, young people distance themselves both from their actions and from other people. According to Wampole, the Internet supports, indeed encourages, this ironic turn. Online, the actions of public figures are transformed instantly into derisive memes and circulated widely. The addition of a witty hashtag at the end of a tweet empties it instantly of any seriousness. This sensibility is reinforced nightly on TV, and subsequently posted, shared, and tweeted about online, by Jon Stewart and Stephen Colbert, who wryly ridicule newscasters, politicians, and other well-known personalities. By turning everything into a joke, youth risk nothing because they make nothing of themselves vulnerable. Yet vulnerability is precisely what's needed to connect with other people in an honest and meaningful way.
18
Another app, FaceTime (Apple's answer to Skype), can also be used to illustrate the ease of falling into a transactional rather than a transformational interpersonal exchange online. When Katie and Molly first talked remotely using FaceTime, the first thing Katie noticed was that genuine eye contact is impossible. If you want the other person to feel like you're looking them in the eyes, then you have to look into the camera, not their eyes. In other words, to create the illusion of eye contact one must actively avoid it. Something else that Katie noticed instantly was her own image in the corner of the screen. She found it hard not to glance over at it periodically, which turned her attention away from Molly and onto herself. Apparently, Molly was equally, if not more, enticed by the "Narcissus trap." In fact, at one point in their conversation, Katie was confused when she made a funny face but Molly didn't react in the slightest. When Katie called her on it, Molly admitted somewhat sheepishly that she'd been focusing on her own image and facial expressions instead of her sister's. Overall, Katie's experiences with FaceTime, Skype, and Google Hangouts have led her to conclude that, while it's great to be able to connect with others across distances, it's difficult, if not impossible, to achieve the level of deep, warm connection that face-to-face contact provides.
19
A similar dynamic may be playing out on the Internet. In The Filter Bubble, Eli Pariser explains how search engines and social network sites show us only what we want to see (or what they think we want to see). He uses Facebook's EdgeRank as one example of how this works. EdgeRank uses an algorithm to rank each user's friends list according to how much interaction the user has with each person on the list. EdgeRank then uses that ranking to structure the user's newsfeed so they see more from the friends at the top of the list. Google's search algorithm works in a similar way, such that two people conducting an identical Google search (whether it's "performing arts in Atlanta" or "2012 presidential election") will be shown a different set of results based on what Google knows about them (and drawing on previous search history, Gmail contacts and exchanges, and YouTube posts and viewing habits, Google knows a lot!). Pariser argues that such algorithms have a siloing effect, causing us to encounter only like-minded people and ideas online. It's difficult to empathize with perspectives that we never see.
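The feed-ranking logic Pariser describes can be sketched in a few lines. This is an illustrative toy, not Facebook's actual EdgeRank: the interaction kinds, weights, and recency decay below are assumptions invented to show the general idea of scoring each friend by past interaction and ordering the newsfeed accordingly.

```python
# Toy sketch of an EdgeRank-style feed ranking (NOT the real algorithm):
# friends earn an "affinity" score from past interactions, and the feed
# is ordered so higher-affinity friends' posts appear first.

def affinity(interactions):
    """Weighted sum of past interactions with one friend.
    `interactions` is a list of (kind, days_ago) pairs; the kinds and
    weights here are assumed values for illustration."""
    weights = {"comment": 4.0, "message": 6.0, "like": 1.0}
    # Divide by (1 + days_ago) so recent interactions count more.
    return sum(weights.get(kind, 0.5) / (1 + days_ago)
               for kind, days_ago in interactions)

def rank_feed(posts, history):
    """Order posts by the viewer's affinity for each author.
    `posts` is a list of (author, text); `history` maps author -> interactions."""
    return sorted(posts,
                  key=lambda post: affinity(history.get(post[0], [])),
                  reverse=True)

history = {"alice": [("comment", 1), ("message", 2)],  # frequent contact
           "bob": [("like", 30)]}                      # one old "like"
feed = rank_feed([("bob", "hi"), ("alice", "hello")], history)
# alice's post ranks first because the viewer interacts with her most
```

Even in this toy form, the siloing effect Pariser worries about is visible: whoever you already interact with gets amplified, and everyone else sinks toward the bottom of the feed.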
20

We find that digital media open up new avenues for youth to express themselves creatively. Remix, collage, video production, and music composition, to name just a few popular artistic genres of the day, are easier and cheaper for today's youth to pursue than were their predigital counterparts. It's also easier to find an audience for one's creative productions. The app metaphor serves us well here, since apps are easy to use, support diverse artistic genres, and encourage sharing among their users.
And yet, reflecting patterns we observed in youth's expressions of personal identity and experiences of intimacy, an app mentality can lead to an unwillingness to stretch beyond the functionality of the software and the packaged sources of inspiration that come with a Google search. We ask: Under what circumstances do apps enable imaginative expression? Under what circumstances do they foster a dependent or narrow-minded approach to creation?
21
To complicate matters further, we also spoke with art teachers (visual art, music, and performing arts) who'd been teaching for at least twenty years and therefore could reflect on changes they've observed in students' imaginative processes over time. Though these teachers celebrated the broad range of creative opportunities now open to today's youth (which we discuss in greater detail below), several arts educators observed that today's students have more difficulty in coming up with their own ideas; they're far more comfortable engaging with existing ones. One participant reflected: "Some of the most artistically skilled kids cannot come up with an idea. They've got full scholarships to Mass Art [Massachusetts College of Art and Design] and they can't come up with an idea. ... They go to their laptop first. ... I find that I'm constantly shoulder to shoulder asking what do you see? What does it mean? ... They're thinking too much or saying they have nothing." Moreover, when they do come up with their own ideas, they often have difficulty executing them, particularly in the absence of clear "executive assistants." Said another participant, "Before, they used to jump in and see where the materials would take them; now they ask what to do."
23
In his book You Are Not a Gadget, computer scientist and cultural critic Jaron Lanier bemoans the effects of remix on individual creativity: "Pop culture has entered into a nostalgic malaise. Online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media. It is a culture of reaction without action."
24
In his book Cognitive Surplus, Clay Shirky celebrates digital media's ability to connect people easily, quickly, and cheaply. Drawing on examples like the Impressionist painters, who lived and worked together in southern France, Shirky argues that collaboration is a central component of creativity. Where collaboration is supported and encouraged, as it surely is online, creativity will thrive.
25
As we've discussed in this chapter, however, the act of creation is circumscribed by the app's underlying code and the developer who wrote it; to paraphrase Lawrence Lessig, the code determines the creation. A specific hue of green may not be included in one's painting app; the piccolo might be missing from the music app. Users have little choice but to work within these limitations. The avenues to artistic expression may be many in the app era, but they're often tightly bounded.
26
"Civilization advances by extending the
number of important operations which we can
perform without thinking about them."
Alfred North Whitehead
At the head of the chapter, we've affixed a statement by the philosopher Alfred North Whitehead. Though it may be well known among the digerati (Howard first heard it quoted by a leading technologist), we had not encountered it until we were putting the finishing touches on this book. At first blush, the statement sounds just right. One finds oneself nodding in agreement: yes, we value those inventions that allow us to make habitual those thoughts and actions that could consume much time and effort. And indeed, we can think of many manmade devices (ranging from the creation of script to the invention of the credit card) that have allowed us to simplify formerly complex operations and to move on to other things. Is civilization even imaginable without a multitude of labor-saving devices that free our hands and minds? Thank goodness for the "flywheel of civilization"!
Yet on reflection, Whitehead's statement seems increasingly dual-edged to us. For sure, most of us would like to automatize as much as possible; our psychological antagonists, behaviorists and constructivists, would agree. But do we want to automatize everything? And who decides what is important? And where do we draw the line between an operation and the content on which the operation is carried out?
27
Of course, other potent factors are at work. In this book, we have not spoken much about the ambition and reach of vast multinational corporations or of totalitarian states. For every major medium of communication that began as the product of human imagination, one can tell a story of how megacorporations eventually came to dominate the media and to determine how human beings interacted with them. Google, Apple, Amazon, and their less prominent peers have tremendous power and access to data of a size and scale that not even the most imaginative science fiction writers, H. G. Wells or Jules Verne, could have anticipated a century ago. It would be a brave person who would predict that the fate of corporation-devised and -sold apps would be different; and it would be a naive person who would simply assume that such power will inevitably be dedicated to benign uses.
28
We must also acknowledge the possibility of powers even greater than those associated with megacorporations and powerful political entities. As we come to understand better our genetic and neurological nature, there will be attempts to reconfigure our species, more or less aggressively, and to usher in a so-called singularity, in which the lines between computer and brain, machine and human, mortality and immortality become blurred or blended or disappear altogether. As more than one wag has put it, "The question is no longer, 'Are computers like us?' but rather, 'Are we like computers?'" To the extent that these impulses are realized, human tendencies to resist or transcend apps will evaporate. Just as surely as the reach of Big Brother in 1984 or the programming of Alex's brain in A Clockwork Orange, apps will come to control our lives.
29
With essayist Christine Rosen, we worry about the "ultimate efficiency: having one's needs and desires foreseen and the vicissitudes of future possible experiences controlled." With poet Allen Tate, we spurn a world in which "we no longer ask 'is it right,' we ask 'does it work?'"
As authors, we get the privilege of last words. For ourselves, and for those who come after us as well, we desire a world where all human beings have a chance to create their own answers, indeed to raise their own questions, and to approach them in ways that are their own.

In-class resources for Tue Dec 8 (class 21)


The App Generation
Digerati video and reading
(discussion leader: x)
Digerati
Katie Davis
@katiebda
BHSEC Q guest author (Spring 2015)
http://katiedavisresearch.com
Video
We will watch and discuss this in class.
Katie Davis on UW360 (4:00)
https://www.youtube.com/watch?v=lv1efiDyd-Y
Reading
We will read and discuss this in class.
The App Generation: How Today's Youth Navigate Identity, Intimacy, and Imagination in a Digital World
(Katie Davis and Howard Gardner, 2013)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
Digital Natives
Katie Davis

Homework for Thu Dec 10 (class 22)


How does the internet affect parenting?
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: #digitalvertigo (Andrew Keen, 2012)
Video
Watch the assigned videos on Zaption and post a short response to each.
Louis C.K. (4:20)
http://zapt.io/tg6unf3n
Troubled Teen (5:00)
http://zapt.io/thzjeapt
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
How does the internet affect parenting?
link

Use as prompt:
"Always on" is exhausting
Douglas Rushkoff, Program or Be Programmed: Ten Commands for the Digital Age (2010)
As Internet connections grow faster, fatter, and freer, however, we are more likely to adopt an "always on" approach to media. Our broadband connections, whether in our homes or in our phones, keep our applications on, updating, and ready at every moment. Anytime anyone or anything wants to message, email, tweet, update, notify, or alert us, something dings on our desktop or vibrates in our pocket. Our devices and, by extension, our nervous systems are now attached to the entire online universe, all the time. Is that my phone vibrating? We scramble to keep up with the never-ending inflow of demands and commands, under the false premise that moving faster will allow us to get out from under the endless stream of pings for our attention. For answering email and responding to texts or tweets only exacerbates the problem by leading to more responses to our responses, and so on. We strive to multitask, attempting to give partial attention to more than one thing at a time, when all we really do is move as quickly as possible from one task to another. No matter how proficient we think we are at multitasking, studies show our ability to accomplish tasks accurately and completely only diminishes the more we try to do at the same time. This is not the fault of digital technology, but the way we use it. The results aren't pretty. Instead of becoming empowered and aware, we become frazzled and exhausted. We have no time to make considered responses, feeling instead obligated to reply to every incoming message on impulse. We reduce the length and complexity of our responses from paragraphs to sentences to "txts," making almost everything we transmit sound like orders barked over a walkie-talkie in a war zone. Everything must happen right away or, better, now. There is no later.
Technology and the overworked American
David Shenk, Data Smog: Surviving the Information Glut (1997)
Harvard economist Juliet Schor reports that technology was predicted to save us from excessively long hours. But, as Schor points out in her book The Overworked American, we are working 164 hours more per year than we did 20 years ago. "Technology," reports Schor, "reduces the amount of time it takes to do any one task but also leads to the expansion of tasks that people are expected to do. It's what happens to people when they get computers and faxes and cellular telephones and all of the new technologies that are coming out today."

Excerpts from #digitalvertigo by Andrew Keen (2012)


1
@alexia: We would have lived our lives differently if we had known they would one day be
searchable.
2
Rather than a virtual or second life, social media is actually becoming life itself: the central and increasingly transparent stage of human existence, what Silicon Valley venture capitalists are now calling an "internet of people." As the fictionalized version of Facebook president Sean Parker, played with such panache by Justin Timberlake, predicted in the 2010 Oscar-nominated movie The Social Network: "We lived in farms, then we lived in cities, and now we're gonna live on the Internet!" Social media is, thus, like home; it is the architecture in which we now live.
3
Nicholas Carr, one of today's most articulate critics of digital utilitarianism, argues: "What we don't share is as important as what we do share."
4
In 1787, at the dawn of the mass industrial age, Jeremy Bentham designed what he called a "simple idea in architecture" to improve the management of prisons, hospitals, schools, and factories. Bentham's idea was, as the architectural historian Robin Evans noted, a vividly imaginative synthesis of architectural form with social purpose. Bentham, who amassed great personal wealth as a result of his social vision, wanted to change the world through this new architecture. Bentham sketched out this vision of what Aldous Huxley described as "a plan for a totalitarian housing project" in a series of open letters written from the little Crimean town of Krichev, where he and his brother, Samuel, were instructing the regime of the enlightened Russian despot Catherine the Great about the building of efficient factories for its unruly population. In these public letters, Bentham imagined what he called this "Panopticon" or "Inspection-House" as a physical network, a circular building of small rooms, each transparent and fully connected, in which individuals could be watched over by an all-seeing inspector. This inspector is the utilitarian version of an omniscient god: always-on, all-knowing, with the serendipitous ability to look around corners and see through walls. As the French historian Michel Foucault observed, this Inspection-House was like "so many cages, so many small theaters, in which each actor is alone, perfectly individualized and constantly visible." The Panopticon's connective technology would bring us together by separating us, Bentham calculated. Transforming us into fully transparent exhibits would be good for both society and the individual, he adduced, because the more we imagined we were being watched, the more efficient and disciplined we would each become.
5
(The internet) is finally realizing (Jeremy Bentham's) utilitarian dream of allowing us to be perpetually observed. This digital architecture, described by New York University social media scholar Clay Shirky as the "connective tissue of society" and by U.S. Secretary of State Hillary Clinton as "the new nervous system of the planet," has been designed to transform us into exhibitionists, forever on show in our networked crystal palaces. And, today, in an age of radically transparent online communities like Twitter and Facebook, the social has become, in Shirky's words, "the default setting" on the Internet, transforming digital technology from being a tool of second life into an increasingly central part of real life... As WikiLeaks founder and self-appointed transparency tsar Julian Assange said, today's Internet is "the greatest spying machine the world has ever seen," with Facebook, he added, being "the world's most comprehensive database about people, their relationships, their names, their addresses, their locations, their communications with each other, and their relatives, all sitting within the United States, all accessible to US Intelligence." But it's not just Facebook that is establishing this master database of the human race. As Clay Shirky notes, popular geo-location services such as foursquare, Facebook Places, Google Latitude, Plancast and the Hotlist, which enable us to effectively see through walls and know the exact location of all our friends, are making society more "legible," thus allowing all of us to be read, in good Inspection-House fashion, like a book. No wonder, then, that Katie Roiphe, a New York University colleague of Shirky, has observed that "Facebook is the novel we are all writing."

6
This contemporary mania with our own self-expression is what two leading American psychologists, Dr. Jean Twenge and Dr. Keith Campbell, have described as "the narcissism epidemic," a self-promotional madness driven, these two psychologists say, by our need to continually manufacture our own fame to the world. The Silicon Valley-based psychiatrist Dr. Elias Aboujaoude, whose 2011 book, Virtually You, charts the rise of what he calls the self-absorbed online Narcissus, shares Twenge and Campbell's pessimism. The Internet, Dr. Aboujaoude notes, gives narcissists the opportunity "to fall in love with themselves all over again," thereby creating an online world of infinite self-promotion and shallow web relationships. Many other writers share Aboujaoude's concerns. The cultural historian Neal Gabler says that we have all become "information narcissists" utterly disinterested in anything outside ourselves. Social network culture medicates our need for self-esteem, adds best-selling author Neil Strauss, by pandering to win followers.
7
Twenge, Campbell, Aboujaoude, Strauss and Franzen are all correct about this endless loop of great exhibitionism, an attention economy that, not coincidentally, combines a libertarian insistence on unrestrained individual freedom with the cult of the social. It's a public exhibition of self-love displayed in an online looking glass that New Atlantis senior editor Christine Rosen identifies as "the new narcissism" and New York Times columnist Ross Douthat calls a "desperate adolescent narcissism." Everything, from communications, commerce and culture to gaming, government and gambling, is going social. As David Brooks, Douthat's colleague at The Times, adds, achievement is redefined as "the ability to attract attention." All we, as individuals, want to do on the network, it seems, is share our reputations, our travel itineraries, our war plans, our professional credentials, our illnesses, our confessions, photographs of our latest meal, our sexual habits, of course, even our exact whereabouts with our thousands of online friends.
8
Zuckerberg's five-year plan is to eliminate loneliness. He wants to create a world in which we will never have to be alone again because we will always be connected to our online friends in everything we do, spewing huge amounts of our own personal data as we do it. Facebook "wants to populate the wilderness, tame the howling mob and turn the lonely, antisocial world of random chance into a friendly world, a serendipitous world," wrote Time's Lev Grossman.
9
Facebook, with its members investing over 700 billion minutes of their time per month on the network, was the world's most visited Web site in 2010, making up 9 percent of all online traffic. By early 2011, 57 percent of all online Americans were logging onto Facebook at least once a day, with 51 percent of all Americans over twelve years old having an account on the social network and 38 percent of all the Internet's sharing referral traffic emanating from Zuckerberg's creation. By September 2011, more than 500 million people were logging onto Facebook each day, with its then almost 800 million active users being larger than the entire Internet was in 2004. Facebook is becoming mankind's own image.
10
Whether we like it or not, twenty-first-century life is increasingly being lived in public. Four out of five college admissions offices, for example, are looking up applicants' Facebook profiles before making a decision on whether to accept them. A February 2011 human resources survey suggested that almost half of HR managers believed it was likely that our social network profiles are replacing our resumes as the core way for potential employers to evaluate us. The New York Times reports that some firms have even begun using surveillance services like Social Intelligence, which can legally store data for up to seven years, to collect social media information about prospective employees before giving them jobs. "In today's executive search market, if you're not on LinkedIn, you don't exist," one job search expert told The Wall Street Journal in June 2011. LinkedIn now even enables its users to submit their profiles as resumes, thus inspiring one personal branding guru to announce that the 100-million-member professional network is about to put "Job Boards (and Resumes) out of business."

11
Writing in 1948, Orwell imagined... "In principle a Party member had no spare time, and was never alone except in bed," Orwell wrote in Nineteen Eighty-Four. "It was assumed that when he was not working, eating, or sleeping he would be taking part in some kind of communal recreation: to do anything that suggested a taste for solitude, even to go for a walk by yourself, was always slightly dangerous. There was a neologism for it in Newspeak: Ownlife, it was called, meaning individualism and eccentricity." And there was another neologism in Newspeak: "facecrime," Orwell coined it. "It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen," he wrote. "The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself, anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence. There was even a word for it in Newspeak: facecrime, it was called." Newspeak's facecrime has been turned on its head in our world of endless tweets, check-ins and status updates. In Nineteen Eighty-Four, it was a crime to express yourself; today, it is becoming unfashionable, perhaps even socially unacceptable, not to express oneself.
12
As best-selling digital evangelists Don Tapscott and Anthony D. Williams argue in their 2010 book MacroWikinomics, today's Internet represents "a turning point in history." We are entering what they call "the age of networked intelligence," a "titanic" historic shift, they pronounce, equivalent to the birth of the modern nation-state or the Renaissance. Mark Pincus's always-on social dial tone, Tapscott and Williams argue, represents a platform for networking human minds that will enable us to collaborate and to learn collectively. Echoing Mark Zuckerberg's five-year vision of social media's revolutionary impact on the broader economy, they predict that politics, education, energy, banking, healthcare and corporate life will all be transformed by what social utopians embrace as the openness and sharing of the networked intelligence age.
13
Botsman, Rogers, Tapscott, Williams and the rest of the social media quixotics are wrong that the Internet is resulting in a new age of networked intelligence. In fact, the reverse may well be true. From Zuckerberg's Facebook, Hoffman's LinkedIn and Stone's Twitter to SocialEyes, SocialCam, foursquare, ImageSocial, Instagram, LivingSocial and the myriad of other digital drivers of John Doerr's third great wave, the network is creating more social conformity and herd behavior. "Men aren't sheep," argued John Stuart Mill, the nineteenth century's greatest critic of Benthamite utilitarianism, in his 1859 defense of individual freedom, On Liberty. Yet on the social network, we seem to be thinking and behaving more and more like sheep, making what cultural critic Neil Strauss describes as "the need to belong," rather than genuine nonconformity, the rule. While the Web has enabled new forms of collective action, it has also enabled new kinds of collective stupidity, argues Jonah Lehrer, a contributing editor to Wired magazine and a best-selling writer on both neuroscience and psychology. Groupthink is now more widespread, as we cope with the excess of available information by outsourcing our beliefs to celebrities, pundits and Facebook friends.
14
The more friends you have on Twitter or Facebook, therefore, the more potentially valuable you become in terms of getting your friends to buy or do things. We "manage" our friends in the social networking world in the same way as we manage our assets in the financial marketplace. "There is something Orwellian about the management-speak on social networking sites," notes the ever-perceptive Christine Rosen, who adds that such terminology encourages "the bureaucratization of friendship."
15
In our digital age, we are, ironically, becoming more divided than united, more unequal than equal, more anxious than happy, lonelier rather than more socially connected. A November 2009 Pew Research report about "Social Isolation and New Technology," for example, found that members of networks like Facebook, Twitter, MySpace and LinkedIn are 26 percent less likely to spend time with their neighbors (thus, ironically, creating the need for social networks like Nextdoor.com and Yatown that connect local communities). A 2007 Brigham Young University research study, which analysed 184 social media users, concluded that the heaviest networkers "feel less socially involved with the community around them." Meanwhile, a meta-analysis of seventy-two separate studies conducted between 1979 and 2009 by the University of Michigan's Institute for Social Research showed that contemporary American college students are 40 percent less empathetic than their counterparts in the 1980s and 1990s. Even our tweets are becoming sadder, with a study made by scientists from the University of Vermont of 63 million Twitter users between 2009 and 2011 proving that "happiness is going downhill." Most troubling of all, a fifteen-year study of 300 social media subjects by Professor Sherry Turkle, the director of MIT's Initiative on Technology and Self, showed that perpetual networking activity is actually undermining many parents' relationships with their children. "Technology proposes itself as the architect of our intimacies," Turkle says about the digital architecture in which we are now all living. But the truth, her decade and a half of research reveals, is quite the reverse. Technology, she finds, has become our "phantom limb," particularly for young people who, Turkle finds, are sending up to 6,000 social media announcements a day and who have neither written nor received a handwritten letter. No wonder, then, that teens have not only stopped using email, but also no longer use the telephone; both are too intimate, too private for a digital generation that uses texting as a protection for their feelings.
16
In describing what she calls "the practice of the protean self," MIT's Turkle argues that we have moved from multitasking to "multi-lifing." But while we are forever cultivating our collaborative self, she argues, what is being lost is our experience of being alone and privately reflecting on our emotions. The end result, Turkle explains, is a perpetual juvenile, somebody she calls a "tethered child," the type of person who, like one of Turkle's subjects in her study, believes that "if Facebook were deleted, I'd be deleted too."
17
So what is the real value of social media in repressive regimes? "Twitter is a wonderful tool for secret policemen to find revolutionaries," Friedman told me. His analysis reflects the so-called Morozov Principle of Stanford University scholar Evgeny Morozov, whose 2010 book, The Net Delusion: The Dark Side of Internet Freedom, argues that social media tools are being used by secret policemen in undemocratic states like Iran, Syria, and China to spy on dissidents. As Morozov told me when he appeared on my TechCrunch TV show in January 2011, these authoritarian governments are using the Internet in classic Benthamite fashion, relying on social networks to monitor the behavior, activities and thoughts of their own citizens. In China, Thailand, and Iran, therefore, the use of Facebook can literally be a facecrime, and the Internet's architecture has become a vast Inspection-House, a wonderful tool for secret policemen who no longer even need to leave their desks to persecute their own people.
18
Not only is social media being used by repressive regimes or organizations to strengthen their hold on power, but it is also compounding the ever-widening inequalities between the influencers and the new digital masses. If identity is the new currency and reputation the new wealth of the social media age, then today's hypervisible digital elite is becoming a tinier and tinier proportion of the population... On Twitter, for example, only 0.05 percent of people have more than 10,000 followers, with 22.5 percent of users accounting for 90 percent of activity, thus reflecting the increasingly unequal power structure of an attention economy in which the most valuable currency is being heard above the noise. "Monopolies are actually even more likely in highly networked markets like the online world," wrote Wired editor-in-chief Chris Anderson... The inequalities between rich and poor nodes are even more exaggerated in the wake of 2009's Great Recession. "The people who use these [social media] tools are the ones with higher education, not the tens of millions whose position in today's world has eroded so sharply," notes Time magazine business columnist Zachary Karabell. "Social media contribute to economic bifurcation. The irony is that social media widen the social divide, making it even harder for the have-nots to navigate. They allow those with jobs to do them more effectively and companies that are profiting to profit more. But so far, they have done little to aid those who are being left behind. They are, in short, business as usual."

19
The problem is that our ubiquitous online culture of "free" means that every social media company, from Facebook to Twitter to geolocation services like foursquare, Hitlist, and Plancast, relies exclusively on advertising for its revenue. And it's information about us, James Gleick's "vital principle," that is driving this advertising economy. As MoveOn.org president Eli Pariser, another sceptic concerned about the real cost of all these "free" services, argues in his 2011 book The Filter Bubble, the race to know as much as possible about you has become the central battle of the era for Internet giants like Google, Facebook, Apple and Microsoft.
20
"It is fundamentally impossible for a digital advertising business to care deeply about privacy, because the user is the only asset it has to sell. Even if the founders and executives want to care about privacy, at the end of the day, they can't: the economic incentives going the other direction are just too powerful," Michael Fertik, the Silicon Valley-based CEO of Reputation.com, a company dedicated to protecting our online privacy, told me. Fertik's argument is reiterated by the media theorist and CNN columnist Douglas Rushkoff, who explains that rather than being Facebook's customers, "we are the product."
21
The "Bowling Alone" syndrome: a reference to the communitarian theories of Harvard University sociologist Robert Putnam, whose highly influential and best-selling Bowling Alone regards the digital network as the solution to what he considers the crisis of local community. Writing in 2000, only a couple of years after @quixotic created the first social media business, Putnam sees electronic media as the twenty-first-century means of reinventing community engagement. "Let us find ways to ensure that by 2010 Americans will spend less leisure time sitting passively alone in front of glowing screens and more time in active connection with our fellow citizens," he argued with communitarian fervor. "Let us foster new forms of electronic entertainment and communication that reinforce community engagement rather than forestalling it."
22
This intellectual obsession with the social, an obsession with sharing (what today, as the arc of information flow bends toward ever greater connectivity, is fashionably called a "meme" but is, in many ways, a virus), can be seen across many different academic disciplines. The concepts of togetherness and sharing have acquired such religious significance that, in stark contrast with the research of Oxford University's Baroness Susan Greenfield, some scientists are now discovering its centrality in the genetic make-up of the human condition. One "neuroeconomist," a certain Dr. Paul Zak from the California Institute of Technology, has supposedly found that social networking activates the release of a "generosity-trust chemical" in our brains. Larry Swanson and Richard Thompson from the University of Southern California are even discovering that the brain resembles an interconnected community, thereby triggering the ridiculous headline: "Brain works more like internet than top-down company."
23
"The future is already here," William Gibson observed in 1993, "it's just unevenly distributed." One version of the future, at least our social future, may have arrived, a handful of years after Gibson first made this prescient remark, at the very end of the twentieth century.
24
This future is called a Super Sad True Love Story. It is imagined by satirist Gary Shteyngart, the author of a creepy 2010 novel about a dystopian future in which we all own a chic little device called an Apparat that quantifies and ranks the massive amounts of personal data being generated by our real identities. Shteyngart explains his data dystopia in which we all live in public: "Everyone has this device called the Apparat, which they wear either tucked into their pocket or usually as a pendant. The moment you enter a room everyone judges you. So it has what's called Rate Me Plus technology. So you're rated immediately. Everyone can chip in and rate everyone else, and everyone does." When he appeared on my TechCrunch TV show in July 2011, Shteyngart described this world as "William Gibson land." It's a place where our personalities are quantified in universally accessible, real-time lists akin to Internet reputation networks like Hashable or Kred. Mystery, privacy and secrecy will have all been eliminated in this transparent marketplace. Today's reputation stock market Empire Avenue will have replaced Wall Street as the key exchange of value. It will be a pure reputation economy, a marketplace of mirrors, a perfect data market in how others see us.
25
As John Stuart Mill argues in On Liberty, government exists to protect us from others rather than from ourselves, and the reality, for better or worse, is that once a photo, an update or a tweet is publicly published on the network, it becomes de facto public property. So, without wishing to sound too much like the uber-glib Eric Schmidt, the only way to really protect one's own privacy is by not publishing anything in the first place.
26
The European Union has been much more aggressive than the United States government in pushing for privacy rights over social networks. On the all-important issue of online tracking by social media companies, for example, European privacy regulators have been pushing to establish an arrangement in which consumers could only be tracked if they actively opt in and permit marketers to collect their personal data. Europeans have also been more aggressive in pushing back against the leading Web 3.0 companies. In April 2011, for example, the Dutch government threatened Google with fines of up to $1.4 million if it continued to ignore data-protection demands associated with its Street View technology. Apple and Google face much tighter regulation in Europe, with the EU classifying the location information that they have been collecting from their smartphones as personal data. European Union data protection regulators have aggressively scrutinized Facebook's May 2011 rollout of its facial recognition software that reveals people's identities without their permission... EU justice commissioner Viviane Reding even wants social networks to establish a "right to be forgotten" option that would allow users to destroy data already published on the network. "I want to explicitly clarify that people shall have the right, and not only the possibility, to withdraw their consent to data processing," Reding told the EU parliament in March 2011.
27
According to the executive editor of The New York Times, friendship has become a kind of drug on the Internet, the crack cocaine of our digital age. "Last week, my wife and I told our 13-year-old daughter she could join Facebook," confessed The New York Times' Bill Keller in May 2011. "Within a few hours she had accumulated 171 friends, and I felt a little as if I had passed my child a pipe of crystal meth." A June 2011 Pew Research Center study of over two thousand Americans reported that electronically networked people like Keller's daughter saw themselves as having more close friends than those of us (those "weirdo outcasts," according to one particularly vapid social media commentator) who aren't on Facebook or Twitter. The Pew report found that the typical Facebook user has 229 friends (including an average of 7 percent that they hadn't actually met) on Mark Zuckerberg's network and has more close relationships than the average American. But this June 2011 Pew study made no attempt to define or calibrate the idea of friendship, treating each one quantitatively, like a notch on a bedpost, and presenting Facebook and Twitter as, quite literally, the architects of our intimacies. What this survey failed to acknowledge is that human beings aren't simply computers, silicon-powered devices with infinitely expandable hard drives and memories, who can make more friends as a result of becoming more and more networked. So how many real friends should we have? And is there a ceiling to the number of friendships that we actually can have?
28
A couple of miles north of the Oxford Mal hotel sits the gray-bricked home of Oxford University's Institute of Cognitive and Evolutionary Anthropology. It is here, in the nondescript academic setting of a north Oxford suburb, that we find a man who has determined how many friends we really need. Professor Robin Dunbar, the director of this institute, is an anthropologist, evolutionary psychologist and authority on the behavior of primates, the biological order that includes monkeys, apes and humans. And he has become a social media theorist too, best known for formulating a theory of friendship dubbed "Dunbar's Number." "The big social revolution in the last few years has not been some great political event, but the way our social world has been redefined by social networking sites like Facebook, MySpace and Bebo," Dunbar says of his eponymous number. This social revolution, he says, attempts to break through the constraints of time and geography to enable uber-connected primates like @scobleizer to establish online friendships with tens of thousands of other wired primates. So why do primates have such big brains? Dunbar asks, rhetorically. Their large brains, he says, borrowing from a theory known as the Machiavellian intelligence hypothesis, are the result of the complex social world in which primates live. It's the complexity of their social relations, defined by their tangled and interdependent personal intimacies, Dunbar argues, that distinguishes primates from every other animal. And as the most successful and widely distributed member of the primate order, he goes on, human brains have evolved most fully of all because of the intricate complexity of our intense social bonds. Memory and forgetting are the keys to Dunbar's theory about human sociability. You'll remember that The New York Times' Paul Sullivan suggested that the Internet is like an elephant because it never forgets. But what really distinguishes animals like elephants from primates, Robin Dunbar explains, is that primates use their knowledge about the social world in which they live to form more complex alliances with each other than other animals do. Thus primates have a lot more to remember about their social intimacies than elephants, which may be one reason why humans forget things and elephants supposedly don't. For better or worse, nature hasn't come up with a version of Moore's Law that could double the size and memory capacity of our brain every two years. Thus, while our big brains are the result of our complex social relationships, they are still confined by their limited memories. And it's our biological inability to remember the intricate social details of large communities, Robin Dunbar explains, that limits our ability to make intimate friendships. We can only remember 150 individuals, Dunbar says, or only keep track of all the relationships involved in a community of 150. That is Dunbar's Number: the optimal social circle for which we, as a species, are wired.
29
As I sat upstairs in The Jeremy Bentham nursing my beer and thinking about John Stuart Mill, what struck me is how acutely relevant On Liberty is today, in an age also being revolutionized by a pervasive connective technology. This is a world, according to Mark Zuckerberg, in which education, commerce, health and finance are all becoming social. It's a connected world defined by billions of smart devices, by real-time lynch mobs, by tens of thousands of people broadcasting details of a stranger's sex life, by the bureaucratization of friendship, by the groupthink of small brothers, by the elimination of loneliness, and by the transformation of life itself into a voluntary Truman Show. Most of all, it's a world in which many of us have forgotten what it means to be human. "But here I fear I am becoming nostalgic," writes the novelist Zadie Smith, who along with Jonathan Franzen and Gary Shteyngart is amongst the most articulate contemporary critics of social media. "I am dreaming of a Web that caters to a person who no longer exists. A private person, a person who is a mystery, to the world and, which is more important, to herself. Person as mystery: This idea of personhood is certainly changing, perhaps has already changed."

In-class resources for Thu Dec 10 (class 22)


How does the internet affect parenting?
Digerati video and reading:
(discussion leader: x)
Digerati
danah boyd
@zephoria
http://www.danah.org
Video
We will watch and discuss this in class.
danah boyd on PBS Frontline (2:02)
http://www.pbs.org/wgbh/pages/frontline/digitalnation/relationships/identity/famous-for-15-minutes.html?play

Reading
We will read and discuss this in class.
It's Complicated: The Social Lives of Networked Teens
(danah boyd, 2014)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
What is social media's effect on parenting?
danah boyd

It's Complicated: The Social Lives of Networked Teens


Defining new norms
Christine Rosen, The New Atlantis (2007)
Enthusiasts of social networking argue that these sites are not merely entertaining; they also edify by teaching users about the rules of social space. As Danah Boyd, a graduate student studying social networks at the University of California, Berkeley, told the authors of MySpace Unraveled, social networking promotes "informal learning.... It's where you learn social norms, rules, how to interact with others, narrative, personal and group history, and media literacy." New technologies are creating new norms (cell phones ring during church sermons; blaring televisions in doctors' waiting rooms make it difficult to talk quietly) and new norms must develop to replace the old. What cues are young, avid social networkers learning about social space? What unspoken rules and communal norms have the millions of participants in these online social networks internalized, and how have these new norms influenced their behavior in the offline world?

Parasocial Relationships
Clive Thompson, The New York Times (2008)
It is also possible, though, that this profusion of weak ties can become a problem. If you're reading daily updates from hundreds of people about whom they're dating and whether they're happy, it might, some critics worry, spread your emotional energy too thin, leaving less for true intimate relationships. Psychologists have long known that people can engage in "parasocial" relationships with fictional characters, like those on TV shows or in books, or with remote celebrities we read about in magazines. Parasocial relationships can use up some of the emotional space in our Dunbar number, crowding out real-life people. Danah Boyd, a fellow at Harvard's Berkman Center for Internet and Society who has studied social media for 10 years, published a paper this spring arguing that awareness tools like News Feed might be creating a whole new class of relationships that are nearly parasocial: peripheral people in our network whose intimate details we follow closely online, even while they, like Angelina Jolie, are basically unaware we exist.
The filter bubble diet
Eli Pariser, The Filter Bubble: What the Internet is Hiding From You (2011)
One of the best ways to understand how filters shape our individual experience is to think in terms of our information diet. As sociologist danah boyd said in a speech at the 2009 Web 2.0 Expo: "Our bodies are programmed to consume fat and sugars because they're rare in nature.... In the same way, we're biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual and that gossip which is humiliating, embarrassing, or offensive. If we're not careful, we're going to develop the psychological equivalent of obesity. We'll find ourselves consuming content that is least beneficial for ourselves or society as a whole." Just as the factory farming system that produces and delivers our food shapes what we eat, the dynamics of our media shape what information we consume. Now we're quickly shifting toward a regimen chock-full of personally relevant information. And while that can be helpful, too much of a good thing can also cause real problems. Left to their own devices, personalization filters serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.

Sunday Story #12


Due Sun Dec 13 (before midnight)
Prompt
parenting video with parents

Homework for Tue Dec 15 (class 23)


Is privacy dead?
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Excerpts: Is privacy dead?
Video
Watch the assigned video on Vimeo.
Terms and Conditions May Apply (Vimeo VOD; sign in: bhsecbooks@gmail.com / cutandpaste)
Watch at least 10 minutes
https://vimeo.com/ondemand/tacma/75993946
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
How should we balance between surveillance, privacy and free speech?
link

The Big Debate


James Gleick, What Just Happened: A Chronicle from the Information Frontier (2003)
"There's a very important and long-term debate taking place right now about technologies of privacy in the next century," says Marc Rotenberg, director of the Electronic Privacy Information Center in Washington. "Privacy will be to the information economy of the next century what consumer protection and environmental concerns have been to the industrial society of the 20th century."
The Panopticon
Siva Vaidhyanathan, The Googlization of Everything (and why we should worry) (2011)
The original Panopticon, conceived by Jeremy Bentham, was a design for a circular prison with a
central watchtower, in which all the inmates would behave because they would assume that
they were being observed at all times. Foucault argued that state programs to monitor and
record our comings and goings create imaginary prisons that lead citizens to limit what they do
out of fear of being observed by those in power. The gaze, the theory goes, works as well as iron
bars to control the behavior of most people. Those who write about privacy and surveillance
usually can't help invoking the Panopticon to argue that the great harm of mass surveillance is
social control.
Data Aggregation
Daniel Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security (2011)
One such harm, for example, which I call aggregation, emerges from the fusion of small bits of seemingly innocuous data. When combined, the information becomes much more telling. By joining pieces of information we might not take pains to guard, the government can glean information about us that we might indeed wish to conceal. For example, suppose you bought a book about cancer. This purchase isn't very revealing on its own, for it just indicates an interest in the disease. Suppose you bought a wig. The purchase of a wig, by itself, could be for a number of reasons. But combine these two pieces of information, and now the inference can be made that you have cancer and are undergoing chemotherapy.
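Solove's aggregation harm can be sketched in a few lines of code: individually innocuous records, once joined on a common key, support an inference that neither record supports alone. The purchase data and the inference rule below are invented purely for illustration.

```python
# Sketch of Solove's "aggregation" harm: joining individually
# innocuous records to support a sensitive inference.
# All data and the inference rule are hypothetical.

purchases = [
    {"customer": "alice", "item": "book about cancer"},
    {"customer": "alice", "item": "wig"},
    {"customer": "bob", "item": "wig"},  # a wig alone is not revealing
]

def items_for(customer):
    """Aggregate every purchase made by one customer."""
    return {p["item"] for p in purchases if p["customer"] == customer}

def inferred_chemo_patient(customer):
    """Neither item alone triggers the inference; the combination does."""
    items = items_for(customer)
    return "book about cancer" in items and "wig" in items

print(inferred_chemo_patient("alice"))  # → True: the combination is telling
print(inferred_chemo_patient("bob"))    # → False: a single item is not
```

The point of the sketch is that the privacy harm lives in the join, not in either record on its own.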
"Clickstream Data"
Daniel Solove, The Digital Person: Technology and Privacy in the Information Age (2008)
Clickstream data is a trail of how a user navigates throughout the web by clicking on various links. It enables the website to calculate how many times it has been visited and what parts are most popular. With a way to connect this information to particular web users, marketers can open a window into people's minds. This is a unique vision, for while marketers can measure the size of audiences for other media such as television, radio, books, and magazines, they have little ability to measure attention span. Due to the interactive nature of the Internet, marketers can learn how we respond to what we hear and see. A website collects information about the way a user interacts with the site and stores the information in its database. This information will enable the website to learn about the interests of a user so it can better target advertisements to the user. For example, Amazon.com can keep track of every book or item that a customer browses but does not purchase. To connect this information with particular users, a company can either require a user to log in or it can secretly tag a user to recognize her when she returns. This latter form of identification occurs through what is called a "cookie." A cookie is a small text file of codes that is deployed into the user's computer when she downloads a web page. Websites place a unique identification code into the cookie, and the cookie is saved on the user's hard drive. When the user visits the site again, the site looks for its cookie, recognizes the user, and locates the information it collected about the user's previous surfing activity in its database. Basically, a cookie works as a form of high-tech cattle-branding.
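The cookie mechanism Solove describes can be sketched as a toy server loop: on a first visit the site mints a unique ID and "sets" it as a cookie; on later visits it reads the ID back and looks up the stored clickstream. Everything here (the `Site` class, the dict standing in for the browser's cookie jar, the in-memory database) is a simplified stand-in for real HTTP `Set-Cookie`/`Cookie` headers and server-side storage.

```python
import uuid

# Toy model of cookie-based recognition (Solove's "cattle-branding").
# A real site would use HTTP Set-Cookie / Cookie headers; here the
# visitor's cookie jar is just a dict and the site's database an in-memory map.

class Site:
    def __init__(self):
        self.db = {}  # cookie ID -> list of pages viewed (the clickstream)

    def handle_visit(self, cookie_jar, page):
        visitor_id = cookie_jar.get("site_id")
        if visitor_id is None:
            visitor_id = str(uuid.uuid4())      # first visit: brand the user
            cookie_jar["site_id"] = visitor_id  # the "Set-Cookie" step
            self.db[visitor_id] = []
        self.db[visitor_id].append(page)        # record clickstream
        return self.db[visitor_id]              # history usable for targeting

site = Site()
jar = {}  # lives on the user's machine and persists between visits
site.handle_visit(jar, "/books/cancer")
history = site.handle_visit(jar, "/wigs")
print(history)  # both visits linked to the same branded ID
```

Because the ID travels back with every request, two otherwise anonymous page views become one profile, which is exactly the linking step the excerpt describes.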
RFID Tags
Hal Abelson, Blown to Bits: Life, Liberty and the Pursuit of Happiness After the Digital Explosion
(2008)

A Radio Frequency Identification tag (RFID, for short) can be read from a distance of a few feet. Because RFID data is transferred by radio waves rather than visible light, the tags need not be visible to be read, and the sensor need not be visible to do the reading. RFID tags are simple devices. They store a few dozen bits of information, usually unique to a particular tag. Most are passive devices, with no batteries, and are quite small. The RFID includes a tiny electronic chip and a small coil, which acts as a two-way antenna. A weak current flows through the coil when the RFID passes through an electromagnetic field (for example, from a scanner in the frame of a store, under the carpet, or in someone's hand). This feeble current is just strong enough to power the chip and induce it to transmit the identifying information. Because RFIDs are tiny and require no connected power source, they are easily hidden. They can be almost undetectable.
The Government and Email Surveillance
John Freeman, The Tyranny of Email (2009)
The scope of the United States National Security Agency's (NSA) listening capacities is truly
awesome. As James Bamford describes in A Pretext for War, dozens of listening posts around
the world each sweep in as many as two million phone calls, faxes, e-mail messages, and other
types of communications per hour. Most alarming for many Americans is the fact that the
communications companies are helping them. According to a federal statute called the
Communications Assistance for Law Enforcement Act (CALEA), passed in 1994, communications
companies must design their facilities so that their network can be easily monitored. As Bamford
explains in The Shadow Factory, it even requires the company to install the eavesdropping
devices themselves if necessary and then never reveal their existence.

Worms that sniff


Lawrence Lessig, Code and Other Laws of Cyberspace, version 2.0 (2006)
A worm is a bit of computer code that is spit out on the Net and works its way into the systems
of vulnerable computers. It is not a virus because it doesn't attach itself to other programs and
interfere with their operation. It is just a bit of extra code that does what the code writer says.
The code could be harmless, simply sitting on someone's machine. Or it could be harmful,
corrupting files or doing other damage that its author commands.
Imagine a worm designed to do good (at least in the minds of some). Imagine that the code
writer is the FBI and that the FBI is looking for a particular document belonging to the National
Security Agency (NSA). Suppose that this document is classified and illegal to possess without
the proper clearance. Imagine that the worm propagates itself on the Net, finding its way onto
hard disks wherever it can; once on a computer's hard disk it scans the entire disk. If it finds the
NSA document, it sends a message back to the FBI saying as much. If it doesn't, it erases itself.
Finally, assume that it can do all this without interfering with the operation of the machine. No
one would know it was there; it would report back nothing except that the NSA document was on
the hard disk.
Is the worm constitutional? This is a hard question that at first seems to have an easy answer.
The worm is engaging in a government-initiated search of citizens' disks. There is no reasonable suspicion (as the law ordinarily requires) that the disk holds the document for which the government is searching. It is instead a generalized, suspicionless search of private spaces by the government. From the standpoint of the Constitution (the Fourth Amendment in particular), you don't get any worse than that.
The Fourth Amendment was written against the background of just this sort of abuse. Kings
George II and George III would give officers a general warrant authorizing them to search
through private homes looking for evidence of a crime. No suspicion was needed before the
officer ransacked your house, but because he had a warrant, you were not able to sue the officer
for trespass. The aim of the amendment was to require at least suspicion so that the burden of
the search fell on a reasonably chosen class.
But is the worm really the same as the King's general search? There is one important difference: unlike the victims of the general searches that the framers of our constitution were concerned about, the computer user never knows that his or her disk is being searched by the worm. With the general search, the police were breaking into a house and rummaging through private stuff. With the worm, it is a bit of computer code that does the breaking, and it can only see one thing. The code can't read private letters; it doesn't break down doors and it doesn't interfere with ordinary life. And the innocent have nothing to fear.
The worm is silent in a way that King George's troops were not. It searches perfectly and invisibly, discovering only the guilty. This difference complicates the constitutional question. The worm's behavior is like a generalized search in that it is a search without suspicion, but it is unlike a generalized search in that it creates no disruption of ordinary life. The framers of the Constitution do not distinguish between these two very different protections. It is we, instead, who must choose.
Can iPhone Data Be Searched?
Brian Chen, Always On (2011)
Imagine that a twenty-two-year-old Oakland man named Phil gets pulled over for running a stop sign. A police officer approaches the passenger window, determines that Phil looks suspicious, and decides to arrest him, because running a stop sign is an arrestable offense. While patting down Phil, the officer finds a pack of cigarettes and an iPhone in his pockets. Under the search-to-arrest doctrine, the officer is entitled to open the pack of cigarettes and search it without a warrant or even probable cause to believe there is anything illegal inside. Further, under the same doctrine, the officer also has a right to a warrantless search of Phil's iPhone. The iPhone (and any smartphone, for that matter) can be considered the digital equivalent of a closed container, which police officers can search thoroughly during an arrest, according to Adam Gershowitz, an associate professor at South Texas College. That's because even though society and technology have transformed dramatically over the past few decades, the Fourth Amendment, which guards citizens against unreasonable searches and seizures, has remained static.
This is not a new observation. For years courts have accepted digital information as evidence;
they see no conceptual difference between physical containers and gadgets containing data.
Before the iPhone, police officers used information retrieved from pagers and conventional cell
phones as admissible evidence against criminals. However, with media-rich, all-in-one portables
such as the iPhone, the situation changes tremendously. Simply by searching an iPhone, police officers can rightfully gain access to a treasure trove of personal information. In addition to text messages, contacts, and call histories, an iPhone holds far more pictures than could be stored on a conventional cell phone and displays them in much clearer detail. Furthermore, the data contained inside third-party apps can potentially tell a person's life story.
1984 and Big Brother
James Gleick, What Just Happened: A Chronicle from the Information Frontier (2003)
For much of the 20th century, 1984 was a year that belonged to the future -- a strange, gray
future at that. Then it slid painlessly into the past, like any other year. Big Brother arrived and
settled in, though not at all in the way George Orwell had imagined. Underpinning Orwell's 1948
anti-utopia -- with its corruption of language and history, its never-ending nuclear arms race and
its totalitarianism of torture and brainwashing -- was the utter annihilation of privacy. Its single
technological innovation, in a world of pig iron and pneumatic tubes and broken elevators, was
the telescreen, transmitting the intimate sights and sounds of every home to the Thought Police.
BIG BROTHER IS WATCHING YOU. "You had to live," Orwell wrote, "in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized."
How might technology threaten our precept of innocent until proven guilty?
How might technology compromise our constitutional protections against excessive
punishment?
4/20 Day at the University of Colorado
Daniel Solove, The Future of Reputation (2008)

In the ordinary criminal justice process, a person is innocent until proven guilty. The world of
shaming works differently, as people are punished without a hearing. In one incident, the
University of Colorado used a website to post surveillance photos of students and other
individuals it wanted to identify for smoking marijuana on Farrand Field. It was long a tradition at
the university for students to smoke pot on Farrand Field each year on April 20, a party called "4/20 Day." The university wanted to stamp out this tradition, so it created a website on which it posted pictures of 150 students captured in the act of smoking pot. According to the website:
The University is offering a reward for the identification of any of the individuals pictured below.
After reviewing the photos (click on a photo for a larger image), you may claim the reward by
following the directions below:
Contact the UCPD Operations section at (303) 492-8168
Provide the photo number and as much information as you have about the individual.
Provide your name and contact information.
If the identity is verified to be correct, you will be paid a $50 reward for every person
identified.
The reward will be paid to the first caller who identifies a person below, multiple rewards will
not be paid for individuals listed below.
The website consisted of a grid of thumbnail photos that people could click on to get larger, high-resolution images. Pictures of students who were identified were stamped with the word IDENTIFIED in large capital letters. The Farrand Field website purported to investigate trespassers on the field. But it really appeared to be an attempt to use shaming to try to snuff out the embers of 4/20 Day. The Farrand Field website exposed students engaging in a minor infraction to being forever memorialized as drug users, and it did so even before students were convicted of any wrongdoing. Some of the students might have been smoking cigarettes; some might have just been there with friends. But their inclusion on the website implicated them.
The American Coalition of Life Activists
Daniel Solove, The Future of Reputation (2008)
One of the earliest attempts at Internet vigilantism was the website known as the Nuremberg
Files. Created in 1997 by Neal Horsley, the website listed the names and personal information of
abortion doctors and their families. This was part of a campaign by a group known as the
American Coalition of Life Activists (ACLA) to
terrorize abortion doctors. The website included data on more than two hundred individuals,
including names, addresses, photographs, driver's license numbers, and information about
family members, such as the schools their children attended. The name of the site alluded to the
Nuremberg trials of Nazi officials following World War II. The site listed doctors who had been
wounded by antiabortion activists in grey and those killed with a line through them. Another part
of the website listed the names of clinic owners and workers, and spouses of abortion doctors.
After Horsley created the website in January 1997, two abortion doctors were shot at their homes
that year. In 1998 an abortion clinic in Alabama was bombed and another doctor was killed by
sniper fire at his home in New York. Shortly afterward, a strikethrough was placed through his
name on the Nuremberg Files website.
Planned Parenthood and a group of doctors sued, contending that the website caused them to
live in fear, to require police protection, and to wear bulletproof vests. The case went to trial in
1999. One doctor stated that he switched his driving route to work and rode in a separate car
from the rest of his family. "Every time I get a package, it makes me nervous," a doctor declared. "It's a creepy thing to have to live with, thinking every time, 'Is this something I ordered or is it a bomb?'" One doctor began to wear wigs to conceal herself in public. A jury awarded the doctors more than one hundred million dollars in damages. The case was appealed, with Horsley and the ACLA contending that the verdict violated their right to free speech. The court of appeals affirmed, concluding that the website involved threats of violence with the intent to intimidate rather than articulating a position to debate.
Gae-ttong-nyue

Daniel Solove, The Future of Reputation (2008)


On a subway train in South Korea, a young woman's small dog pooped in the train. Other passengers asked her to clean it up, but she told them to mind their own business. That's when it moved over to cyberspace and became even uglier. Someone took photos of her and posted them on a popular Korean blog. Within hours, she was labeled gae-ttong-nyue ("dog shit girl") and her pictures and parodies were everywhere. Within days, her identity and her past were revealed. Requests for information about her parents and relatives started popping up, and people started to recognize her by the dog and the bag she was carrying, as well as her watch, clearly visible in the original picture.
Across the Internet, people made posters with the girl's photograph, fusing her picture with a
variety of other images. The dog poop girl story quickly migrated to the mainstream media,
becoming national news in South Korea. As a result of her public shaming and embarrassment,
the dog poop girl dropped out of her university.
The story of the dog poop girl wasnt known in the United States until Don Park wrote about it in
his blog, Don Parks Daily Habit. It became even more popular when the blog BoingBoing
discussed the story. BoingBoing receives nearly ten million visits per monthmore than the
circulations of many newspapers and magazines. In no time, newspapers and websites around
the world were discussing the story.
The story of the dog poop girl raises a number of intriguing issues about the Internet, privacy,
norms, and life in the Information Age. Not picking up your dog's poop is bad behavior in most people's books, but was the reaction to her transgression appropriate? We all have probably engaged in rude behavior or minor wrongdoing. But is it going too far to transform the dog poop girl into a villain notorious across the globe?
Under existing notions, privacy is often thought of in a binary way: something is either private or public. According to the general rule, if something occurs in a public place, it is not private. But a more nuanced view of privacy suggests that this case involved taking an event that occurred in one context and significantly altering its nature, by making it permanent and widespread. The dog poop girl would have been just a vague image in a few people's memories if it hadn't been for the photo entering cyberspace and spreading around faster than an epidemic.
Dog Poop Girl will not be forgotten. That's what the Internet changes. Whereas before the girl would have been remembered merely by a few as just some woman who wouldn't clean up dog poop, now her image and identity are eternally preserved in electrons. Forever, she will be the "dog poop girl"; forever, she will be captured in Google's unforgiving memory; and forever, she will be in the digital doghouse for being rude and inconsiderate. The dog poop girl's behavior was certainly wrong, but we might not know the whole story behind the incident to judge her appropriately. And should people's social transgressions follow them on a digital rap sheet that can never be expunged?
Dontdatehimgirl.com
Hal Abelson, Blown to Bits: Life, Liberty and the Pursuit of Happiness After the Digital Explosion
(2008)
New participatory websites create even bigger opportunities for information-sharing. If you are
about to go on a blind date, there are special sites just for that. Take a look at
www.dontdatehimgirl.com, a social networking
site with a self-explanatory focus. When we checked, this warning about one man had just been posted, along with his name and photograph: "Compulsive womanizer, liar, internet cheater; pathological liar who can't be trusted as a friend much less a boyfriend. Total creep! Twisted and sick; needs mental help. Keep your daughter away from this guy!" Of course, such information may be worth exactly what we paid for it. There is a similar site, www.platewire.com, for reports about bad drivers. If you are not dating or driving, perhaps you'd like to check out a neighborhood before you move in, or just register a public warning about the obnoxious revelers who live next door to you. If so, www.rottenneighbor.com is the site for you.

Which of these parties presents the greatest threat to personal privacy: corporations, our friends/family, or accidental data leaks?
Corporations and Email Surveillance
John Freeman, The Tyranny of Email (2009)
Over 35 percent of the workforce has its Internet or e-mail under constant surveillance.
Employers spend hundreds of millions of dollars each year on employee-monitoring software.
Thanks to the Sarbanes-Oxley Act of 2002 and other regulations, publicly traded companies are
required to archive their e-mail. Europe still has strong privacy protections for its employees, but
many U.S. employers in the private sector, as long as they have an established policy and have
put it into writing, can keep a close eye on what their employees send and receive, and where
they point their browsers. "Some companies say they do it to control the information that employees send through the corporate network," wrote Matt Villano in The New York Times. "Other companies do it to make sure employees stay on task, or as a measure of network security. Other companies monitor e-mail to see how employees are communicating with customers."
Your Partner and Email Surveillance
John Freeman, The Tyranny of Email (2009)
Lovers and spouses do it, too. A survey done in Oxford revealed that one in five people had spied on their partner's e-mails or texts. Cheaters are constantly caught. "Spurned lovers steal each other's smartphones," wrote Brad Stone. "Suspicious spouses hack into each other's e-mail accounts. They load surveillance software onto the family PC, sometimes discovering shocking infidelities." In one case, Stone described a woman who was convinced her husband was straying: he was far too obsessed with his BlackBerry. On his birthday she drew him a bubble bath and rifled through his handheld while he was soaking, discovering that he did have a bit on the side and planned to meet her that night. All this evidence gleaned from glowing devices winds up in divorce proceedings, where the electronic paper trail becomes the knife you stick in your former partner's back. "I do not like to put things on e-mail," said one divorce lawyer. "There's no way it's private. Nothing is fully protected once you hit the send button."
Oops
Andrew Keen, The Cult of the Amateur (2008)
On August 6, 2006, AOL leaked the search data of 658,000 people. Critics immediately dubbed this information leak "Data Valdez," after the 1989 Exxon Valdez oil tanker spill. Twenty-three million of the AOL users' most private thoughts (on everything from abortions and killing one's spouse to bestiality and pedophilia) were spilled on the Internet to the world without their knowledge or permission. It was the equivalent of the Catholic Church mailing out 658,000 confessions to its worldwide parishioners. Or the KGB, the Soviet secret police, throwing open their surveillance files and broadcasting them on national television. The information in these AOL files is a twenty-first-century version of Notes from Underground, replete with information that reveals us at our most vulnerable, our most private, our most shameful, our most human. They include every imaginable query, from "how to kill your wife" and "I want revenge for my wife" to "losing your virginity," "can you still be pregnant even though your period came?" and "can you not get pregnant by having sex without a condom?" "My goodness, it's my whole personal life," a sixty-two-year-old widow from Georgia told the New York Times, horrified, when she learned that her personal life had been splayed across the Internet. "I had no idea somebody was looking over my shoulder."
Self-disclosure => Self Awareness?
Clive Thompson, The New York Times (2008)
It is easy to become unsettled by privacy-eroding aspects of awareness tools. But there is another quite different result of all this incessant updating: a culture of people who know much more about themselves. Many of the avid Twitterers, Flickrers and Facebook users I interviewed described an unexpected side-effect of constant self-disclosure. The act of stopping several times a day to observe what you're feeling or thinking can become, after weeks and weeks, a sort of philosophical act. It's like the Greek dictum to "know thyself," or the therapeutic concept of mindfulness. (Indeed, the question that floats eternally at the top of Twitter's Web site, "What are you doing?", can come to seem existentially freighted. What are you doing?) Having an audience can make the self-reflection even more acute, since, as my interviewees noted, they're trying to describe their activities in a way that is not only accurate but also interesting to others: the status update as a literary form. Laura Fitton, the social-media consultant, argues that her constant status updating has made her "a happier person, a calmer person" because the process of, say, describing a horrid morning at work forces her to look at it objectively. "It drags you out of your own head," she added. In an age of awareness, perhaps the person you see most clearly is yourself.

New Risks
Evgeny Morozov, bostonreview.net (2009)
From a national security perspective, cyber-attacks matter in two ways. First, because the back-end infrastructure underlying our economy (national and global) is now digitized, it is subject to new risks. Fifty years ago it would have been hard (perhaps impossible, short of nuclear attack) to destroy a significant chunk of the U.S. economy in a matter of seconds; today all it takes is
figuring out a way to briefly disable the computer systems that run Visa, MasterCard, and
American Express. Fortunately, such massive disruption is unlikely to happen anytime soon. Of
course there is already plenty of petty cyber-crime, some of it involving stolen credit card
numbers. Much of it, however, is due to low cyber-security awareness by end-users (you and
me), rather than banks or credit card companies.
Second, a great deal of internal government communication flows across computer networks,
and hostile and not-so-hostile parties are understandably interested in what is being said.
Moreover, data that are just sitting on one's computer are fair game, too, as long as the computer has a network connection or a USB port. Despite the "cyber" prefix, however, the basic risks are strikingly similar to those of the analog age. Espionage has been around for
centuries, and there is very little we can do to protect ourselves beyond using stronger
encryption techniques and exercising more caution in our choices of passwords and Wi-Fi
connections.
Passive Privacy vs. Aggressive Privacy
Passive privacy is the kind elegantly described by the Fourth Amendment: "the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures." We do have a lot of papers and effects these days.
Aggressive privacy implies much more. Telephone regulatory commissions have listened to
arguments that people have a right to remain anonymous, hiding their own numbers when
placing telephone calls. On the Internet, surprising numbers of users insist on a right to hide
behind false names while engaging in verbal harassment or slander.
James Gleick, What Just Happened: A Chronicle from the Information Frontier (2003)
Traceable Anonymity
One way to strike a balance is to enforce traceable anonymity. In other words, we preserve the right for people to speak anonymously, but in the event that one causes harm to another, we've preserved a way to trace who the culprit is. A harmed individual can get a court order to obtain the identity of an anonymous speaker only after demonstrating genuine harm and the need to know who caused that harm.
Traceable anonymity is for the most part what currently exists on the Internet. Many people use the term "anonymity" rather imprecisely, to refer to both anonymous speech (no name or identifier attached) and pseudonymous speech (using a pen name).
Suppose you write an anonymous comment on my blog saying something bad about me. At a minimum, I will know the IP address of the computer you posted from. I might even have information about the organization that assigned you your IP address. Thus I will know your ISP or the company where you work, and the city you were in when you posted. This is how Brandt traced the Seigenthaler defamer. If you post from work, your employer has information about which specific computer your post came from, and the comment may be traced back to your office computer. If you post from home, your ISP can connect your IP address to your account information. Thus even when you're anonymous, you can be tracked down.
Daniel Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security (2011)

In-class resources for Tue Dec 15 (class 23)


Is Privacy dead?
Digerati video and reading
(discussion leader: x)
Digerati
Edward Snowden
https://en.wikipedia.org/wiki/Edward_Snowden
Video
We will watch and discuss this in class.
Citizenfour trailer (1:28)
https://www.youtube.com/watch?v=XiGwAvd5mvM
Reading
We will read and discuss this in class.
No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State
(Glenn Greenwald, 2014)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
How are corporations buying, selling and using our data for profit?
How is facial recognition eroding privacy?
Privacy and Google glasses
Google/Facebook vs. European regulators
Privacy and government surveillance
Privacy and law enforcement
Privacy and hacking
Schools and privacy
We are to blame for our loss of privacy
Privacy and kids
Edward Snowden

No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State

Homework for Thu Dec 17 (class 24)


Citizenfour
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Excerpts: Facial recognition and Anonymity
Video
Watch the assigned video on Zaption and post a one-sentence response.
Online Privacy PBS (8:10)
http://zapt.io/tkc388my
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Online discussion
Respond to this prompt on Canvas:
Cloud computing vs/and facial recognition
link

Give us 14 images of someone and we'll identify that person with 95% accuracy
Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (2011)
"Give us 14 images of you," Google's CEO told a crowd of technologists at the Techonomy
Conference in 2010, "and we can find other images of you with ninety-five percent accuracy."
Sooner or later, mass face recognition (perhaps even in real time, which would allow for
recognition on security and video feeds) will roll out. Facial recognition is especially significant
because it'll create a kind of privacy discontinuity. We're used to a public semi-anonymity: while
we know we may be spotted in a club or on the street, it's unlikely that we will be. But as
security-camera and camera-phone pictures become searchable by face, that expectation will
slip away.
Police departments around the nation soon to have facial recognition systems
Farhad Manjoo, slate.com (2011)
According to the Wall Street Journal, police departments across the nation will soon adopt
handheld facial-recognition systems that will let them identify people with a snapshot. These
new capabilities are made possible by BI2 Technologies, a Massachusetts company that has
developed a small device that attaches to officers' iPhones. The police departments who spoke
to the Journal said they plan to use the device only when officers suspect criminal activity and
have no other way to identify a person, for instance, when they stop a driver who isn't carrying
her license. Law enforcement officials also seemed wary about civil liberties concerns. Is
snapping someone's photo from five feet away considered a search? Courts haven't decided the
issue, but sheriffs who spoke to the paper say they plan to exercise caution. Face-scanning has
an obvious advantage over fingerprints: It works from far away. Bunch of guys loitering on the
corner? Scantily clad woman hanging around that run-down motel? Two dudes who look like
they're smoking a funny-looking cigarette? Why not snap them all just to make sure they're on
the up-and-up?
Facial recognition for everyone
Farhad Manjoo, slate.com (2011)
In the coming years (if not months) we'll see a slew of apps that allow your friends and
neighbors to snap your face and get your name and other information you've put online. This
isn't a theoretical worry; the technology exists, now, to do this sort of thing crudely, and the only
thing stopping companies from deploying it widely is a fear of public outcry. That fear won't last
long. Face recognition for everyone is coming. Get used to it. What's changed in the last decade?
Three things. First, computers have gotten better at recognizing faces. The technology works by
analyzing dozens of different features (the distance between your eyes, the width of your nose)
that remain the same across photographs. As computers have gotten faster and digital
photography has gotten better, face recognition has filtered down to consumer photo software.
Another major factor that augurs the face-recognition era is that we've become accustomed to
ubiquitous photography. Now that we all carry cameras everywhere, it no longer seems odd
when someone points a lens in your direction; you probably don't even notice it.
Finally, there's Facebook. Ten years ago we were worried about authorities building a worldwide
database of our faces. We're all posting pictures, and tagging names to pictures, at a furious rate;
according to Facebook, people add 100 million names to faces on Facebook every day. The
face-recognition tools available to law enforcement agencies will match you against government
databases (the DMV or passport database, or the FBI's most-wanted list), but the technology
available to consumers will be able to do just as well by matching your face to online snapshots.
The government couldn't have built a better facial database if it tried.

Naming names on the internet


NYT (2011)
Three years ago, after the suicide of a popular actress who had been bullied via the Internet,
South Korea introduced a radical policy aimed at stamping out online hate. It required
contributors to Web portals and other popular sites to use their real names, rather than
pseudonyms. Last month, after a huge security breach, the government said it would abandon
the system. Hackers stole 35 million Internet users' national identification numbers, which they
had been required to supply when registering on Web sites to verify their identities. The South
Korean experience shows that real name policies are a lousy idea, and privacy threats are only
one reason. Online anonymity is essential for political dissidents, whose role has been
highlighted in the uprisings in the Arab world, and for corporate whistle-blowers. In the United
States, the Supreme Court has found a constitutional basis for protecting anonymity.
The psychology of online comments (1)
Maria Konnikova, New Yorker (2014)
Anonymity has also been shown to encourage participation; by promoting a greater sense of
community identity, users don't have to worry about standing out individually. Anonymity can
also boost a certain kind of creative thinking and lead to improvements in problem-solving. In a
study that examined student learning, the psychologists Ina Blau and Avner Caspi found that,
while face-to-face interactions tended to provide greater satisfaction, in anonymous settings
participation and risk-taking flourished. Anonymous forums can also be remarkably self-regulating: we tend to discount anonymous or pseudonymous comments to a much larger
degree than commentary from other, more easily identifiable sources. In a 2012 study of
anonymity in computer interactions, researchers found that, while anonymous comments were
more likely to be contrarian and extreme than non-anonymous ones, they were also far less
likely to change a subject's opinion on an ethical issue, echoing earlier results from the
University of Arizona. Removing comments also affects the reading experience itself: it may take
away the motivation to engage with a topic more deeply, and to share it with a wider group of
readers. In a phenomenon known as "shared reality," our experience of something is affected by
whether or not we will share it socially. Take away comments entirely, and you take away some
of that shared reality, which is why we often want to share or comment in the first place. We
want to believe that others will read and react to our ideas.
The psychology of online comments (2)
Maria Konnikova, New Yorker (2014)
On September 24th, 2014, Popular Science announced that it would banish comments from its
Web site. The editors argued that Internet comments, particularly anonymous ones, undermine
the integrity of science and lead to a culture of aggression and mockery that hinders substantive
discourse. "Even a fractious minority wields enough power to skew a reader's perception of a
story," wrote the online-content director Suzanne LaBarre, citing a recent study from the
University of Wisconsin-Madison as evidence. But a ban on article comments may simply move
them to a different venue, such as Twitter or Facebook: from a community centered around a
single publication or idea to one without any discernible common identity. Such large group
environments, in turn, often produce less than desirable effects, including a diffusion of
responsibility: you feel less accountable for your own actions, and become more likely to engage
in amoral behavior. In his classic work on the role of groups and media exposure in violence, the
social cognitive psychologist Albert Bandura found that, as personal responsibility becomes more
diffused in a group, people tend to dehumanize others and become more aggressive toward
them.
The Disinhibition Effect
John Palfrey, Born Digital: Understanding the First Generation of Digital Natives (2008)
There are several possible reasons for this tendency to act more aggressively toward other
people online than face to face. Psychologists call it the disinhibition effect. Many people,
young and old alike, are emboldened by the ability to be anonymous, feeling as if they will
never get caught, even though we all leave digital traces behind. And many people (and not just
Digital Natives) experience greater difficulty curbing their impulses online than they do in
real-space social situations. Part of the issue is that there is a time delay between sending an e-mail
and getting one back. The absence of an authority figure in an unmediated space empowers
people to act on impulse.
Disinhibition, continued
John Freeman, The Tyranny of Email (2009)
Flaming can be induced in some people with alarming ease. Consider an experiment, reported in
2002 in The Journal of Language and Social Psychology, in which pairs of college students
(strangers) were put in separate booths to get to know each other better by exchanging
messages in a simulated online chat room. While coming and going into the lab, the students
were well behaved. But the experimenter was stunned to see the messages many of the
students sent. About 20 percent of the e-mail conversations immediately became outrageously
lewd or simply rude. Psychologists call this behavior disinhibition: a filter drops, and we write
things we probably wouldn't say to another in person, at least not after a brief acquaintance. No
environment induces it quite as easily as computer-mediated communication. Indeed, the PC
may have extended the human mind, but it's missing a few key human circuits that modulate
social interaction. Neurologists now know that many of the key mechanisms of communication
reside in the prefrontal cortex of the human brain. "These circuits instantaneously monitor
ourselves and the other person during a live interaction," wrote Daniel Goleman on
www.edge.org, "and automatically guide our responses so they are appropriate and smooth."
One of the key tasks of these circuits is inhibiting impulses for actions that would be rude or
simply inappropriate, or outright dangerous.
Anonymity makes lying easier
Daniel Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security (2011)
When we talk about others, we affect not only their reputation but ours as well. If a person
gossips about inappropriate things, betrays confidences, spreads false rumors and lies, then her
own reputation is likely to suffer. People will view the person as untrustworthy and malicious.
They might no longer share secrets with the person. They might stop believing what the person
says. As U.S. Supreme Court Justice Antonin Scalia observed, anonymity can make lying easier;
and the identification of speakers can help significantly in deterring them from spreading false
rumors and can allow us to locate and punish the source of such rumors.
Confronting Your Inner Troll
Jaron Lanier, You Are Not a Gadget: A Manifesto (2010)
"Troll" is a term for an anonymous person who is abusive in an online environment. It would be
nice to believe that there is only a minute troll population living among us. But in fact, a great
many people have experienced being drawn into nasty exchanges online. I have tried to learn to
be aware of the troll within myself. I notice that I can suddenly become relieved when someone
else in an online exchange is getting pounded or humiliated, because that means Im safe for the
moment. If someone elses video is being ridiculed on YouTube, then mine is temporarily
protected. But that also means Im complicit in a mob dynamic. Have I ever planted a seed of
mob-beckoning ridicule in order to guide the mob to a target other than myself? Yes, I have,
though I shouldn't have. I observe others doing that very thing routinely in anonymous online
meeting places. Ive also found that I can be drawn into ridiculous pissing matches online in
ways that just wouldnt happen otherwise, and Ive never noticed any benefit. There is never a
lesson learned, or a catharsis of victory or defeat. If you win anonymously, no one knows, and if
you lose, you just change your pseudonym and start over, without having modified your point of
view one bit.

LINK
Online Anonymity and Photos
The new possibilities of photographing people in both public and more intimate situations,
coupled with more or less immediately posting such photographs and/or videos to a forum such
as a social networking site or more public webpage, means that people are now more vulnerable

to violations of privacy. If privacy can be defined as the capacity to control information about
oneself, the new ability of others to record and quickly distribute potentially embarrassing
information about oneself thereby decreases one's control over such information (e.g. in the
form of permission to take a photograph, much less permission to distribute the photograph in a
semi-public or public form).
Charles Ess, Digital Media Ethics (2010)

Anonymity Encourages "Drive-by Relationships"


As sociologist Robert Putnam observes: "Anonymity and fluidity in the virtual world encourage
'easy in, easy out,' drive-by relationships. The very casualness is the appeal of computer-mediated
communication for some denizens of cyberspace, but it discourages the creation of social
capital. If entry and exit are too easy, commitment, trustworthiness, and reciprocity will not
develop." In other words, anonymity inhibits the process by which reputations are formed, which
can have both good and bad consequences. Not having accountability for our speech can be
liberating and allow us to speak more candidly; but it can also allow us to harm other people
without being accountable for it.
Daniel Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security (2011)
We Are Naked Apes
David Shenk, Data Smog (1997)
We have also ignored, in our arrogance, an important lesson from evolutionary zoology: Though
culture moves much more swiftly than evolution, it cannot change the pace of evolution. "For
[mankind]," writes popular zoologist Desmond Morris, "the main troubles will stem from the fact
that his culturally operated advances will race ahead of any further genetic ones. His genes will
lag behind, and he will be constantly reminded that, for all his environment-molding
achievements, he is still a very naked ape."

Connect to power of facial recognition software and consequences of anonymity on the internet

In-class on Thu Dec 17 (class 24)
We will watch the movie Citizenfour
Links:
(on Canvas lesson page)
Anonymity and the web

Sunday Story #13


Due Sun Dec 20 (before midnight)
Prompt
Assignment description.

Homework for Tue Dec 22 (class 25)


TBD
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Edward Snowden: The world says no to surveillance
(NYT op-ed, 2015)
Eric Holder: The Justice Department could strike deal with Edward
Snowden
Michael Isikoff (Yahoo! News, 2015)
Video
Watch the assigned video on Zaption and post a one-sentence response.
John Oliver interviews Edward Snowden (7:30)
http://zapt.io/trqnxjyx
Select an article or video from the lesson page links and Tweet its gist or
guiding question
Please include this address in your Tweet: @bhsecinternet
Respond to this online prompt on Canvas
Edward Snowden: patriot or traitor?

Edward Snowden: The world says no to surveillance


(NYT op-ed, 2015)
MOSCOW: Two years ago today, three journalists and I worked nervously in a Hong Kong hotel
room, waiting to see how the world would react to the revelation that the National Security
Agency had been making records of nearly every phone call in the United States. In the days
that followed, those journalists and others published documents revealing that democratic
governments had been monitoring the private activities of ordinary citizens who had done
nothing wrong.
Within days, the United States government responded by bringing charges against me under
World War I-era espionage laws. The journalists were advised by lawyers that they risked arrest
or subpoena if they returned to the United States. Politicians raced to condemn our efforts as
un-American, even treasonous.
Privately, there were moments when I worried that we might have put our privileged lives at risk
for nothing: that the public would react with indifference, or practiced cynicism, to the
revelations.
Never have I been so grateful to have been so wrong.
Two years on, the difference is profound. In a single month, the N.S.A.'s invasive call-tracking
program was declared unlawful by the courts and disowned by Congress. After a White
House-appointed oversight board investigation found that this program had not stopped a single
terrorist attack, even the president who once defended its propriety and criticized its disclosure
has now ordered it terminated.
This is the power of an informed public.
Ending the mass surveillance of private phone calls under the Patriot Act is a historic victory for
the rights of every citizen, but it is only the latest product of a change in global awareness. Since
2013, institutions across Europe have ruled similar laws and operations illegal and imposed new
restrictions on future activities. The United Nations declared mass surveillance an unambiguous
violation of human rights. In Latin America, the efforts of citizens in Brazil led to the Marco Civil,
an Internet Bill of Rights. Recognizing the critical role of informed citizens in correcting the
excesses of government, the Council of Europe called for new laws to protect whistle-blowers.
Beyond the frontiers of law, progress has come even more quickly. Technologists have worked
tirelessly to re-engineer the security of the devices that surround us, along with the language of
the Internet itself. Secret flaws in critical infrastructure that had been exploited by governments
to facilitate mass surveillance have been detected and corrected. Basic technical safeguards
such as encryption, once considered esoteric and unnecessary, are now enabled by default
in the products of pioneering companies like Apple, ensuring that even if your phone is stolen,
your private life remains private. Such structural technological changes can ensure access to
basic privacies beyond borders, insulating ordinary citizens from the arbitrary passage of
anti-privacy laws, such as those now descending upon Russia.
Though we have come a long way, the right to privacy, the foundation of the freedoms
enshrined in the United States Bill of Rights, remains under threat. Some of the world's most
popular online services have been enlisted as partners in the N.S.A.'s mass surveillance
programs, and technology companies are being pressured by governments around the world to
work against their customers rather than for them. Billions of cellphone location records are still
being intercepted without regard for the guilt or innocence of those affected. We have learned
that our government intentionally weakens the fundamental security of the Internet with back
doors that transform private lives into open books. Metadata revealing the personal
associations and interests of ordinary Internet users is still being intercepted and monitored on a
scale unprecedented in history: As you read this online, the United States government makes a
note.
Spymasters in Australia, Canada and France have exploited recent tragedies to seek intrusive
new powers despite evidence such programs would not have prevented attacks. Prime Minister

David Cameron of Britain recently mused, "Do we want to allow a means of communication
between people which we cannot read?" He soon found his answer, proclaiming that "for too
long, we have been a passively tolerant society, saying to our citizens: As long as you obey the
law, we will leave you alone."
At the turning of the millennium, few imagined that citizens of developed democracies would
soon be required to defend the concept of an open society against their own leaders.
Yet the balance of power is beginning to shift. We are witnessing the emergence of a post-terror
generation, one that rejects a worldview defined by a singular tragedy. For the first time since
the attacks of Sept. 11, 2001, we see the outline of a politics that turns away from reaction and
fear in favor of resilience and reason. With each court victory, with every change in the law, we
demonstrate facts are more convincing than fear. As a society, we rediscover that the value of a
right is not in what it hides, but in what it protects.
Edward J. Snowden, a former Central Intelligence Agency officer and National Security Agency
contractor, is a director of the Freedom of the Press Foundation.

Eric Holder: The Justice Department could strike deal with Edward Snowden
Michael Isikoff (Yahoo! News, 2015)
https://www.yahoo.com/politics/eric-holder-the-justice-department-could-strike-123393663066.html
Former Attorney General Eric Holder said today that a possibility exists for the Justice
Department to cut a deal with former NSA contractor Edward Snowden that would allow him to
return to the United States from Moscow.
In an interview with Yahoo News, Holder said "we are in a different place as a result of the
Snowden disclosures" and that his actions "spurred a necessary debate" that prompted
President Obama and Congress to change policies on the bulk collection of phone records of
American citizens.
Asked if that meant the Justice Department might now be open to a plea bargain that allows
Snowden to return from his self-imposed exile in Moscow, Holder replied: "I certainly think there
could be a basis for a resolution that everybody could ultimately be satisfied with. I think the
possibility exists."
Holder's comments came as he began a new job as a private lawyer at Covington & Burling, the
elite Washington law firm where he worked before serving as the nation's top law enforcement
officer from February 2009 until last April.
In that capacity, Holder presided over an unprecedented crackdown on government leakers,
including the filing of a June 2013 criminal complaint against Snowden, charging him with three
felony violations of the Espionage Act for turning over tens of thousands of government
documents to journalists.
Holder had previously said in a January 2014 interview with MSNBC that the U.S. would be
willing to engage in conversation with Snowden and his lawyers were he willing to return to the
United States to face the charges, but ruled out any granting of clemency.
But his remarks to Yahoo News go further than those of any current or former Obama
administration official in suggesting that Snowden's disclosures had a positive impact and that
the administration might be open to a negotiated plea that the self-described whistleblower
could accept, according to his lawyer Ben Wizner.
"The former attorney general's recognition that Snowden's actions led to meaningful changes is
welcome," said Wizner. "This is significant. I don't think we've seen this kind of respect from
anybody at a Cabinet level before."
Holder declined to discuss what the outlines of a possible deal might consist of, saying that as
the former attorney general, it would not be appropriate for him to discuss it.
It's also not clear whether Holder's comments signal a shift in Obama administration attitudes
that could result in a resolution of the charges against Snowden. Melanie Newman, chief
spokeswoman for Attorney General Loretta Lynch, Holder's successor, immediately shot down
the idea that the Justice Department was softening its stance on Snowden.
"This is an ongoing case so I am not going to get into specific details, but I can say our position
regarding bringing Edward Snowden back to the United States to face charges has not changed,"
she said in an email.
Three sources familiar with informal discussions of Snowden's case told Yahoo News that one top
U.S. intelligence official, Robert Litt, the chief counsel to Director of National Intelligence James
Clapper, recently privately floated the idea that the government might be open to a plea bargain
in which Snowden returns to the United States, pleads guilty to one felony count and receives a
prison sentence of three to five years in exchange for full cooperation with the government.
Litt declined to comment. A source close to Litt said any comments he made were personal and

did not represent the position of the U.S. government. The source also said Litt has made clear to
Snowden's representatives that nothing is going to happen "unless he comes in and moves off
this idea, 'I'm entitled to a medal.'"
But Wizner, Snowden's lawyer, said any felony plea by Snowden that results in prison time would
be unacceptable to his client. "Our position is he should not be reporting to prison as a felon and
losing his civil rights as a result of his act of conscience," he said.
Moreover, any suggestion of leniency toward Snowden would likely run into strong political
opposition in Congress as well as fierce resistance from hard-liners in the intelligence community
who remain outraged over his wholesale disclosure of highly classified government documents.
Those feelings have, in some ways, been exacerbated by Snowden's worldwide celebrity, which
recently prompted him to enter into an arrangement with a speakers' bureau that has allowed
him to give paid talks to worldwide audiences via Skype from his apartment in Moscow.
"I'm quite stunned that we would be considering any return of Snowden to this country other
than to meet a jury of his peers, period," said Michael Hayden, former director of both the NSA
and CIA under President George W. Bush, when asked about Holder's comments.
"What Snowden did, however, was the greatest hemorrhaging of legitimate American secrets in
the history of the republic, no question about it," Hayden added.
Whatever happens, Snowden's legal fate won't be in Holder's hands. In the interview, he said he
planned to concentrate on giving strategic advice to corporate clients at Covington, but no
lobbying, while also engaging in significant pro bono work, including starting a foundation to
promote issues such as criminal justice reform.

Holder also said he has already had interactions with Hillary Clinton's presidential campaign
and expects to be helpful, including possibly speaking at campaign events and providing advice.
"That will be up to the campaign," he said. "Whatever the nominee wants."

In-class resources for Tue Dec 22 (class 25)


TBD
Digerati video and reading
(discussion leader: x)
Digerati
David Weinberger
@dweinberger
https://en.wikipedia.org/wiki/David_Weinberger
Video
We will watch and discuss this in class.
David Weinberger on Wikileaks (1:28)
https://www.youtube.com/watch?v=tAHoqYRNYPk&list=PLBE8B57E740B9BF32&index=26
Reading
We will read and discuss this in class.
David Weinberger: Is the web different?
(David Weinberger, 2008)
Homework video and reading
(discussion leader: y)
Current Events / iPad activity
(discussion leader: z)
Links:
(on Canvas lesson page)
David Weinberger

David Weinberger: Is the web different?


The question "Is the Web different?" is actually not so much a question as a shibboleth in the original sense: The answer
determines which tribe you're in.
The Web utopians point to the ways in which the Web has changed some of the basic assumptions about how we
live together, removing old obstacles and enabling shiny new possibilities.
The Web dystopians agree that the Web is having a major effect on our lives. They, however, think that effect is
detrimental.
The Web realists say the Web hasn't had nearly as much effect as the utopians and dystopians proclaim. The Web
carries with it certain possibilities and limitations, but (the realists say) not many more than any other major
communications medium.
Each of these is a political position: They imply normative beliefs, and they lead their holders to certain types of behaviors
and actions:
The utopians want the Web to have wide effects as quickly as possible. They therefore favor connecting as many
people as possible and maintaining the Web as an open, public space.
The dystopians want to curb the excesses of the Web, or prepare us to deal with those excesses.
The realists want to curb the excesses of the utopians who, they think, are feeding unrealistic expectations.
Simply the act of holding the position is itself a political action for all three groups:
The utopians think that by holding out a vision of what will or might be, they are affecting the direction of the
present.
The dystopians are sounding a call to action, even if some dystopians think that we are doomed to suffer under the
Web's increasing hegemony.
The realists may not view their position as political because it is, they believe, based merely on a clear-eyed,
non-politicized view of the world. But this is itself a political decision that leans toward supporting the status quo,
because what-is is more knowable than what might be.
So, which of the three positions (or some variant) is right? Is the Web different in a way that matters?
The obvious answer to the question "Which one is right?" is: Time will tell.
Unfortunately, time papers over all wounds. Our values change, so our evaluations of change shift over time. The
extraordinary becomes ordinary with extraordinary rapidity and insinuates itself into memory, undercutting the reliability of our
judgments about the magnitude of change. So, time will not tell.
Nor is this a simple fact-based issue. Realists would like it to be, but that's what makes them realists. Consider this
hypothetical exchange:
Realist: You say that the Web will transform politics. But politics is as it ever was.
Utopian: Just wait.
This is, indeed, one of the two basic blocking tactics used by Web utopians: The changes are so important that they will take
a while to arrive, and the changes are so fundamental that we aren't always even aware of them. Here's an example of the
second tactic at work:
Realist: You say that the Web will transform business, but business is as it ever was.
Utopian: Not at all! For example, email has transformed meetings, but we're so used to the change that we don't
even recognize it.

To this, the Web realist has a number of responses: Denying that the changes are real, that they are important, or that they
are due to the Web.
When a dystopian points to a bad effect of the Web, the utopian denies the truth of the value claim, its inevitability, or its
importance:
Dystopian: The Web has made pornography available to every schoolchild!
Utopian: It is the responsibility of parents to make sure their kids are using child-safe filters. Besides, viewing
pornography may weaken our unhealthy anti-sexual attitudes. Besides, greater access to porn is just one effect of
the Web; it's brought greater access to literature, art, science...
The realist wants to bring the argument squarely within the realm of facts. Facts can, of course, resolve some disputes. But
facts are unlikely to settle the overall question of the Web's difference because the utopians, dystopians and realists are
probably operating from different views of history, and the framing of history also frames facts.
Many utopians think the Web has uncanny power because they are McLuhanites who think media transform institutions and
even consciousness. The McLuhanites' belief in the shaping power of media leads them to a rhetoric of "not only": Not only
did the printing press enable the spread of literacy, it led to our reliance on experts. The next McLuhanite up says, "Not only
did it lead to experts, it actually changed the shape of knowledge." Web utopians engage in the same rhetorical one-upmanship.
Many Web dystopians share the utopians' disruptive view of the Web, although they are struck more by the facts with
negative values.
Many Web realists think change happens far more incrementally. They feel the inertial weight of existing institutions and
social structures. Nothing as trivial as HTML will change the fact that most of the world is in poverty and that corrupt
corporations are firmly in control.
These positions about how history works cannot be defended by looking at history, for they determine how history is to be
read. For example, did the Howard Dean campaign in 2004 show that the Web is profoundly altering politics, that the Web
has had little effect on politics, or that the Web is further degrading politics? All three positions are defensible because
historical events such as presidential campaigns are carried along social wavefronts of unfathomable complexity. Did Dean
get as far as he did because of the Web or because of the media? Did his campaign fail because the Web created a bubble
of self-involvement, because the Web ultimately did not get people out to vote, or because he was a quirky candidate who,
without the Web, wouldn't have been noticed outside of his home state of Vermont?
To make matters yet more complex, holders of these three positions are not merely uttering descriptive statements.
Frequently, they speak in order to have a political effect:
Utopians want to excite us about the future possibilities because they want policies that will keep the Internet an
open field for bottom-up innovation.
Dystopians want to warn us of the dangers of the Web so we can create policies and practices that will mitigate
those dangers.
Realists want to clear away false promises so we can focus on what really needs to be done. Also, they'd like the
blowhard utopians to just shut up for a while.
Arguments that have different aims and are based on differing views of how history works and of the nature of the
interactions between the material and social realms are not settled by facts. In fact, they're not settled. Ever. Even after the
changes happen, these three temperaments and cognitive sets will debate why the changes happened, how significant they
were, and whether they were good, bad or indifferent.
Time won't tell.
Unfortunately, we can't afford to wait for time not to tell us. "Is the Web different?" is an urgent question. Decisions depend on
our answer.

For example, if the Web utopians are right (if the Web is transformative in an overall positive way), then it's morally
incumbent upon us to provide widespread access to as much of the world as is possible, focusing on the disadvantaged. If
the Web dystopians are right, we need to put in place whatever safeguards we can. If the realists are right, then we ought to
make tactical adjustments but ignore the hyperventilations of the utopians and dystopians.
Then there are the more localized decisions. If the Web is transforming business, for better or for worse, then businesses
need to alter their strategic plans. If the Web is merely one more way information travels, then businesses should be looking
only at tactical responses. Likewise for every other institution that deals with information, including government, media,
science, and education.
So, we need to decide.
But there is no way to decide.
Fortunately, this is not the first time we humans have been in this position. In fact, it is characteristic of politics overall. Who's
right, the liberals, the conservatives, or neither? Because such a question can't be answered to the satisfaction of all the
parties involved, we come up with political means for resolving issues. For politics to work in helping us to decide what to do
about and with the Web, we need all three positions plus the incalculable variants represented.
Together we'll settle the future's hash.
But I don't want to leave it at that happy, liberal conclusion because it is, I believe, incomplete. The fuller statement of the
conclusion should include: It is vital to have realists in the discussion, but they are essentially wrong.
I am using the word "essentially" carefully here. Web realists are often right in their particular arguments, demurrals and
corrections, and the utopians and dystopians are often wrong in their predictions, readings, and even facts. That matters. Yet,
the essence of the utopian and dystopian view is that the Web is truly different. About that they are right.
Why? I am enough of a McLuhanite to believe that media do not simply transmit messages. The means by which we
communicate has a deep, profound and even fundamental effect on how we understand ourselves and how we associate
with one another. Yes, the medium is the message.
If that's the case (and notice I am not giving any further argument for it), then there are good reasons to think that the Web as
a medium is likely to be as disruptive as other media that have had profound effects on culture. Perhaps the best comparison
is to the effect Gutenberg's invention has had on the West. Access to printed books gave many more people access to
knowledge, changed the economics of knowledge, undermined institutions that were premised on knowledge being scarce
and difficult to find, altered the nature and role of expertise, and established the idea that knowledge is capable of being
chunked into stable topics. These in turn affected our ideas about what it means to be a human and to be human together.
But these are exactly the domains within which the Web is bringing change. Indeed, it is altering not just the content of
knowledge but our sense of how ideas go together, for the Web is first and foremost about connections.
Clearly, there is much more to say about this, and much has already been said. But that is the general shape of one Web
utopian argument.
It can, of course, be challenged. It should be challenged, both in its outline and in its particulars. Here Web realists have a
vital role to play. But at the highest level of abstraction, these three positions are not truly arguable. Each is an expression of
an attitude towards the future, and the future is that which does not yet exist. None of these three positions truly knows what
the future holds if only because the prevalence of these positions itself shapes the unknown future.
And that is a reason to join the utopian tribe, or at least to acknowledge the special value it brings to the conversation.
Innovation requires the realism that keeps us from wasting time on the impossible. But some of the most radical innovation
requires ignoring one's deep-bred confidence about what is possible. This is especially true within the social realm where the
limits on new ways to associate are almost always transgressible simply by changing how we think about ourselves. We thus
need utopians to invent the impossible future.
And we need lots and lots of them. There is so much to invent, and the new forms of association that emerge often only
succeed if there are enough people to embrace them.
Web realists perform the vital function of keeping us from running down dead ends longer than we need to, and from getting
into feedback loops that distort the innovation process. For those services, we should thank and encourage the realists. But
we should also recognize that beyond the particulars, they are essentially wrong.

The contention among dystopians, realists and utopians is a struggle among the past, the present and the future. The
present is always right about itself but in times of disruption essentially wrong about the future. That's why we need to
flood the field with utopians so we can be right often enough that we build the best future we can.
It is, of course, simply an accident that this defense of Web utopianism comes from someone who is personally a Web
utopian. Absolutely coincidental.

Uh huh.

No Sunday Story for Sun Jan 3

Happy New Year!

Homework for Tue Jan 5 (class 26)


No homework for Tue Jan 5: Happy New Year!

In-class resources for Tue Jan 5 (class 26)


Apples to Apples

We will play the game Apples to Apples in class.

Homework for Thu Jan 7 (class 27)


Is the Unabomber right?
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
The Unabomber Was Right
Kevin Kelly (2011)
Excerpts: Technological determinism
Video
Watch the assigned video on Zaption and post a one-sentence response.
ABC News report on capture of Ted Kaczynski (6:20)
http://zapt.io/ttrbwdgu
Online discussion
Respond to this prompt on Canvas:
Is the Unabomber right?
link

The Unabomber Was Right


Kevin Kelly (2011)
Ted Kaczynski, the convicted bomber who blew up dozens of technophilic professionals, was right
about one thing: technology has its own agenda. The technium is not, as most people think, a
series of individual artifacts and gadgets for sale. Rather, Kaczynski, speaking as the
Unabomber, argued that technology is a dynamic holistic system. It is not mere hardware; rather
it is more akin to an organism. It is not inert, nor passive; rather the technium seeks and grabs
resources for its own expansion. It is not merely the sum of human action, but in fact it
transcends human actions and desires. I think Kaczynski was right about these claims. In his own
words, the Unabomber says: "The system does not and cannot exist to satisfy human needs.
Instead, it is human behavior that has to be modified to fit the needs of the system. This has
nothing to do with the political or social ideology that may pretend to guide the technological
system. It is the fault of technology, because the system is guided not by ideology but by
technical necessity."
I too argue that the technium is guided by technical necessity. That is, baked into the nature of
this vast complex of technological systems are self-serving aspects (technologies that enable
more technology, and systems that preserve themselves) and also inherent biases that lead
the technium in certain directions, outside of human desire. Kaczynski writes: "modern
technology is a unified system in which all parts are dependent on one another. You can't get rid
of the bad parts of technology and retain only the good parts."
The truth of Kaczynski's observations does not absolve him of his murders, or justify his insane
hatred. Kaczynski saw something in technology that caused him to lash out with violence, but
despite his mental imbalance, he was able to articulate that view with surprising clarity in his
sprawling, infamous 35,000-word manifesto. Kaczynski murdered three people (and injured 23
more) in order to get this manifesto published. His despicable desperation and crimes hide a
critique that has gained a minority following among other luddites. The center section of his
argument is clear, remarkably so, given his cranky personal grievances against leftists that
bookend his rant. Here, in meticulous, scholarly precision, Kaczynski makes his primary claim:
that freedom and technological progress are incompatible, and that therefore technological
progress must be undone.
As best I understand, the Unabomber's argument goes like this:
Personal freedoms are constrained by society, as they must be.
The stronger that technology makes society, the less freedom.
Technology destroys nature, which strengthens technology further.
This ratchet of technological self-amplification is stronger than politics.
Any attempt to use technology or politics to tame the system only strengthens it.
Therefore technological civilization must be destroyed, rather than reformed.
Since it cannot be destroyed by tech or politics, humans must push industrial society towards its
inevitable end of self-collapse.
Then pounce on it when it is down and kill it before it rises again.

Kaczynski argues that it is impossible to escape the ratcheting clutches of industrial technology
for several reasons. One, because if you use any part of it, the system demands servitude; two,
because technology does not reverse itself, never releasing what is in its hold; and three,
because we don't have a choice of what technology to use in the long run. In his words, from the
Manifesto:
The system HAS TO regulate human behavior closely in order to function. At work, people have
to do what they are told to do, otherwise production would be thrown into chaos. Bureaucracies
HAVE TO be run according to rigid rules. To allow any substantial personal discretion to lower-level bureaucrats would disrupt the system and lead to charges of unfairness due to differences
in the way individual bureaucrats exercised their discretion. It is true that some restrictions on
our freedom could be eliminated, but GENERALLY SPEAKING the regulation of our lives by large

organizations is necessary for the functioning of industrial-technological society. The result is a


sense of powerlessness on the part of the average person.
It is not possible to make a LASTING compromise between technology and freedom, because
technology is by far the more powerful social force and continually encroaches on freedom
through REPEATED compromises. Another reason why technology is such a powerful social force
is that, within the context of a given society, technological progress marches in only one
direction; it can never be reversed. Once a technical innovation has been introduced, people
usually become dependent on it, unless it is replaced by some still more advanced innovation.
Not only do people become dependent as individuals on a new item of technology, but, even
more, the system as a whole becomes dependent on it.
When a new item of technology is introduced as an option that an individual can accept or not as
he chooses, it does not necessarily REMAIN optional. In many cases the new technology changes
society in such a way that people eventually find themselves FORCED to use it.
Kaczynski felt so strongly about the last point that he repeated it once more in a different section
of his treatise. It is an important criticism. Once you accept that individuals surrender freedom
and dignity to the machine, and that they increasingly have no choice but to do so, then the
rest of Kaczynski's argument flows fairly logically:
But we are suggesting neither that the human race would voluntarily turn power over to the
machines nor that the machines would willfully seize power. What we do suggest is that the
human race might easily permit itself to drift into a position of such dependence on the
machines that it would have no practical choice but to accept all of the machines' decisions. As
society and the problems that face it become more and more complex and machines become
more and more intelligent, people will let machines make more of their decisions for them, simply
because machine-made decisions will bring better results than man-made ones. Eventually a
stage may be reached at which the decisions necessary to keep the system running will be so
complex that human beings will be incapable of making them intelligently. At that stage the
machines will be in effective control. People won't be able to just turn the machines off, because
they will be so dependent on them that turning them off would amount to suicide. ... Technology
will eventually acquire something approaching complete control over human behavior.
Will public resistance prevent the introduction of technological control of human behavior? It
certainly would if an attempt were made to introduce such control all at once. But since
technological control will be introduced through a long sequence of small advances, there will be
no rational and effective public resistance.
I find it hard to argue against this last section. It is true that as the complexity of our built world
increases, we will necessarily need to rely on mechanical (computerized) means to manage this
complexity. We already do. Autopilots fly our very complex flying machines. Algorithms control
our very complex communications and electrical grids. And for better or worse, computers
control our very complex economy. Certainly as we construct yet more complex infrastructure
(location-based mobile communications, genetic engineering, fusion generators, autopilot cars)
we will rely further on machines to run them and make decisions. For those services, turning off
the switch is not an option. In fact, if we wanted to turn off the internet right now, it would not be
easy to do if others wanted to keep it on. In many ways the internet is designed to never turn off.
First let us postulate that the computer scientists succeed in developing intelligent machines
that can do all things better than human beings can do them. In that case presumably all work
will be done by vast, highly organized systems of machines and no human effort will be
necessary. Either of two cases might occur. The machines might be permitted to make all of
their own decisions without human oversight, or else human control over the machines might be
retained.
If the machines are permitted to make all their own decisions, we can't make any conjectures as
to the results, because it is impossible to guess how such machines might behave. We only point
out that the fate of the human race would be at the mercy of the machines. It might be argued
that the human race would never be foolish enough to hand over all the power to the machines.

But we are suggesting neither that the human race would voluntarily turn power over to the
machines nor that the machines would willfully seize power. What we do suggest is that the
human race might easily permit itself to drift into a position of such dependence on the
machines that it would have no practical choice but to accept all of the machines' decisions. As
society and the problems that face it become more and more complex and machines become
more and more intelligent, people will let machines make more of their decisions for them,
simply because machine-made decisions will bring better results than man-made ones.
Eventually a stage may be reached at which the decisions necessary to keep the system running
will be so complex that human beings will be incapable of making them intelligently. At that
stage the machines will be in effective control. People won't be able to just turn the machines
off, because they will be so dependent on them that turning them off would amount to suicide.
On the other hand it is possible that human control over the machines may be retained. In that
case the average man may have control over certain private machines of his own, such as his
car or his personal computer, but control over large systems of machines will be in the hands of
a tiny elite - just as it is today, but with two differences. Due to improved techniques the elite will
have greater control over the masses; and because human work will no longer be necessary the
masses will be superfluous, a useless burden on the system. If the elite is ruthless they may
simply decide to exterminate the mass of humanity. If they are humane they may use
propaganda or other psychological or biological techniques to reduce the birth rate until the
mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human
race. They will see to it that everyone's physical needs are satisfied, that all children are raised
under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him
busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his
"problem." Of course, life will be so purposeless that people will have to be biologically or
psychologically engineered either to remove their need for the power process or make them
"sublimate" their drive for power into some harmless hobby. These engineered human beings
may be happy in such a society, but they will most certainly not be free. They will have been
reduced to the status of domestic animals.

Thorstein Veblen and Technological Determinism (and others)


Neil Postman, Technopoly: The Surrender of Culture to Technology (1993)
For centuries, historians and philosophers have traced, and debated, technology's role in
shaping civilization. Some have made the case for what the sociologist Thorstein Veblen dubbed
"technological determinism." They've argued that technological progress, which they see as an
autonomous force outside man's control, has been the primary factor influencing the course of
human history. Karl Marx gave voice to this view when he wrote, "The windmill gives you society
with the feudal lord; the steam-mill, society with the industrial capitalist." Ralph Waldo Emerson
put it more crisply: "Things are in the saddle / And ride mankind." In the most extreme expression
of the determinist view, human beings become little more than "the sex organs of the machine
world," as McLuhan memorably wrote in the "Gadget Love" chapter of Understanding Media. Our
essential role is to produce ever more sophisticated tools, to "fecundate" machines as bees
fecundate plants, until technology has developed the capacity to reproduce itself on its own. At
that point, we become dispensable.

What technology wants


Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (2011)
Kevin Kelly, the founding editor of Wired, wrote perhaps the boldest book articulating the
technodeterminist view, What Technology Wants, in which he posits that technology is a
"seventh kingdom of life," a kind of meta-organism with desires and tendencies of its own. Kelly
believes that the technium, as he calls it, is more powerful than any of us mere humans.
Ultimately, technology, a force that wants to eat power and expand choice, will get what it
wants whether we want it to or not.
Technodeterminism is alluring and convenient for newly powerful entrepreneurs because it
absolves them of responsibility for what they do. Like priests at the altar, they're mere vessels of
a much larger force that it would be futile to resist. They need not concern themselves with the
effects of the systems they've created. But technology doesn't solve every problem of its own
accord. If it did, we wouldn't have millions of people starving to death in a world with an
oversupply of food.
Technodeterminists like to suggest that technology is inherently good. But despite what Kevin
Kelly says, technology is no more benevolent than a wrench or a screwdriver. It's only good when
people make it do good things and use it in good ways. Melvin Kranzberg, a professor who
studies the history of technology, put it best nearly thirty years ago, and his statement is now
known as Kranzberg's first law: "Technology is neither good nor bad, nor is it neutral."

Bill Joy and KMD


Bill Joy, WIRED magazine (2000)
The 21st-century technologies - genetics, nanotechnology, and robotics (GNR) - are so powerful
that they can spawn whole new classes of accidents and abuses. Most dangerously, for the first
time, these accidents and abuses are widely within the reach of individuals or small groups. They
will not require large facilities or rare raw materials. Knowledge alone will enable the use of
them. Thus we have the possibility not just of weapons of mass destruction but of knowledge-enabled mass destruction (KMD), this destructiveness hugely amplified by the power of self-replication. I think it is no exaggeration to say we are on the cusp of the further perfection of
extreme evil, an evil whose possibility spreads well beyond that which weapons of mass
destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of
extreme individuals.

Survival of the fittest


Bill Joy, WIRED magazine (2000)
Biological species almost never survive encounters with superior competitors. Ten million years
ago, South and North America were separated by a sunken Panama isthmus. South America, like
Australia today, was populated by marsupial mammals, including pouched equivalents of rats,
deer, and tigers. When the isthmus connecting North and South America rose, it took only a few
thousand years for the northern placental species, with slightly more effective metabolisms and
reproductive and nervous systems, to displace and eliminate almost all the southern marsupials.
In a completely free marketplace, superior robots would surely affect humans as North American
placentals affected South American marsupials (and as humans have affected countless
species). Robotic industries would compete vigorously among themselves for matter, energy,
and space, incidentally driving their price beyond human reach. Unable to afford the necessities
of life, biological humans would be squeezed out of existence.

In-class resources for Thu Jan 7 (class 27)


Is the Unabomber right?
Digerati video and reading
Digerati
Ted Kaczynski
https://en.wikipedia.org/wiki/Ted_Kaczynski
Kevin Kelly
@kevin2kelly
http://kk.org
Video
We will watch and discuss this in class.
The Unabomber's cabin (2:30)
https://www.youtube.com/watch?v=PomekV8jl54
Reading
We will read and discuss this in class.
The Unabomber Was Right
(Kevin Kelly, 2009)
http://kk.org/thetechnium/the-unabomber-w/
Links:
(on Canvas lesson page)
The Unabomber
Kevin Kelly
Bill Joy
Will technology become human
Ray Kurzweil

The last Sunday Story


Due Sun Jan 10 (before midnight)
Prompt
medium collection internet art

Homework for Tue Jan 12 (class 28)


Internet Art
Reading
Consider and mark-up the assigned reading, with pen and/or highlighter.
Excerpts: Internet art
Video
Watch the assigned video on Zaption and post a one-sentence response.
Aaron Koblin TED talk (18:14)
Watch at least 6 minutes of this video.
http://zapt.io/tk5gc6vz

Today's Self-portraits: Modern Art


Christine Rosen, The New Atlantis (2007)
For centuries, the rich and the powerful documented their existence and their status through
painted portraits. A marker of wealth and a bid for immortality, portraits offer intriguing hints
about the daily life of their subjects: professions, ambitions, attitudes, and, most importantly,
social standing. Such portraits, as German art historian Hans Belting has argued, can be
understood as "painted anthropology," with much to teach us, both intentionally and
unintentionally, about the culture in which they were created.
Self-portraits can be especially instructive. By showing the artist both as he sees his true self and
as he wishes to be seen, self-portraits can at once expose and obscure, clarify and distort. They
offer opportunities for both self-expression and self-seeking. They can display egotism and
modesty, self-aggrandizement and self-mockery.
Today, our self-portraits are democratic and digital; they are crafted from pixels rather than
paints. On social networking websites like MySpace and Facebook, our modern self-portraits
feature background music, carefully manipulated photographs, stream-of-consciousness
musings, and lists of our hobbies and friends. They are interactive, inviting viewers not merely to
look at, but also to respond to, the life portrayed online. We create them to find friendship, love,
and that ambiguous modern thing called connection. Like painters constantly retouching their
work, we alter, update, and tweak our online self-portraits; but as digital objects they are far
more ephemeral than oil on canvas. It is the timeless human desire for attention that emerges as
the dominant theme of these vast virtual galleries.

In-class resources for Tue Jan 12 (class 28)


Internet Art
Digerati video and reading
Digerati
Douglas Rushkoff
@rushkoff
BHSEC Q guest author (Spring 2015)
http://www.rushkoff.com
Video
We will watch and discuss this in class.
Douglas Rushkoff on Colbert Report (5:11)
http://thecolbertreport.cc.com/videos/eyzxx1/douglas-rushkoff
alternate video:
Douglas Rushkoff PSFK talk about Present Shock (14:36)
https://vimeo.com/65904419
Reading
We will read and discuss this in class.
Present Shock: When Everything Happens Now (or alternative)
(Douglas Rushkoff, 2014)
Links:
(on Canvas lesson page)
Internet art

Present Shock: When Everything Happens Now (or alternative)


Our New Society of Authorship
Will Richardson, Personal Learning Networks (2011)
No matter how you look at it, we are creating what author Douglas Rushkoff calls a "society of
authorship," where every teacher and every student (every person with access) will have the
ability to contribute ideas and experience to the larger body of knowledge that is the internet.
And in doing so, Rushkoff says, we will be writing the human story, in real time, together, a vision
that asks each of us to participate. The ability to easily publish text, pictures, and video is
changing the face of journalism and media as we know it as well. There is no better example
than coverage of the heartbreaking Indian Ocean earthquake and resulting tsunami that killed
upwards of 150,000 people just after Christmas in 2004 (or, for that matter, the horrible
devastation caused by Hurricane Katrina in New Orleans in the summer of 2005). Within minutes
of the event, links to gripping first-person accounts coupled with digital photos and video were
spreading throughout the blogosphere, providing the type of raw detail that usually wouldn't
appear in the media.
The Origins of Remix
Douglas Rushkoff, Program or Be Programmed: Ten Commands for the Digital Age (2010)
The net's bias toward collaboration has also yielded some terrific mass participatory projects,
from technologies such as the Firefox browser and Linux operating system to resources like
Wikipedia. As examples of collective activity, they demonstrate our ability to work together and
share the burden in order to share yet again in the tool we have gained. For many, it is a political
act and a personal triumph to participate in these noncommercial projects and to do so for
reasons other than money.
These experiences and tools have, in turn, engendered an online aesthetic that is itself based in
sharing and repurposing the output of others. As early as the 1920s, artists called the Dadaists
began cutting up text and putting it together in new ways. In the 1960s, writers and artists such
as William Burroughs and Brion Gysin were experimenting with the technique, physically cutting
up a newspaper or other text object into many pieces and then recombining them into new
forms. They saw it as a way to break through the hypnosis of traditional media and see beyond
its false imagery to the real messages and commands its controllers were trying to transmit to us
without our knowledge.
Digital technology has turned this technique from a fringe art form to a dominant aesthetic.
From the record scratching of a deejay to the cut-and-paste functions of the text editor, our
media is now characterized by co-opting, repurposing, remixing, and mashing-up. It's not simply
that a comic book becomes a movie that becomes a TV series, a game, and then a musical on
which new comic books are based. Although slowly mutating, that's still a single story or brand
moving through different possible incarnations. What we're in the midst of now is a mediaspace
where every creation is fodder for every other one.

Homework for Tue Jan 14 (class 29)


Jeopardy!
Reading
Consider and mark up the assigned reading, with pen and/or highlighter.
Online discussion
Respond to this prompt on Canvas:
How has this course made you reconsider how you live your digital life?
link

Jeopardy!

In-class resources for Tue Jan 14 (class 29)


Jeopardy!

We will play the game Jeopardy in class.

Homework for Tue Jan 19 (class 30)


Wrapping up
Online discussion
Respond to the post-course survey sent to you via email (Google Form):
Post-course survey
link

In-class resources for Tue Jan 19 (class 30)


Wrapping up
Video
We will watch and discuss this in class.
Web 3.0 (14:24)
https://vimeo.com/11529540
Links:
(on Canvas lesson page)
Misc.
Marshall McLuhan

Cyberwar
iWar
Johnny Ryan, A History of the Internet and the Digital Future (2010)
iWar can be waged by nations, corporations, communities, or by any reasonably tech-savvy
individual. This is a fundamental shift in the balance of offensive capability, empowering
individuals with the power to threaten the activities of governments and large corporations.
About Cyber War
Richard Clarke, CyberWar: The Next Threat to National Security (2010)
Cyber war is real. What the United States and other nations are capable of doing in a cyber war
could devastate a modern nation. Cyber war happens at the speed of light. As the photons of the
attack packets stream down fiber-optic cable, the time between the launch of an attack and its
effect is barely measurable, thus creating risks for crisis decision makers. Cyber war is global. In
any conflict, cyber attacks rapidly go global, as covertly acquired or hacked computers and
servers throughout the world are kicked into service. Cyber war skips the battlefield. Systems
that people rely upon, from banks to air defense radars, are accessible from cyberspace and can
be quickly taken over or knocked out without first defeating a country's traditional defenses.
Cyber war has begun. In anticipation of hostilities, nations are already preparing the
battlefield. They are hacking into each other's networks and infrastructures, laying in trapdoors
and logic bombs--now, in peacetime. This ongoing nature of cyber war, the blurring of peace and
war, adds a dangerous new dimension of instability.
iWar is for everyone
Johnny Ryan, A History of the Internet and the Digital Future (2010)
iWar, perhaps for the first time, is liberated from the expense and effort that traditionally inhibits
offensive action against geographically distant targets. Conventional destruction of targets by
kinetic means is enormously expensive and comparatively slow. A single B-2 Spirit stealth
bomber, which costs US $2.1 billion to develop and build, must fly from Whiteman Air Force base
in Missouri in order to drop ordnance on a target in Afghanistan. iWar, though it delivers far less
offensive impact, can inflict damage from any point on the globe at a target anywhere on the
globe at virtually no cost. For the same reason, iWar will proliferate quickly across the globe. It is
not limited by the geographical constraints that impeded the spread of earlier military
innovations. The proliferation of gunpowder in Europe puts this in perspective; appearing in
China in the seventh or eighth century, gunpowder finally made its European debut as late as
1314, first in Flanders, then in England seven years later, and in France five years after that. In
contrast, new tools and know-how necessary to wage iWar proliferate easily across the Internet.
iWar is deniable
Johnny Ryan, A History of the Internet and the Digital Future (2010)
iWar can be waged anonymously and is difficult to punish. iWar is deniable. Even if official
culpability could be proven, it is unclear how one state should respond to an iWar attack by
another. A criminal investigation would be no less problematic. If digital forensic investigation
could trace a malicious botnet back to the single computer that controlled a Denial of Service (DoS) attack, it is unlikely that effective action could be taken to prosecute. The culpable computer might be in another jurisdiction from which cooperation would not be forthcoming. If cooperation were forthcoming, the culpable computer might have been operated from an Internet café or at another anonymous public connectivity site, making it impossible to determine who among the many transient users was involved in a DoS attack that typically lasts only a short period of
time.
About Logic Bombs
Richard Clarke, CyberWar: The Next Threat to National Security (2010)
The idea of logic bombs is simple. In addition to leaving behind a trapdoor in a network so you
can get back in easily, without setting off alarms and without needing an account, cyber warriors

often leave behind a logic bomb so they don't have to take the time to upload it later on when they need to use it. A logic bomb in its most basic form is simply an eraser: it erases all the software on a computer, leaving it a useless hunk of metal. More advanced logic bombs could first order hardware to do something to damage itself, like ordering an electric grid to produce a surge that fries circuits in transformers, or causing an aircraft's control surfaces to go into the dive position. Then it erases everything, including itself.
The CIA has used logic bombs before
Richard Clarke, CyberWar: The Next Threat to National Security (2010)
Americas national security agencies are now getting worried about logic bombs, since they
seem to have found them all over our electric grid. There is a certain irony here, in that the U.S.
military invented this form of warfare. One of the first logic bombs, and possibly the first
incidence of cyber war, occurred before there even really was much of an Internet. In the early
1980s, the Soviet leadership gave their intelligence agency, the KGB, a shopping list of Western
technologies they wanted their spies to steal for them. A KGB agent who had access to the list
decided he would rather spend the rest of his days sipping wine in a Paris café than freezing in
Stalingrad, so he turned the list over to the French intelligence service in exchange for a new life
in France. France, which was part of the Western alliance, gave it to the U.S. Unaware that
Western intelligence had the list, the KGB kept working its way down, stealing technologies from
a host of foreign companies. Once the French gave the list to the CIA, President Reagan gave the agency the okay to help the Soviets with their technology needs, with a catch. The CIA started a massive
program to ensure that the Soviets were able to steal the technologies they needed, but the CIA
introduced a series of minor errors into the designs for things like stealth fighters and space
weapons.
Another way of thinking about "cyberwar"
Christopher Ford, The New Atlantis (2010)
When Americans speak of "cyberwar," we tend to think of lines of malicious code being sent from one computer to another, generally via the Internet, in order to cause some kind of mischief: say, taking down a power grid, or crashing the control systems for an air-defense network. But in fact, the line between computer-on-computer attack and other forms of electronic assault is quite
fuzzy, and future cyber conflicts between sophisticated players may see wildly different means
and ends that we cannot now predict. An alternative way of discussing cyberwar is in terms not
of technology but of influence. In U.S. military doctrine, information warfare or information
operations (IO) are somewhat separate from cyber conflict. Information operations in time of
conflict include psychological operations, such as deception and perception management;
familiar examples from the twentieth century include dropping leaflets from airplanes, running
strategic misdirection operations, and broadcasting propaganda. Recently, this category has
broadened to include even such activities as giving interviews to the press or writing opinion
pieces for newspaper publication, as well as protection and assurance activities directed at
preserving the integrity and availability of one's own information. In the American understanding
of the terms, therefore, not all IO is cyber in nature, but the two can overlap: cyber attacks can
be used as a tool for accomplishing IO goals. For example, a combatant might hack into an
adversary's systems to plant false data or stories intended to sow fear or confusion.

The Internet as an Accelerant: Unfiltered and Untamed


William Davidow, Overconnected: The Promise and Threat of the Internet (2011)
Of course, the Internet did not cause the global economic crisis; cheap money, lax regulations,
and unchecked avarice did. What the Internet did was act as an accelerant, spreading
information very quickly. It was gasoline on the flames. A crisis of this dimension would not
have been possible without a very efficient, fast, cheap, and reliable information transportation
system. Across the worldwide digital spread, things go viral at lightning speed. And people were
carried away in a competitive, greedy fervor of their own creation.
A world without borders
William Davidow, Overconnected: The Promise and Threat of the Internet (2011)
In an overconnected world, the interdependencies spawned by the Internet let
problems grow and spread so that the span of government controls, of checks and
balances normally built into a system, no longer matches the domain of the problem.
Identity thieves operating in less developed countries and beyond the reach of law enforcement
agencies steal money from citizens in developed ones. Pornographic material that is legal in
California circulates in Tennessee, where it is against the law. Internet gambling casinos, legal in
Great Britain, collect wagers from Texas, where the practice is illegal.
A sad fact of life: It's actually smart to be mean online
Clive Thompson, WIRED (2014)
I'm generally upbeat on Twitter. Many of my posts are enthusiastic blurts about science or
research in which I use way too many exclamation points!! But I've noticed something:
When I post an acerbic or cranky tweet, it gets recirculated far more widely than
do my cheerier notes. People like it fine when I'm genial, but when I make a caustic joke
or cutting comment? Social media gold. This is pure anecdata, of course. Still, it made me wonder if there was any psychological machinery at work here. Is there a reason that purse-lipped opinions would outcompete generous ones?
Indeed, there is. It's called hypercriticism. When we hear negative statements, we think
they're inherently more intelligent than positive ones. Teresa Amabile, director of research
for Harvard Business School, began exploring this back in the 1980s. She took a group of 55
students, roughly half men, half women, and showed them excerpts from two book reviews
printed in an issue of The New York Times. The same reviewer wrote both, but Amabile
anonymized them and tweaked the language to produce two versions of each: one positive, one negative. Then she asked the students to evaluate the reviewer's intelligence. The verdict was clear: "The students thought the negative author was smarter than the positive one, by a lot," Amabile tells me. Most said the nastier critic was more competent. Granted, being negative wasn't all upside; they also rated the harsh reviewer as "less warm and more cruel, not as nice," she says. But definitely smarter. Like my mordant tweets, presumably.
This so-called negativity bias works both ways, it seems. Other studies show that when we
seek to impress someone with our massive gray matter, we spout sour and negative
opinions. In a follow-up experiment, Bryan Gibson, a psychologist at Central Michigan
University, took a group of 117 students (about two-thirds female) and had them watch a
short movie and write a review that they would then show to a partner. Gibson's team told
some of the reviewers to try to make their partner feel warmly toward them; others were
told to try to appear smart. You guessed it: Those who were trying to seem brainy went
significantly more negative than those trying to be endearing.
Why does this bias exist? No one really knows, though some theorists speculate it's
evolutionary. In the ancestral environment, focusing on bad news helped you survive.
Like I said, this is based on anecdata, and you can't easily generalize about why things go
viral in the roiling, wine-dark sea of social media. Some utterly saccharine posts get wildly
liked; certain smartly critical thoughts are loathed. (Compare the rollicking success of the

feel-good site Upworthy to the abuse directed at women and minorities who write intelligent
criticism.) And what's negative? Is a manifesto for social change negative because it
criticizes the status quo or positive because it's idealistic?
But knowing about negativity bias has made me more skeptical of high-brow punditry that
defaults to dour views. If caustic wit is what garners a person whooping accolades for their
intelligence, surely public intellectuals adjust their approach accordingly.
Gibson told me that his study hadn't been cited or followed up on much by other
researchers. "Maybe you weren't negative enough?" I asked. He laughed: "I guess so."

A Day in the Life of POTUS


Richard Clarke, CyberWar: The Next Threat to National Security (2010)
Obviously, we have not had a full-scale cyber war yet, but we have a good idea what it would
look like if we were on the receiving end. Imagine a day in the near future. You are the Assistant
to the President for Homeland Security and you get a call from the White House Situation Room
as you are packing up to leave the office for the day, at eight p.m. NSA has issued a CRITIC
message, a rare alert that something important has just happened. The one-line message says
only: "large scale movement of several different zero day malware programs moving on Internet in the US, affecting critical infrastructure." The Situation Room's Senior Duty Officer suggests
that you come down and help him figure out what is going on.
By the time you get to the Situation Room, the Director of the Defense Information Systems
Agency is waiting on the secure phone for you. He has just briefed the Secretary of Defense, who
suggested he call you. The unclassified Department of Defense network known as the NIPRNET is
collapsing. Large-scale routers throughout the network are failing, and constantly rebooting.
Network traffic is essentially halted. As he is telling you this, you can hear someone in the
background trying to get his attention. When the general comes back on the line, he says softly
and without emotion, "Now it's happening on the SIPRNET and JWICS, too." He means that DoD's classified networks are grinding to a halt.
Unaware of what is happening across the river at the Pentagon, the Undersecretary of Homeland
Security has called the White House, urgently needing to speak to you. FEMA, the Federal
Emergency Management Agency, has told him that two of its regional offices, in Philadelphia and
in Denton, Texas, have reported large refinery fires and explosions in Philadelphia and Houston,
as well as lethal clouds of chlorine gas being released from several chemical plants in New Jersey
and Delaware. He adds that the U.S. Computer Emergency Response Team in Pittsburgh is being
deluged with reports of systems failing, but he hasnt had time to get the details yet.
Before you can ask the Senior Duty Officer where the President is, another officer thrusts a
phone at you. It's the Deputy Secretary of Transportation. "Are we under attack?" she asks. When you ask why, she ticks off what has happened. The Federal Aviation Administration's
National Air Traffic Control Center in Herndon, Virginia, has experienced a total collapse of its
systems. The alternate center in Leesburg is in a complete panic because it and several other
regional centers cannot see what aircraft are aloft and are trying to manually identify and
separate hundreds of aircraft. Brickyard, the Indianapolis Center, has already reported a midair
collision of two 737s. "I thought it was just an FAA crisis, but then the train wrecks started happening," she explains. The Federal Railroad Administration has been told of major freight
derailments in Long Beach, Norfolk, Chicago, and Kansas City. Looking at the status board for the
location of the President, you see it says only "Washington-OTR." He is on an "off the record," or personal, activity outside the White House. Reading your mind, the Senior Duty Officer explains that the President has taken the First Lady to a hip new restaurant in Georgetown. "Then put me through to the head of his Secret Service detail," says a breathless voice. It's the Secretary of the Treasury, who has run from his office in the building next to the White House. "The Chairman of the Fed just called. Their data centers and their backups have had some sort of major disaster. They have lost all their data. It's affecting the data centers at DTCC and SIAC; they're going down, too." He explains that those initials represent important financial computer centers in New York. "Nobody will know who owns what. The entire financial system will dissolve by morning."
As he says that, your eyes are drawn to a television screen reporting on a derailment on the Washington Metro in a tunnel under the Potomac. Another screen shows a raging flame in the Virginia
suburbs where a major gas pipeline has exploded. Then the lights in the Situation Room flicker.
Then they go out. Battery-operated emergency spotlights come on, casting the room in shadows
and bright light. The television flat screens and the computer monitors have gone blank. The
lights flicker again and come back on, as do some of the screens. There is a distant, loud
droning. "It's the backup generator, sir," the Duty Officer says. His deputy again hands you a secure phone and mouths the words you did not want to hear: "It's for you. It's POTUS."

The President is in the Beast, his giant armored vehicle that resembles a Cadillac on steroids, on
his way back from the restaurant. The Secret Service pulled him out of the restaurant when the
blackout hit, but they are having a hard time getting through the traffic. Washingtons streets are
filled with car wrecks because the signal lights are all out. POTUS wants to know if it's true what his Secret Service agent told him, that the blackout is covering the entire eastern half of the country. No, wait, what? Now they're saying that the Vice President's detail says it's out where he is, too. Isn't he in San Francisco today? What time is it there?
You look at your watch. It's now 8:15 p.m. Within a quarter of an hour, 157 major metropolitan
areas have been thrown into knots by a nationwide power blackout hitting during rush hour.
Poison gas clouds are wafting toward Wilmington and Houston. Refineries are burning up oil
supplies in several cities. Subways have crashed in New York, Oakland, Washington, and Los
Angeles. Freight trains have derailed outside major junctions and marshaling yards on four major
railroads. Aircraft are literally falling out of the sky as a result of midair collisions across the
country. Pipelines carrying natural gas to the Northeast have exploded, leaving millions in the
cold. The financial system has also frozen solid because of terabytes of information at data
centers being wiped out. Weather, navigation, and communications satellites are spinning out of
their orbits into space. And the U.S. military is a series of isolated units, struggling to
communicate with each other.
Several thousand Americans have already died, multiples of that number are injured and trying
to get to hospitals. There is more going on, but the people who should be reporting to you cant
get through. In the days ahead, cities will run out of food because of the train-system failures
and the jumbling of data at trucking and distribution centers. Power will not come back up
because nuclear plants have gone into secure lockdown and many conventional plants have had
their generators permanently damaged. High-tension transmission lines on several key routes
have caught fire and melted. Unable to get cash from ATMs or bank branches, some Americans
will begin to loot stores. Police and emergency services will be overwhelmed.
In all the wars America has fought, no nation has ever done this kind of damage to our cities. A
sophisticated cyber war attack by one of several nation-states could do that today, in fifteen
minutes, without a single terrorist or soldier ever appearing in this country.
Shaping the internet age
Bill Gates, Chairman and Chief Software Architect, Microsoft Corp. (2000)
Opportunities and Challenges. Whenever a new technology emerges with the potential to
change the way people live and work, it sparks lively debate about its impact on our world and
concern over how widely it should be adopted. Some people will view the technology with
tremendous optimism, while others will view it as threatening and disruptive. When the
telephone was first introduced, many critics thought it would disrupt society, dissolve
communities, erode privacy, and encourage selfish, destructive behavior. Others thought the
telephone was a liberating and democratizing force that would create new business
opportunities and bring society closer together.
The Internet brings many of these arguments back to life. Some optimists view the Internet as
humanity's greatest invention--an invention on the scale of the printing press. They believe the
Internet will bring about unprecedented economic and political empowerment, richer
communication between people, a cultural renaissance, and a new era of economic prosperity
and world peace. At the other extreme, pessimists think the Internet will result in economic and
cultural exploitation, the death of privacy, and a decline in values and social standards.
If history is any guide, neither side of these arguments will be proved right. Just as the
telephone, electricity, the automobile, and the airplane shaped our world in the 20th century, the
Internet will shape the early years of the 21st, and it will have a profound--and overwhelmingly
positive--impact on the way we work and live. But it will not change the fundamental aspects of
business and society--companies will still need to make a profit, people will still need their social
framework, education will still require great teachers.

However, the current debate over how widely we should adopt this technology does raise some
serious issues that must be addressed to make the most of the Internet's vast potential.
Protecting intellectual property. The Internet makes it possible to distribute any kind of
digital information, from software to books, music, and video, instantly and at virtually no cost.
The software industry has struggled with piracy since the advent of the personal computer, but
as recent controversy over file-sharing systems such as Napster and Gnutella demonstrates,
piracy is now a serious issue for any individual or business that wants to be compensated for the
works they create. And since the Internet knows no borders, piracy is now a serious global
problem. Strong legislation such as the Digital Millennium Copyright Act (DMCA), cooperation
between nations to ensure strong enforcement of international copyright law and innovative
collaboration between content producers and the technology industry have already made an
impact on addressing this problem. As more and more digital media becomes easy to distribute
over the Internet, the government and private sector must work together to find appropriate
ways to protect the rights of information consumers and producers around the world.
Regulating global commerce. How can we regulate Internet commerce--or should we do it at
all? Because the Internet offers people an easy way to purchase goods and services across state
and national borders--generating tremendous economic growth in the process--it makes global
commerce even more challenging to tax or regulate effectively. But since the Internet's economic
effects result largely from the "friction-free" commerce it enables, any regulation that gets in the
way comes at a price: lost economic growth. As more and more business transactions take place
on the Internet, governments and businesses must cooperate to find innovative ways to regulate
and derive tax revenue from Internet commerce without interfering with the economic benefits it
can provide.
Protecting individual privacy. In the coming years, people will increasingly rely on the
Internet to share sensitive information with trusted parties about their finances, medical history,
personal habits, and buying preferences. At the same time, many will wish to safeguard this
information, and use the Internet anonymously. Although technology has placed individual
privacy at risk for decades--most consumers regularly use credit cards and exchange sensitive
information with merchants over the telephone--privacy will become a far more pressing issue as
the Internet becomes the primary way for people to manage their finances or keep in touch with
their physician. The use of personal information by retailers wishing to provide personalized
service and advertisers that want to target very specific audiences--some of whom have resorted
to gathering information from consumers without notifying them--has greatly increased public
concern over the safety of personal information. It has also left many people reluctant to trust
the Internet with their data.
Keeping the Internet secure. Security has always been a major issue for businesses and
governments that rely on information technology, and it always will be. Much the same is true for
individual security--long before the Internet, people were happily handing their credit cards to
restaurant waiters they had never met before, and that too is unlikely to change. But as our
economy increasingly depends on the Internet, security is of even greater concern. Widely
publicized incidents of Web site hacking, credit card fraud and identity theft have given the
Internet a largely unjustified "Wild West" reputation. In order to keep the Internet a safe place to
do business, software companies have a responsibility to work together to ensure that their
products always offer the highest levels of security. And the judicial system and the law
enforcement community must keep pace with technological advancements and enforce criminal
laws effectively and thoroughly.
Protecting our children. The Internet can revolutionize education, giving children the
opportunity to indulge their intellectual curiosity and explore their world. But while it helps them
to learn about dinosaurs or world history, it can also expose them to obscene, violent or
inappropriate content. And since the Internet is an unregulated global medium, it is hard to
"censor" in any traditional way. The private sector has already made great strides in giving
parents and teachers more control over what children can see and do on the Internet, through
filtering software that blocks access to objectionable Web sites; industry standards such as the
still-evolving Platform for Internet Content Selection (PICS) that enable helpful rating systems;
and Internet Service Providers (ISPs) that voluntarily regulate the activities of their customers.

Government has also played a part, encouraging the growth of the market for child-safety tools,
and increasing law enforcement's role in policing and prosecuting online predators. So far, the
issue of protecting children on the Internet has served as an excellent example of how
governments and the private sector can work together to tackle problems on the Internet.
Bridging the "digital divide." The Internet can empower and enrich the lives of
disadvantaged people around the world--but only if they have access to it. In the 1930s, the
United States government helped bridge the "electrical divide" by forming the Rural
Electrification Administration, which brought power to rural areas that could benefit most from
electrification. Similarly, "universal service" programs have helped some remote areas and
disadvantaged communities have access to inexpensive telephone service. The benefits of
widespread access to the Internet and communications technology are clear enough that
governments now need to decide whether a similar principle should be applied to ensure that
nobody is left behind in the Internet Age.
What is government's role? The Internet is a constantly changing global network that knows no
borders, presenting a unique problem for governments that need to address the many
challenges it presents. In the coming years, governments will have the opportunity to develop
thoughtful and innovative approaches to policies that protect their citizens while nurturing the
openness, flexibility, and economic opportunities that make the Internet such a compelling
technology.
The light hand of government regulation has created an environment that has encouraged the
Internet to flourish, and enabled companies to bring their innovations to consumers at
breathtaking speed. Over the next few years, governments worldwide will find it rewarding to
pursue policies that speed the building of the infrastructure that will make it possible to bring the
benefits of the Internet to more people. This includes finding ways to speed the implementation
of broadband technologies, deregulate where necessary to stimulate competition, resist the
temptation to enact new regulations, and redouble our efforts to protect content on the Internet
by strengthening and enforcing intellectual-property rights.

A world where the market for news, entertainment and information has been perfected
Cass Sunstein, Republic.com 2.0 (2008)
It is some time in the future. Technology has greatly increased people's ability to filter what
they want to read, see, and hear. With the aid of the Internet, you are able to design your own
newspapers and magazines. You can choose your own programming, with movies, game shows,
sports, shopping, and news of your choice. You mix and match.
You need not come across topics and views that you have not sought out. Without any difficulty,
you are able to see exactly what you want to see, no more and no less. You can easily find out
what people like you tend to like and dislike. You avoid what they dislike. You take a close look
at what they like.
Maybe you want to focus on sports all the time, and to avoid anything dealing with business or
government. It is easy to do exactly that. Maybe you choose replays of your favorite tennis
matches in the early evening, live baseball from New York at night, and professional football on
the weekends. If you hate sports and want to learn about the Middle East in the evening from the
perspective you find most congenial, you can do that too. If you care only about the United
States and want to avoid international issues entirely, you can restrict yourself to material
involving the United States. So too if you care only about Paris, or London, or Chicago, or Berlin,
or Cape Town, or Beijing, or your hometown.
Perhaps you have no interest at all in news. Maybe you find news impossibly boring. If so,
you need not see it at all. Maybe you select programs and stories involving only music and
weather. Or perhaps your interests are more specialized still, concentrating on opera, or
Beethoven, or Bob Dylan, or modern dance, or some subset of one or more of the above. (Maybe
you like early Dylan and hate late Dylan.)

If you are interested in politics, you may want to restrict yourself to certain points of view by
hearing only from people with whom you agree. In designing your preferred newspaper, you
choose among conservatives, moderates, liberals, vegetarians, the religious right, and socialists.
You have your favorite columnists and bloggers; perhaps you want to hear from them and from
no one else. Maybe you know that you have a bias, or at least a distinctive set of tastes, and you
want to hear from people with that bias or that taste. If so, that is entirely feasible. Or perhaps
you are interested in only a few topics. If you believe that the most serious problem is gun
control, or climate change, or terrorism, or ethnic and religious tension, or the latest war, you
might spend most of your time reading about that problem, if you wish, from the point of view
that you like best.
Of course everyone else has the same freedom that you do. Many people choose to avoid news
altogether. Many people restrict themselves to their own preferred points of view: liberals
watching and reading mostly or only liberals; moderates, moderates; conservatives,
conservatives; neo-Nazis or terrorist sympathizers, neo-Nazis or terrorist sympathizers. People in
different states and in different countries make predictably different choices. The citizens of Utah
see and hear different topics, and different ideas, from the citizens of Massachusetts. The
citizens of France see and hear entirely different perspectives from the citizens of China and the
United States. And because it is so easy to learn about the choices of people like you,
countless people make the same choices that are made by others like them.
The resulting divisions run along many lines: of religion, ethnicity, nationality, wealth, age,
political conviction, and more. People who consider themselves left-of-center make very different
selections from those made by people who consider themselves right-of-center. Most whites
avoid news and entertainment options designed for African Americans. Many African Americans
focus largely on options specifically designed for them. So too with Hispanics. With the reduced
importance of the general-interest magazine and newspaper and the flowering of individual
programming design, different groups make fundamentally different choices.
The market for news, entertainment, and information has finally been perfected. Consumers are
able to see exactly what they want. When the power to filter is unlimited, people can decide, in
advance and with perfect accuracy, what they will and will not encounter. They can design
something very much like a communications universe of their own choosing. And if they have
trouble designing it, it can be designed for them, again with perfect accuracy.