
Curriculum: Understanding YouTube & Digital Citizenship

Lesson 6
Overview
We have devised an interactive curriculum aimed to support teachers of secondary
students (approximately ages 13-17). The curriculum helps educate students on topics
like:

YouTube's policies
How to report content on YouTube
How to protect their privacy online
How to be responsible YouTube community members
How to be responsible digital citizens

We hope that students and educators gain useful skills and a holistic understanding
of responsible digital citizenship, not only on YouTube but across all online activity.
Lessons
Below is a list of lessons in the recommended order of delivery. Lessons are designed
to fit within 50-minute classes, but can be adapted to fit your schedule:
1. What Makes YouTube Unique - basic facts and figures (40 minutes) - Teacher's Guide, Slides
2. Detecting Lies (35 minutes) - Teacher's Guide, Slides
3. Safety Mode (5 minutes) - Teacher's Guide, Slides
4. Online Reputation and Cyberbullying (45 minutes) - Teacher's Guide, Slides
5. Policy - The Community Guidelines (30 minutes) - Teacher's Guide, Slides
6. Reporting Content - Flagging (20 minutes) - Teacher's Guide, Slides
7. Privacy Part 1 (40 minutes) - Teacher's Guide, Slides
8. Privacy Part 2 (50 minutes) - Teacher's Guide, Slides
9. Copyright (40 minutes) - Teacher's Guide, Slides
10. Additional Resources/Appendix, including parent resources - Teacher's Guide, Slides

You can download the full Teacher's Guide here.
Lesson 6. Reporting - Flagging


Time: 20 minutes
Lesson objective: Understand the flagging process, when to flag content on YouTube, and how YouTube enforces policy.
Standards/competencies: Utilising reporting mechanisms on YouTube
Materials/preparation: Videos loaded; projector
Instruction
In addition to having guidelines for the community (Lesson 5), there are also
processes in place for reporting inappropriate content.
The next three lessons are specific to reporting content on YouTube. We realise
that not everyone knows how to report content in the most effective way, so we
want to review the reporting flows and hopefully answer some common questions
along the way.
First up is flagging. Page 16 (or page 1 of the individual lesson) is designed to give
an explanation of what flagging is and to give a visual representation of what it
looks like when you try to flag a video on YouTube.
What is flagging?
If a user feels that a video contains inappropriate content and violates the
Community Guidelines, they can flag or report it and alert YouTube. Flagging
empowers the YouTube community to bring potential violations of our Community
Guidelines to our attention in the most efficient manner. Flagged videos are
reviewed by the YouTube team 24 hours a day, 7 days a week.

Demonstration
Explain to students how people can flag:
1. You need to be signed in to flag
2. The flag icon is displayed under the play bar of each video
3. Once you select the flag, a drop-down menu will appear listing reasons users
may wish to select for flagging the video, e.g. violent or repulsive content.
4. Under most of these categories there are further sub-categories that the
user can choose in order to be even more precise, or they can flag simply
under the first drop-down menu.
5. Once you select your reason, a note appears saying thank you and that your
flag has been submitted for review. This confirms that you have submitted
correctly.
6. Just because something is flagged does not mean it will automatically be
removed from the site. The YouTube team review flagged videos 24 hours a
day, 7 days a week, and only once a video is reviewed will a decision be made
as to whether it should be removed from the site or not.
7. When a flagged video is reviewed by the YouTube team, they check the video
for any violations of the Community Guidelines e.g. nudity, violence, hate
speech.
8. The uploader will not know if you flag their video. A common misconception!
9. Note that the last option, "Infringes my rights", is not a flag itself; instead it
redirects users to report issues like copyright and privacy via our additional
reporting flows in the Help Centre.
Activity 3 minutes
We recommend that you ask a class volunteer to flag a video in front of the class.
(Don't worry if the video you flag has no obvious violation; it simply won't come
down when it is reviewed!)
Try flagging this one as an example; it is linked directly in the Activity box under
the video:
http://www.youtube.com/watch?v=XbqzgDnfMsE
Page 17 (or page 2 of the individual lesson) features definitions of our flagging
categories; it helps to explain what we mean by our flagging terms. You can use
these definitions as a reference point to answer the four questions in the Activity
box.
Activity 7 minutes
Four questions for the class to answer on analyzing content online.

Remember to emphasize that this information can be relevant to many other
online interactions. Among the class group, we want the students to think, within
the realm of video content, about what context might mean, and to try to come up
with some examples.
We also think it's important for them to think about the difference between hate
speech and harassment, as people can get confused about this from time to time.
Lastly, what do they think the term graphic means?
Notes
1. Context is very important. YouTube takes into account issues such as whether
content is of educational, documentary, scientific or artistic value. Just because
a video may contain some violence does not mean that it will automatically come
down. Example: the conflicts in parts of the Middle East and North Africa brought
a lot of eyewitness content straight from the streets. Citizens became journalists
in their own right and were in a position to share with the rest of the world the
reality of the situation on the ground, even if some of this was violent and very
graphic. We remain a platform for freedom of expression.
2. Harassment refers to ridiculing or bullying another person, e.g. calling them
insulting or hurtful names, or deliberately uploading images of them when it is
clear that the intent is to embarrass or hurt them. Hate speech has a very
specific meaning and refers to hateful content directed towards a protected
group or a member of a protected group. YouTube respects people's right to
express unpopular or offensive opinions, but we draw the line at hate speech.
It's not OK, for instance, to demean or insult people because of their race or
sexual orientation.
3. When we refer to graphic content, we mean clear or explicit footage. For
instance, focusing or zooming in on injuries or other content likely to disturb
users.
Move on to the next page. Here, you will find a note about age-restricted content.
Begin by asking the class how many of them have ever seen the interstitial
featured on the page.
An age restriction can be applied where a video has been flagged by the
community and reviewed by the YouTube team, and a decision has been made
that even though the content does not violate our Community Guidelines, it may be
inappropriate for younger users (under 18). As such, a restriction is placed on the
video and only account holders over the age of 18 can watch it.
Page 19 (or page 4 of the individual lesson) shows two visual examples of the type
of content that could fall within this category. The point on context from the Notes
section on flagging is also relevant to consider here. Other examples of content
that may be age-restricted are:

Nudity in an artistic context, such as a theatre performance
Violence that occurs as part of a global event
Depicting potentially dangerous acts that could be mimicked
