

Protecting children from internet pornography?
A critical assessment of statutory age verification and its enforcement in the UK

Majid Yar
Lancaster University Law School, Lancaster University, Lancaster, UK

Received 17 July 2019
Revised 10 September 2019
Accepted 23 September 2019

Abstract
Purpose – The purpose of this paper is to critically assess the newly created regulatory and policing regime
for age-restricting access to pornography in the UK.
Design/methodology/approach – It examines the pivotal legislation, policy and strategy documents,
consultation submissions and interventions from a range of stakeholders such as children’s charities, content
providers and privacy advocates.
Findings – Even before its implementation, the regulatory regime betrays serious flaws and shortcomings in
its framing and configuration. These difficulties include its inability to significantly curtail minors’ access to
online pornography and risks of privacy violations and associated harms to legitimate users’ interests.
Research limitations/implications – Remedial measures are available so as to address some of the
problems identified. However, it is argued that ultimately the attempt to prohibit minors from accessing such
content is set to fail, and that alternative approaches – such as better equipping children through education to
cope with explicit materials online – need to be given greater prominence.
Originality/value – This paper provides the first criminological policy analysis of this latest attempt to
regulate and police online behaviour, and offers an important critical response to such efforts.
Keywords BBFC, Minors, Age verification, Digital Economy Act (2017), Internet pornography,
Privacy rights
Paper type Research paper

Introduction
Concerns about the harmful impacts of internet pornography – both licit and illicit – have
been and continue to be a significant issue in public consciousness, media coverage, policy
discourse and law enforcement. The policing of the internet – in all its variety, involving a
plurality of public and private actors, organisations and individuals – has thus been
imbricated with the policing of sexual representation and interaction. While the most
obvious instance of this focus has been provided by the online circulation of prohibited
imagery (related to obscenity, sexual violence, child sexual abuse and most recently
non-consensual sharing of intimate media), a secondary concern has coalesced around
minors’ exposure and access to age-restricted sexual content. In other words, there is a
concern about the free online availability and presence of explicit sexual content that, in
other media channels, would not be available to minors – indeed, it would be an offence to
supply such material to those under the age of majority. In order to address this anomaly,
and the concerns it raises for the well-being of children and young people, the UK
government introduced in the 2017 Digital Economy Act (DEA) provisions requiring
commercial pornography websites to institute mandatory and rigorous age-verification
processes, such that no one under the age of 18 will be able to access or view the material
offered by those sites. Alongside this requirement, the Act made provision for the creation of
a statutory “age verification regulator” that would oversee the pornography industry’s
implementation of the new rules, monitor and investigate non-compliance, and deploy a
range of punitive sanctions against those content providers that persist in breaking the law.
The measures have elicited controversy and implementation has been twice delayed, with
the new requirements now set to take effect in early 2020.
This paper critically assesses the regulatory and policing regime for age-restricting access
to pornography in the UK, arguing that (even before its implementation) it betrays serious
flaws and shortcomings in its framing and configuration. The paper is divided into three
sections. The first briefly outlines the perceived problem that needs addressing, i.e. the
kinds of harms that are attributed to minors’ exposure to sexually explicit online content. It is
this problem perception that has in significant part driven the political will to legislate on this
issue. The second section maps the ways that the DEA frames a “solution” to the problem, in
the form of age-verification requirements and arrangements for their implementation and
subsequent policing/enforcement. The third section offers a critical appraisal of these
measures, arguing that there are serious problems with the “solution” that they deploy,
difficulties that threaten not only a failure of the regime to actually curtail minors’ access to
the content in question, but also carry notable risks for legitimate users’ privacy rights.

The problem
Recent discussions of sexuality and media have identified what has been called a
“pornification” of culture and everyday life (Paasonen et al., 2007). This entails the increased
production and consumption of pornography; the dispersal of a “porn aesthetic” in
mainstream popular culture, TV, music, fashion, etc.; and the normalisation of pornography
and the attenuation of taboos around it (with the exception of very specific categories of
sexual representation e.g. those depicting/involving minors, those entailing sexual violence,
or those making and/or sharing explicit sexual material without the consent or against the
wishes of those depicted). The internet has been a significant driver or contributor to this
process, with unprecedented amounts and varieties of sexually explicit images and video
recordings becoming readily available to consumers, often free of charge and enjoyable in
private. While extravagant, and widely disparate, claims are often made about just how
prevalent pornography is on the internet, more careful and scholarly estimations suggest
that somewhere between four and 13 per cent of web searches involve pornography, the
range reflecting in part national and regional variations (Ogas and Gaddam, 2012). With
specific reference to children and young people, an estimated 1.4m under-18s in the UK
accessed sites containing pornography using a desktop computer in May 2015 (DCMS, 2018,
p. 5). Across Europe, 20 per cent of those aged 11–16 report having “seen sexual images
online” (Livingstone et al., 2014, p. 20), although other studies suggest significantly higher
rates of exposure. A review of the research literature suggests that while unintentional
“exposure” is more frequent than deliberate “access” to such material, a significant
proportion of those young people engaging with online pornography do so intentionally,
especially amongst males (Horvath et al., 2013, p. 20). Additionally, it suggests that
frequency of exposure/access increases with age, and has become more prevalent over time,
perhaps reflecting the expansion of the internet and its accessibility via a greater range of
channels and devices (Horvath et al., 2013; Flood, 2009, p. 388).
These developments have been associated by many (including policy makers,
legislators and child welfare activists) with a range of harmful and unwelcome outcomes.
These are especially pronounced when the users, viewers or consumers of pornography
are children and young people. The supposed individual and social consequences of such
consumption include:
(1) Children and young people may find exposure to explicit imagery disturbing or
upsetting, especially for those who are ill-equipped by virtue of age to have an
appropriate understanding of sex and sexuality. This may be particularly so when
the acts depicted are “unconventional” or fetishist in character (such as penetration
of the genitals and anus with various objects, group intercourse, bondage and
sadomasochism, urination and so on). Survey studies in a number of countries
suggest that at least a significant minority of young people, when exposed to such
content, react with feelings of “shock”, “disgust”, “embarrassment”, “repulsion” and
the like (Flood, 2009, p. 390; Sabina et al., 2008, p. 691).
(2) It has been argued that minors are increasingly using pornography as an ersatz
form of sex education, especially in contexts where cultural and political sensitivities
lead to the absence of formal sex education programmes in schools and other
settings (Zillmann, 2000). Reliance upon pornography in this manner gives young
people distorted understandings of sexual relationships and shapes their
expectations in an unhelpful manner (Wallmyr and Welin, 2006). Thus, exposure
to pornography has been associated with “sexual callousness” and a normalisation
of coercive and aggressive sexual behaviours (especially amongst boys and young
men) (Zillmann, 2000). Conversely, consumption of pornography may generate a
culture of expectations that requires girls and young women in particular to submit
to male expectations of desirable sexual behaviour that are derived from such
scenarios and fantasies (Häggström-Nordin et al., 2006, p. 388).
(3) The kinds of sexual practices depicted in pornography lead young people to pursue
these in their own explorations of sex, with potentially harmful impacts on mental
and physical well-being, e.g. engaging in penetrative intercourse without condoms.
Reviewing research findings about the effects of pornography on sexual behaviour,
Sinković et al. (2013, p. 633) note that use of sexually explicit materials (SEM) “has
been linked to a higher number of sexual partners and substance abuse at sexual
encounters in both adolescent women and men […] and to a lower likelihood of
condom use”.
(4) Exposure to pornography may heighten young people’s insecurities and
dissatisfactions about their bodies, based upon the socially atypical aesthetics and
body contouring associated with the porn aesthetic, e.g. surgical enhancement of
breasts and lips and “toned” fat-free bodies amongst female performers, and
muscularity and large penis size amongst their male counterparts. Horvath et al.
(2013, p. 38) note findings that “young women […] expressed the view that women in
pornography represented the ideal body type and that made them feel unattractive”.
As such, pornography may contribute to the broader cultural promotion of
unhealthy body ideals and self-perceptions, resulting in low self-esteem and
dissatisfaction that have been associated with anxiety, depression and eating
disorders amongst young people (Pinhas et al., 1999; Thomsen et al., 2002).
Of course, these kinds of associations between pornography and harmful social
consequences have been contested by other researchers, who criticise such studies for
reducing the complexity of cultural and sexual experiences in favour of overly simplified
conclusions (Attwood, 2002; McNair, 2014). Thus, for example, Sinković et al.’s (2013, p. 633)
study of 1,000 young people in Croatia found no compelling evidence “that
pornography use is substantially associated with sexual risk taking among young adults”.
Likewise, having studied 7,500 Swiss youth, Luder et al. (2011, p. 1027) decisively “conclude
that pornography exposure is not associated with risky sexual behaviors and that the
willingness of exposure does not seem to have an impact on risky sexual behaviors among
adolescents”. Less clear-cut, Hald et al.’s (2013, p. 2986) study of Dutch adolescents and
young people did find a relationship between consumption of SEM and sexual behaviour,
but found it to be “small to moderate”, and thus “just one factor among many that may
influence youth sexual behaviors”. Therefore, the perception of harms emanating from the
consumption of pornography is far from unchallenged, and the evidence base remains
contradictory. However, in the present context of discussion, we would do well to recall the
oft-cited “Thomas theorem”, namely, that “if situations are defined as real, they are real in
their consequences” (Smith, 1995). In other words, whether or not such assumptions about
the negative effects of pornography are ultimately sustainable, they are nevertheless now
broadly embedded amongst educators, psychologists, children’s charities and policy
makers, creating the framework within which the “problem of pornography” is understood
and in which societal responses are developed. The legal and law-enforcement initiatives
discussed in the following sections are thus grounded in and supported by the “definition of
the situation” that unequivocally links pornography to harms.
Before we move on to address recent policy developments, it is worth briefly reviewing
past efforts to manage or curtail young people’s access to such materials. Previous
initiatives have largely followed one of two conjoined strategies. The first has been the use
of education to alert young people to the risks and harms of consumption, in the hope of
discouraging deliberate engagement with explicit content. These efforts have often been
situated within the broader framework of e-safety education that seeks to help safeguard
children from online victimisation, both sexual and otherwise (Atkinson et al., 2009). Yet,
research suggests that these steps have only a limited effect on young people’s behaviour
(Vanderhoven et al., 2015). The second approach has focussed on the use of filtering
software to control minors’ access to explicit online material. Typically, such tools are
initiated and overseen by parents or other responsible adults (such as teachers in school
settings), and enable them to set blocks on access to not just SEM, but also other kinds of
content that may be deemed unsuitable, such as that related to violence, drug use and
gambling. However, the efficacy of such software is doubted by some observers, and studies
indicate variable levels of capability (Przybylski and Nash, 2018). Moreover, the use of such
software also raises concerns about “over-filtering”, in which benign and non-pornographic
content ends up being blocked, including for example educational materials related to
sexuality and sexual health (Yar and Steinmetz, 2019, pp. 169-170). For all these reasons, the
perceived need for a more concerted, statutory response has driven the measures
discussed below.

The solution to the problem


The concerns highlighted in the preceding section have come to define the legal and public
policy response to online pornography with respect to children and young people. The
challenge, in a nutshell, is this: how to exercise effective control over children and young
people’s exposure/access to materials that are legally restricted to adults. This already
happens with regulating access to films, video games, magazines, DVDs and other media
content, according to a system of age-based classification. Thus, in the UK, the British Board
of Film Classification (an NGO founded by the film industry in 1912) fulfils a statutory
function in classifying theatrically exhibited films and video recordings released on physical
media (the classification of video game content is now predominantly the responsibility of the
Video Standards Council) (Brett, 2017; BBC News, 2012). The BBFC determines who can view
or purchase the material according to an age-based certification system, ranging from
“Universal” (available for all) at one end, to “Restricted 18” at the other (for “hard-core”
pornographic content that is only accessible to over-18s via exhibition in a licensed adult
cinema or purchase at a licensed sex shop, and not available for purchase via mail order)
(BBFC, 2019a). The rationale for classification and restriction on the basis of the age of
prospective viewers/consumers is founded on the goal of protecting young people and other
vulnerable individuals from content that is deemed to be harmful for them (Oswell, 2008,
p. 475)[1]. Under the provisions of the Video Recordings Act 1984 (amended in 2010),
“a person who supplies or offers to supply [such a video recording] to a person who has not
attained the age so specified is guilty of an offence”, and faces either a fine or a custodial
sentence of up to 6 months (VRA, 1984). Hence the aim of recent policy developments has been
to equalise the situation with regard to internet pornography, such that material which would
be age-restricted if purchased in another format (e.g. print, audio-visual recording on DVD,
etc.) is likewise subject to age-based restrictions and checks when made available online.
The UK government’s solution to the challenge of regulating young people’s access to
internet pornography was introduced in the DEA 2017. The Act created, first, a requirement
that all pornography websites operating “on a commercial basis” take steps to ensure users are
over 18 years of age. Second, the Act established an “age verification regulator” that would
exercise statutory oversight over websites’ compliance with its provisions; the BBFC has been
selected to fulfil this responsibility, an extension of its age-related media regulatory role to
embrace online media. The role of the regulator comprises three elements: establishing
guidelines about what kinds of measures websites would need to implement so as to satisfy
the age-verification requirements; establishing the criteria upon which it deems a website or
content provider to be “commercial” in character, and thereby subject to the mandatory
requirements; and taking enforcement measures (including punitive sanctions) against those
who are judged to be non-compliant (Bedlow, 2017, p. 2). The contours of the pending
regulatory regime, across all three of these dimensions, will be considered in some detail below.
The Act requires commercial purveyors of online pornography to establish
mechanisms that serve to confirm that users are in fact over the age of 18. There is a range
of online identity- and age-verification technologies that have existed for some time, often
linked to either commercial transactions that entail electronic transfer of monies, or access to
adult-oriented content and services (in addition to pornography, it may also be applied to
regulate access to violent media and tobacco and alcohol advertising). The most basic form
of age verification is that which simply asks users to confirm that they are over a particular
age (e.g. “click to confirm that you are over 18 years of age”) or enter their date of birth.
However, the glaringly obvious problem with this kind of “self-certification” (or so-called
“honour systems” or “trust systems” – Jøsang et al., 2007) is that there are no independent
checks made as to whether the user is in fact telling the truth about their purported age.
Hence, one alternative method for age-verification is to link it to a form of identity
authentication that is already age-restricted. The most obvious method is to use credit card
details as a form of age verification, as such credit-based financial services are normally
only available to over-18s. However, such systems do not preclude situations in which
minors may use a parent’s or other family member’s credit card details, or instances in
which fraudulently obtained credit card details are made available by hackers and others.
Additionally, relying upon possession of a credit card to serve as a gatekeeping mechanism
for access to online goods and services risks excluding the economically disadvantaged
whose poor credit ratings or strained financial circumstances preclude them from obtaining
credit-based financial services. Finally, the requirement for pornography consumers to
make their identities and related personal details known to pornography providers raises
serious concerns about privacy and the potential for misuse of data, e.g. the ability to link
individuals to the consumption of, or interests in, particular kinds of content, some of which
may entail tastes deemed “deviant” or otherwise problematic should they become publicly
known. Such profiles could, in principle, be shared with a wide range of third parties for the
purposes of targeted advertising, as is now commonplace with other types of web search
and surfing activities. Furthermore, there have been a number of cases where data breaches
(through hacking and malware for example) have resulted in users’ pornography
consumption patterns being used for blackmail (Dearden, 2019).
The DEA proposes age verification via third-party services rather than directly by
content providers, in an attempt to address privacy concerns – namely that the sexual
interests and viewing behaviours of identifiable consumers could be tracked and recorded
by websites, creating risks of: misuse of such data, e.g. by sharing with or sale to third
parties; and the theft or dissemination of such data through data breaches or hacks.
Third-party age verification is intended to prevent websites hosting pornographic content
from ascertaining the identity or other personal details of visitors, while nevertheless
ensuring that those accessing the material in question actually satisfy the mandatory age
restriction. Third-party verification works by requiring users to create (and pay for) an
account that issues them with a password; this password is then used to access
age-restricted sites and services, but without disclosing anything other than the password
to those sites. The creation of such an account requires the user to provide proof of age via a
number of options, including for example credit card details, passport or driving licence.
However, and more contentiously, one age-verification provider, Yoti, proposes also using
its so-called “Age Scan” software, which it claims can accurately determine users’ age from a
selfie that is submitted each time an age-restricted site is visited (Gallagher, 2019). Those
preferring to acquire a password offline will be able to purchase a “voucher” at
participating convenience stores, again upon presentation of an appropriate form of ID; the
voucher contains an activation code which is used to generate a password (Burgess, 2019a).
Those purchasing a password online are reassured by providers such as
AgeChecked that once the process is complete, their personal data will be deleted,
thereby severing the link between the password and the personal details of its owner
(AgeChecked, 2019). Others, such as Yoti, retain users’ personal information (such as name
and date of birth), but offer the assurance that these details will be protected by “AES 256
bit data encryption[2]” and stored in “secure UK Tier 3 data centres[3]” (Yoti, 2019).
Providers of pornographic content must enact such verification systems and gain
certification from the official age-verification regulator.
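
To make the intended division of knowledge concrete, the sketch below models the exchange in Python. It is purely hypothetical – the class names, methods and flow are invented for illustration, and no provider’s actual implementation is publicly documented at this level of detail: the verifier inspects proof of age and issues an opaque token, while the content site learns only whether a presented token is valid.

```python
import secrets

class AgeVerifier:
    """Hypothetical third-party verifier: inspects proof of age, issues opaque tokens."""
    def __init__(self):
        self._valid_tokens = set()  # holds tokens only, no personal details

    def register(self, proof_of_age_accepted):
        # In practice, proof would be a credit card, passport or driving licence
        # check; it is reduced to a boolean here for illustration.
        if not proof_of_age_accepted:
            return None
        token = secrets.token_urlsafe(32)  # opaque credential carrying no identity
        self._valid_tokens.add(token)
        return token

    def is_valid(self, token):
        return token in self._valid_tokens


class ContentSite:
    """Content provider: learns only whether a token is valid, not who holds it."""
    def __init__(self, verifier):
        self.verifier = verifier

    def request_access(self, token):
        return "content served" if self.verifier.is_valid(token) else "verification required"


verifier = AgeVerifier()
token = verifier.register(proof_of_age_accepted=True)
print(ContentSite(verifier).request_access(token))  # content served
```

Even in this idealised form, the verifier must be trusted to discard or safeguard the identity documents it inspects, and each validity query potentially reveals visit patterns to the verifier – a concern to which the discussion returns below.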
Within the broad verification parameters set out above, it will be the role of the regulator
to establish whether a website or content provider is to be deemed “commercial” in
character, and thereby subject to the age-related mandatory requirements. The criteria for
such classification are set out, supplementary to the DEA, in “The Online Pornography
(Commercial Basis) Regulations 2018”. In the regulations, “Pornographic material […] is to
be regarded as made available on the internet to persons in the United Kingdom on a
commercial basis” if it meets one of two requirements: “if access to that pornographic
material is available only upon payment”; or “if the pornographic material is made available
free of charge and the person who makes it available receives (or reasonably expects to
receive) a payment, reward or other benefit in connection with making it available on the
internet[4]”. The latter provision is clearly meant to ensure that the verification
requirements cover those many sites that offer content for free, and generate revenues by:
selling advertising space to other content providers or providers of sexual services (such as
live streaming and interactive sexual contact via webcams, online dating or escort services);
and/or adopting a “freemium” model, in which free-to-access content is limited to a sample
(e.g. an extract from a film or “scene”, or a full scene offered in lower-definition), with the full/
highest quality content reserved for those willing to pay (Holmes, 2015). However, just as
noteworthy are the kinds of commercial offerings that are exempted from the regulations,
e.g. the definition of “pornography made available on a commercial basis” “does not apply in
a case where it is reasonable for the age-verification regulator to assume that pornographic
material makes up less than one-third of the content of the material made available on or via
the internet site or other means”. The issues of exceptions and gaps in the range of sites to
which the verification requirements will apply are considered in some detail in the final
section of the paper.
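
The two-limb “commercial basis” test and its one-third exemption amount to a simple decision rule, which the following sketch paraphrases in Python. The function and its inputs are invented for illustration and simplify the regulations’ wording considerably.

```python
def within_scope_of_regulations(access_requires_payment, provider_expects_benefit,
                                porn_fraction):
    """Illustrative paraphrase of the Online Pornography (Commercial Basis)
    Regulations 2018 test; names and thresholds are simplified."""
    # Exemption: where it is reasonable to assume pornographic material makes
    # up less than one-third of the available content, the definition does not apply.
    if porn_fraction < 1 / 3:
        return False
    # "Commercial basis": access only upon payment, or free access where the
    # provider receives (or reasonably expects) a payment, reward or other benefit.
    return access_requires_payment or provider_expects_benefit

# A free, advertising-funded site consisting overwhelmingly of pornography:
print(within_scope_of_regulations(False, True, 0.9))   # True: covered
# A social platform where explicit posts are a small minority of content:
print(within_scope_of_regulations(False, True, 0.05))  # False: exempt
```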
The third, policing-related, element of the regulator’s role is that of enforcement:
identifying those pornography providers who have failed to adequately implement
age-verification measures, acting to compel concordance with the requirements and taking
punitive action against those sites which fail to take appropriate corrective measures. The
first element of this role entails investigating content providers so as to ascertain
compliance/non-compliance (BBFC, 2018a, p. 5). In terms of its modus operandi, the BBFC
states that it “will primarily investigate sites with high volumes of traffic, but also carry out
spot checks on less visited sites. Additionally, we’ll consider investigating sites that are
reported to us by charities, stakeholders and individuals” (BBFC, 2019b). Although this has
not been made explicit, it is reasonable to assume that a commercial provider will be found
non-compliant either because it has: failed to institute age-verification measures; or
instituted measures that fail to meet the required standards in terms of effectiveness in
excluding minors or maintaining privacy and data protection rules. In cases of a
“determination of non-compliance” the regulator will contact the content provider so as to
“specify a prompt timeframe for compliance”, and also potentially set out the specific steps
that would need to be taken so as to ensure compliance. In a graduated series of
enforcement measures, failure to take remedial measures will result in the issue of an
“enforcement notice” which then makes available to the regulator a number of powers.
These include crucially the ability to contact: internet service providers (ISPs) who provide
users access to the website or platform in question; providers of payment services used by
the website or platform in instances where it charges users directly for access to content;
and a range of other “ancillary service providers” including:
• “search engines which facilitate access to non-compliant services”;
• “providers of IT services to a non-compliant person”;
• “third parties who provide advertising space to the non-compliant person”;
• “third parties who provide advertising space on a website, app or other service
belonging to a non-compliant person”; and
• “third parties advertising on or via any internet site operated by the non-compliant
person” (BBFC, 2018a, pp. 9-10).
The regulator will be empowered to not only inform these actors that the content provider in
question is non-compliant with the statutory requirement under the DEA, but also to
request that service providers of various kinds withdraw that service, e.g. for ISPs to block
access to non-compliant sites; for payment service providers to stop processing payments
for the website; for search engine providers to remove the offending sites from search results,
and so on. The duty of the service provider to comply with the regulator’s requirements will
be enforceable through civil proceedings (DEA, 2017, s. 23). Taken together, and if used to
their full potential, these enforcement measures could seriously hamper commercial content
providers’ ability to do business, impact negatively on visitors and hence revenues, and in
extremis remove their ability to function in the UK market.
If we assess the policy through the lens of criminological theory, it is clear that assumptions
about rational action and associated “routine activities” underpin the approach, and the
required measures clearly conform to the edicts of situational crime prevention (SCP)
(Newman et al., 2016; Clarke and Felson, 2017). For example, the policy presupposes that
both consumers and producers/distributors of the content are utility-maximising actors
intent upon satisfying their respective desires (sexual gratification for consumers, and
subscription and advertising-based revenues for producers/hosts). It further assumes that
calculation of costs and benefits will decisively shape behaviour, e.g. if the costs – in terms of
fines or other punitive consequences – outweigh any benefits from allowing minors
unrestricted access to content, then providers will be clearly incentivised to institute
rigorous age-verification measures. The preventive measures (to curtail young people’s
ability to access the material in question) follow a classic SCP formula of “target-hardening”
(securing the content behind digital locks) and “capable guardianship” (age verification
providers and the regulator monitor online behaviour so as to exercise control and
discourage law- and rule-breaking conduct). In this regard, the policy is very much of a piece
with established crime prevention strategy.
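
The deterrence assumption embedded in this approach reduces to an expected-value comparison. The sketch below makes that logic explicit; all figures are invented for the example and are not empirical estimates.

```python
def provider_complies(noncompliance_benefit, detection_probability, expected_sanction):
    """Illustrative rational-choice calculus: the provider is assumed to comply
    when the expected cost of sanctions exceeds the benefit of non-compliance."""
    return detection_probability * expected_sanction > noncompliance_benefit

# Invented figures: 50,000 in extra revenue from unrestricted access, an 80 per
# cent chance of detection by the regulator, and a 250,000 sanction if caught.
print(provider_complies(50_000, 0.8, 250_000))  # True: compliance is the "rational" choice
```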

The problem with the solution


Having set out the “problem” of minors’ access to pornography (as it has been framed in
public policy discussion), and elaborated the “solution” as instigated under the provisions of
the DEA 2017, this section turns to consider the “problem with the solution” from a number
of perspectives. Elaborated here are a number of flaws, problems, and limitations with the
law and its regulatory/enforcement framework that, I argue, go a significant way to
undermining the provisions’ ability to achieve their stated ends, and to do so in a manner
which properly answers concerns about legitimate users’ privacy and protection from
potential victimisation.
The first and most obvious shortcoming of the framework set in place by the DEA 2017
is the fact that its mandatory age-verification requirements fail to cover a significant
proportion of the online pornographic content that is presently available to web users in the
UK. The first such gap relates to the exception that has already been noted, namely,
that the requirements do not apply to sites on which “pornographic material makes up less
than one-third of the content of the material made available”. This means that sites
containing a mix of pornographic and non-pornographic content – the latter including erotic
or sexually oriented material that nevertheless falls short of the Act’s definition of
“pornography” – may well be exempted, and so remain freely accessible without any age
verification[5]. However, this lacuna pales into insignificance in comparison to the limited
applicability of the regulations resulting from the fact that the definition of “pornographic
material made available on a commercial basis” does not include user-generated or
user-posted content shared via social media platforms, blogs or similar content-sharing
channels. While such platforms and services are often commercial in nature (typically
driven by targeted advertising and/or monetisation of user information and consumption
patterns – Liang and Turban, 2011), if the platforms themselves are not sharing the content,
then the regulations will not apply. That is to say, if the user is sharing the material on a
not-for-profit basis then it does not count as a “commercial” practice, and therefore the
age-verification requirements are not deployed. While some popular social media
platforms – such as Facebook, Instagram and Tumblr – exclude sharing of sexually
explicit content as part of their conditions of use and “community standards”, many others
do not (Yar, 2018, p. 11; Romano, 2018). Thus, while many now prohibit for example
non-consensual sharing of intimate images (so-called “revenge porn”), sexually explicit
content as such is nevertheless allowed on the likes of Twitter, WhatsApp and Reddit (Cole,
2017). Indeed, following the exclusion of pornographic material from platforms such as
Tumblr, there has been a proliferation of alternatives – such as Pillowfort, Dreamwidth,
Mastodon, Newgrounds and New.Tumbl – that explicitly champion freedom when it comes
to sharing sexual content (Chase, 2019). All of these platforms – and numerous others – are
accessible for users without any meaningful form of age or identity verification.
Additionally, substantial volumes of pornography are available via Torrent sites that
enable (and indeed champion) anonymous access (Burns, 2016). In other words, minors can
access – and unwittingly be exposed to – huge quantities of pornographic material on
channels other than those regulated by the age-verification requirements of the DEA.
Turning now to the “commercial” providers of pornographic content that are to be subject
to the requirements, there are again obvious shortcomings to the regulatory framework. First,
there are various ways in which the age-verification process can be manipulated. For example,
individuals (including minors) can use age-confirming IDs belonging to others so as to acquire
a password. Moreover, as journalists from the Guardian newspaper revealed, the
age-verification procedures “can be circumvented in minutes”, by using an e-mail address
and “a non-existent credit card number” (Waterson, 2019). However, users wishing to bypass
the age-verification requirement have a much simpler and more straightforward option. The
DEA’s requirements apply only to those accessing the pornographic content from the UK, and
users of the same sites and services from elsewhere in the world will remain unaffected and
able to view the material as at present, without checks. The sites will determine whether a
visitor is from the UK via their IP address, which enables geolocation of the request’s
country of origin (similar checks on a visitor’s location are routinely used by mainstream
media channels to confine content access to those in the country in which it has been licensed
for sharing, or by online retailers such as Amazon to direct visitors to an appropriate
country-specific version of their site featuring, for example, the local language, currency and
available stock). Therefore, any UK user wishing to avoid the need to comply with
age-verification simply needs to access the sites using an IP address from outside the UK, i.e.
from a country to which the regulations do not apply. This facility is easily available by using
a virtual private network (VPN) that allows users to select IP addresses from various countries
when accessing the internet – an appealing option for those wishing to better preserve their
anonymity, to bypass the kinds of aforementioned “geoblocking” of media content, or to
access sites which have been blocked in their own country (e.g. UK ISPs block access to
numerous sites associated with the distribution of “pirated” media, but this is routinely
circumvented by British users through the use of a VPN – Yar and Steinmetz, 2019, p. 114). In
2018, it was estimated that some 25 per cent of web users engaged a VPN while browsing at
least once in the preceding 30 days, giving some indication of the tool’s growing popularity
(Go-Globe, 2019). VPN software is readily available for download and, following the
previously mentioned “freemium” model, can provide free use subject to restrictions,
with more advanced features being reserved for subscribers. Indeed, one free web
browser – Opera – includes built-in VPN functionality that enables users to access the internet
from a menu of IP addresses in Europe, the Americas and Asia. Additionally, users may have
recourse to web proxies and fast flux networks to achieve similar results. In other words, as
one leading tech journalist concludes: “The porn block isn’t infallible. In fact, the way that it
has been implemented means it really doesn’t take much technical knowledge to get around at
all. It’s so easy that the children the law is meant to protect won’t find any difficulties in
subverting its purpose” (Burgess, 2019b). However, it is worth noting that the problem related
to VPN use only holds for those who deliberately seek out such content, rather than those
who are accidentally exposed to pornography.
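
The geolocation gate described above, and the ease of stepping around it, can be illustrated with a short sketch. The lookup table and helper functions below are hypothetical stand-ins for a commercial GeoIP database (such as MaxMind’s GeoLite2), using reserved documentation IP addresses rather than real ones.

```python
# Hypothetical stand-in for a GeoIP database; real services map IP ranges to
# countries with imperfect accuracy. The addresses are reserved documentation IPs.
GEOIP_STUB = {"203.0.113.7": "GB", "198.51.100.4": "DE"}

def lookup_country(ip_address):
    return GEOIP_STUB.get(ip_address, "UNKNOWN")

def age_verification_required(ip_address):
    # The DEA's requirements bite only on requests appearing to originate in
    # the UK; routing through a VPN exit node elsewhere defeats the check.
    return lookup_country(ip_address) == "GB"

print(age_verification_required("203.0.113.7"))   # True: UK address, gate applies
print(age_verification_required("198.51.100.4"))  # False: same user via a German VPN
```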
In the discussion thus far, I have suggested that the DEA’s measures for controlling
age-based access to pornography are poorly crafted and, even before they have been
implemented, it is clear that they will fail to a significant degree in achieving the policy’s
stipulated goals. However, implementation of the measures also poses significant risks to
the privacy and interests of legitimate consumers of pornography. First, the
sheer proliferation of different age-verification services has created a situation in which
different content providers are opting for different services, meaning that users will not be
able to use a single (or even a couple of) verification accounts to access content across
multiple sites. This would also require each consumer to make multiple payments for the
various verification accounts, thereby placing a financial burden upon the economically
disadvantaged. One executive from a large pornography business likens this situation to
“pre-EU Europe, with everyone using a different currency and border checks everywhere”
(quoted in Burgess, 2019c). This issue, combined with an understandable reluctance on the
part of many users to supply their personal details for age verification, may well simply
encourage greater numbers of consumers to access pirated pornographic content via
other, unregulated, channels; in other words, a classic case of “displacement” that is a
familiar problem in criminology and crime prevention (Burgess, 2019c; Gabor, 1981). Indeed,
the UK Government’s Department for Digital, Culture, Media and Sport, in its impact assessment of the
requirements, similarly noted that “Adults (and some children) may be pushed towards
using Tor (dark web) and related systems to avoid AV [age verification] where they could be
exposed to illegal and extreme material that they otherwise would never have come into
contact with” (DCMS, 2018, p. 15). In effect, the attempt at harm reduction may instead
inadvertently become a source of harm proliferation. Second, and as noted earlier, a major
driver for utilising third-party age verification services is to preserve privacy and
anonymity for consumers by: processing and storing personal data securely; and
maintaining a clear separation between those holding consumers’ personally identifying
information and the pornography sites that can track users’ browsing, searches and
consumption habits. In principle, this arrangement should ensure that: the fact that
individuals choose to create an age-verification account for the purposes of consuming
pornography remains private; and pornography sites are unable to connect identifiable
individuals to their sexual interests and preferences, as indicated by their choice of viewing
content. However, there are reasons for scepticism about the reliability and efficacy of the
proposed arrangements in respect of both aims. With respect to the first issue, that of
privacy, there is no shortage of examples of breaches and failures in data security that
have led to the leaking of users’ identities and personal information, sometimes related to
sensitive details about their sexual activities. The most widely known such instance is that
related to Ashley Madison, a Canadian company offering an online contact service for those
seeking extramarital affairs (their slogan: “Life is short. Have an affair”). In 2015, a group
calling themselves The Impact Team hacked the personal details of the site’s users/subscribers,
and leaked them online in a number of “data dumps” (Thomsen, 2015). While encrypting
such data can certainly help mitigate against threats posed by external intrusion, it does
little to nullify the dangers of either data leaks through internal incompetence or malicious
intent from organisation insiders. The so-called “insider threat” in regard to data breaches
has been estimated to account for anywhere between 43 and 90 per cent of information
security incidents within businesses and other large organisations (Wall, 2013, p. 107).
Third, concerns have been raised by campaigners such as the Open Rights Group (ORG)
about the privacy standards that are likely to be instituted and maintained by verification
providers. They note, for example, that the regulator has been “relying […] on a commercial
provider to draft the contents with providers, but without wider feedback from privacy
experts and people who are concerned about users” (Killock, 2019). The BBFC provides only
“guidelines” that verification providers will be expected to interpret and apply according to
undefined “industry standards”, and those guidelines have been judged to be “vague and
imprecise” as regards security, encryption, pseudonymisation and data retention (Open
Rights Group, 2019, p. 2). Consequently, the ORG concludes that the arrangements for data
protection constitute a “privacy timebomb”. Fourth, and finally, we can note legitimate
concerns about the likely effectiveness of the “firewall” that will supposedly exist between
age verification providers on the one hand, and pornographic content providers on the other.
For example, the dominant global player in internet pornography at present is MindGeek, a
company that owns a huge swathe of the most popular pornography sites, including
Pornhub, Youporn, Redtube, tube8 and xtube. Pornhub alone accounts for more than 100bn
video views per year, and its parent company reports annual revenues of $460m (Brenton,
2018). MindGeek’s chosen age verification partner across all its services will be AgeID
(www.ageid.com/) – which happens to be wholly owned by MindGeek itself (this
relationship is nowhere disclosed on its website, including in its sections covering “About”,
“Privacy Policy”, “Terms of Service” and “FAQs”). In effect, a single commercial entity, with
a dominant position in the online pornography market, will under these arrangements be
collecting and holding data on both users’ identities and related personal details and their
content consumption patterns across multiple websites. The potential for breaches, leaks,
hacks, and misuse of sensitive data is, under these circumstances, significantly heightened.
The shortcomings and limitations identified here present a significant challenge for the
proposed regulatory and enforcement regime for age verification. Some of these are
amenable to rectification through revision of the arrangements. First, for example,
enhanced and more rigorous checks on identification documents can minimise the
possibilities of illicit acquisition of age verification passwords. Second, creation and
enforcement of more rigorous privacy requirements for verification providers, alongside
prohibition of the monetisation of user data, can go some distance to answering the
misgivings of privacy activists. Third, insisting on a strict separation between verification
providers and pornographic content providers can help avoid conflicts of interest and
minimise the chances of user data abuse. Fourth, with regard to bypassing age
verification through the use of VPNs, it is possible for sites to detect and block content
access for users employing such software (although this in turn would place unwelcome
restrictions on legitimate users’ rights to privacy and place them at risk of victimisation,
and would likely be resisted vigorously by content providers). However, a fundamental
problem remains, namely that even if implemented effectively the regulatory measures
would only impact a relatively small proportion of the pornographic content available
online. If the aim is to eliminate or significantly reduce minors’ access or exposure to such
material, it is likely to fail.
The foregoing discussion suggests that a prohibitionist approach may not be the most
fruitful strategy, and that young people may be better served by education that equips them
to deal with the kinds of explicit content they are likely to encounter. Previous education-
based strategies (such as the e-safety programmes mentioned earlier) have largely been
based upon the aim of reducing minors’ exposure to the content in question by inculcating
avoidance behaviours, i.e. persuading them not to seek out the material. However, given the
widely acknowledged failure of such approaches to curtail exposure/consumption,
education-based strategies might instead seek to furnish minors with a critical
understanding of the nature of pornographic representations and the difference between
fantasy depictions of sexual scenarios on the one hand, and the realities of human sexual
behaviour and intimacy on the other. The call for the inclusion of so-called “porn literacy”
within schools-based sex education programmes (Hutchings, 2017; Dawson, 2019) may offer
a more viable path forward for addressing concerns about the harmful impacts of internet
pornography on young people.

Conclusion
This paper has offered a critical appraisal of the newly established age-verification regime in the
UK, including its rationale, configuration, and arrangements for enforcement. The impetus
behind this move – a global first – has emerged from concerns about the negative and
harmful consequences arising from young people’s access and exposure to online
pornography (although the nature and extent of such harms remain contested and unclear).
Previous efforts to address the perceived problem (e.g. through e-safety education and
filtering/content-control) have proved to be of limited effectiveness, thereby necessitating
more directive measures backed up by statute. While eye-catching, and seemingly a
proactive attempt to extend to the online sphere existing laws prohibiting the supply of
age-restricted material to minors, a closer examination of the measures reveals major flaws. These
flaws – including the limited applicability of the requirements to “commercial” content
providers, the widespread availability of pornographic content through non-restricted
channels, and the ease with which the checks can be bypassed – clearly indicate that the
measures are likely to fail conspicuously in curtailing access to pornography by minors
(and may indeed exacerbate exposure to more extreme and prohibited content by driving
consumers to alternative sources of content). Moreover, this problematic attempt to regulate
and police internet usage carries significant risks to users in terms of the potential for
privacy violations and the leak or misuse of highly sensitive data. The implementation of an
approach that gives the appearance of rigorous policing, but falls well short of achieving its
stipulated aims, may create a false sense of security, thereby leaving the potential harms unaddressed.
Consequently, we can anticipate that it will require either substantial rethinking and
revision, or even wholesale discontinuation, as its limitations and problems become
apparent upon roll-out. It is not entirely clear if the problems identified will lead to further
delays in its implementation and/or abandonment in the face of criticism. In the last
instance, young people will continue to encounter sexually explicit content online, and
alternative, education- and awareness-based initiatives may be a better option to help
minors deal with such (inevitable) exposure. Such initiatives, instead of fighting a losing
battle aimed at preventing young people from encountering explicit content (whether
incidentally or intentionally), would stress “porn literacy” as part of a broader curriculum
for sex education in the digital age.

Notes
1. The same rationale is used for the restriction of purchase of other kinds of (non-media) goods,
including alcohol, cigarettes, offensive weapons (knives, blades), solvents, fireworks, petroleum
and aerosol spray paints. Vendors of all these products are required by law to ensure that
customers are legally entitled by age to purchase these items. Depending upon the category of
product, penalties for non-compliance range from fines, through suspension of the licence necessary to
retail the products in question, to custodial sentences of up to two years (DBIS, 2014).
2. Advanced Encryption Standard is the most widely used and current standard for encrypting
information and preventing interception or access by unauthorised third parties. It is the
encryption technology commonly relied upon by governments and big business (Heron, 2009). The
designation 256-bit refers to the length of the encryption key that is needed to decrypt (read) protected
data – the longer the key, the harder it is to “crack”, and the more secure the data will be from
unauthorised access.
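
For illustration, the fragment below shows AES-256 authenticated encryption using the open-source Python “cryptography” package; this is one common implementation of the standard, and nothing about the verification providers’ actual encryption stacks is assumed beyond the cited key length.

```python
# Minimal AES-256 example (AES-GCM mode) using the Python "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key, as in the cited claim
nonce = os.urandom(12)                     # must be unique for every encryption
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"date_of_birth=1990-01-01", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"date_of_birth=1990-01-01"
```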
3. Data centres – which remotely store users’ data – are classified on a four-fold scale of Tiers (1–4).
The primary difference between the Tiers is the extent of “redundancy” or back-up that is built
into the infrastructure so as to mitigate outages or failure. A Tier 3 data centre is estimated
to offer 99.982 per cent “up time”, i.e. in any given year, it will experience only 1.6 h of “downtime” in
which its services are not fully functional (HPE, 2019).
4. See www.legislation.gov.uk/ukdsi/2018/9780111173183
5. For the purposes of the Act, pornography is defined as material for which “the video works authority
has issued an R18 certificate” or “any other material if it is reasonable to assume from its nature
that any classification certificate issued in respect of a video work including it would be an R18
certificate” (BBFC, 2018b, p. 16). The “restricted 18” classification is reserved for “works
containing clear images of real sex, strong fetish material, sexually explicit animated images, or
other very strong sexual images”, and as such does not cover “soft core” pornographic material
including images of full nudity, exposed genitals, etc. (BBFC, 2018a, p. 26).

References
AgeChecked (2019), “Age checks for online users”, available at: https://agechecked.com/age-checks-for-
online-users/ (accessed 28 June 2019).
Atkinson, S., Furnell, S. and Phippen, A. (2009), “Securing the next generation: enhancing e-safety
awareness among young people”, Computer Fraud & Security, No. 7, pp. 13-19, available at:
www.sciencedirect.com/journal/computer-fraud-and-security/issues
Attwood, F. (2002), “Reading porn: the paradigm shift in pornography research”, Sexualities, Vol. 5
No. 1, pp. 91-105.
BBC News (2012), “Video Standards Council to take over games age ratings”, 10 May, available at:
www.bbc.co.uk/news/technology-18017385 (accessed 9 May 2019).
BBFC (2018a), Guidance on Ancillary Service Providers, BBFC, London, p. 18.
BBFC (2018b), Guidance on Age-Verification Arrangements, BBFC, London, p. 24.
BBFC (2019a), Classification Guidelines, BBFC, London, p. 40.
BBFC (2019b), “BBFC frequently asked questions”, available at: www.ageverificationregulator.com/
faq/#7 (accessed 10 June 2019).
Bedlow, G. (2017), “The digital economy act 2017 – an overview”, available at: https://drystone.com/
files/3ef0c180c574c2030c35e557dfce0958.pdf (accessed 10 May 2019).
Brenton, H. (2018), “Porn empire reports half billion dollars in revenue – but ends year with loss”,
available at: https://luxtimes.lu/luxembourg/33248-porn-empire-reports-half-billion-dollars-in-
revenue-but-ends-year-with-loss (accessed 4 July 2019).
Brett, L. (2017), “The BBFC and the Apparatus of Censorship”, in Hunter, I.Q., Porter, L. and Smith, J.
(Eds), The Routledge Companion to British Cinema History, Routledge, London, pp. 231-241.
Burgess, M. (2019a), “This is how age verification will work under the UK’s porn law”, available at:
www.wired.co.uk/article/uk-porn-age-verification (accessed 8 June 2019).
Burgess, M. (2019b), “The UK porn block, explained”, available at: www.wired.co.uk/article/porn-block-
uk-wired-explains (accessed 2 July 2019).
Burgess, M. (2019c), “Porn websites are totally confused about the UK’s porn block plans”, available at:
www.wired.co.uk/article/porn-websites-confused-by-uk-porn-law (accessed 3 July 2019).
Burns, S. (2016), “The Tory government’s war on porn is doomed to fail, and here’s why”, available at:
https://arstechnica.com/tech-policy/2016/08/uk-government-porn-age-verification/ (accessed
3 July 2019).
Chase, E. (2019), “13 other places Tumblr fans can go to explore their sexuality”, available at: www.
cosmopolitan.com/sex-love/a25937501/tumblr-porn-ban/ (accessed 2 July 2019).
Clarke, R.V. and Felson, M. (2017), “Introduction: Criminology, routine activity, and rational choice”, in
Lambert, R.D. (Ed.), Routine Activity and Rational Choice, Routledge, London, pp. 1-14.
Cole, S. (2017), “Porn is still allowed on Twitter”, available at: www.vice.com/en_us/article/xwavkq/porn-
allowed-on-twitter-adult-content-ban-new-guidelines-pornhub-patreon (accessed 2 July 2019).
Dawson, K. (2019), “Educating Ireland: promoting porn literacy among parents and children”, Porn
Studies, Vol. 6 No. 2, pp. 268-271.
DBIS (2014), Age Restricted Products and Services: A Code of Practice for Regulatory Delivery,
Department for Business Information and Skills, London, p. 32.
DCMS (2018), Impact Assessment (IA): Age Verification for Pornographic Material Online, DCMS, London.
DEA (2017), “Digital Economy Act 2017”, available at: www.legislation.gov.uk/ukpga/2017/30/contents/
enacted (accessed 19 June 2019).
Dearden, L. (2019), “Hacker who blackmailed porn users into handing him money after they clicked on
his pop-up adverts jailed”, available at: www.independent.co.uk/news/uk/crime/porn-hacker-
blackmail-zain-qaiser-trial-prison-sentence-a8861236.html (accessed 5 June 2019).
Flood, M. (2009), “The harms of pornography exposure among children and young people”, Child Abuse
Review, Vol. 18 No. 6, pp. 384-400.
Gabor, T. (1981), “The crime displacement hypothesis: an empirical examination”, Crime &
Delinquency, Vol. 27 No. 3, pp. 390-404.
Gallagher, S. (2019), “Porn block: you’ll soon be able to verify your age with a selfie”, available at: www.
huffingtonpost.co.uk/entry/porn-block-verify-your-age-with-a-selfie_uk (accessed 30 June 2019).
Go-Globe (2019), “The state of VPN usage – statistics and trends”, available at: www.go-globe.com/
vpn-usage-statistics/ (accessed 3 July 2019).
Häggström‐Nordin, E., Sandberg, J., Hanson, U. and Tydén, T. (2006), “ ‘It’s everywhere!’ Young
Swedish people’s thoughts and reflections about pornography”, Scandinavian Journal of Caring
Sciences, Vol. 20 No. 4, pp. 386-393.
Hald, G.M., Kuyper, L., Adam, P.C. and de Wit, J.B. (2013), “Does viewing explain doing? Assessing
the association between sexually explicit materials use and sexual behaviors in a large sample
of Dutch Adolescents and Young Adults”, The Journal of Sexual Medicine, Vol. 10 No. 12,
pp. 2986-2995.
Heron, S. (2009), “Advanced encryption standard (AES)”, Network Security, No. 12, pp. 8-12, available at:
www.sciencedirect.com/journal/network-security/issues
Holmes, D. (2015), “Pornhub launches ‘Netflix for porn’ subscription service”, available at: www.
theguardian.com/culture/2015/aug/06/pornhub-launches-paid-subscription/ (accessed 1 July 2019).
Horvath, M., Alys, L., Massey, K., Pina, A., Scally, M. and Adler, J.R. (2013), “Basically…porn is
everywhere”. A Rapid Evidence Assessment on the Effects that Access and Exposure to Pornography
has on Children and Young People, Office of the Children’s Commissioner, London, p. 88.
HPE (2019), “What are data centre tiers?”, available at: www.hpe.com/uk/en/what-is/data-center-tiers.
html# (accessed 30 June 2019).
Hutchings, N. (2017), “Porn literacy: raising sexually intelligent young people”, The Journal of Sexual
Medicine, Vol. 14 No. 5, p. e292.
Jøsang, A., Ismail, R. and Boyd, C. (2007), “A survey of trust and reputation systems for online service
provision”, Decision Support Systems, Vol. 43 No. 2, pp. 618-644.
Killock, J. (2019), “We met to discuss BBFC’s voluntary age verification privacy scheme, but BBFC did
not attend”, available at: www.openrightsgroup.org/blog/2019/we-met-to-discuss-bbfcs-
voluntary-age-verification-privacy-scheme-but-bbfc-did-not-attend (accessed 4 July 2019).
Liang, T.P. and Turban, E. (2011), “Introduction to the special issue social commerce: a research
framework for social commerce”, International Journal of Electronic Commerce, Vol. 16 No. 2,
pp. 5-14.
Livingstone, S., Mascheroni, G., Ólafsson, K. and Haddon, L. (2014), Children’s Online Risks and
Opportunities: Comparative Findings from EU Kids Online and Net Children Go Mobile, LSE, London.
Luder, M.T., Pittet, I., Berchtold, A., Akré, C., Michaud, P.A. and Surís, J.C. (2011), “Associations
between online pornography and sexual behavior among adolescents: myth or reality?”,
Archives of Sexual Behavior, Vol. 40 No. 5, pp. 1027-1035.
McNair, B. (2014), “Rethinking the effects paradigm in porn studies”, Porn Studies, Vol. 1 Nos 1-2, pp. 161-171.
Newman, G., Clarke, R.V. and Shoham, S. (2016), Rational Choice and Situational Crime Prevention:
Theoretical Foundations, Routledge, London.
Ogas, O. and Gaddam, S. (2012), A Billion Wicked Thoughts: What the Internet Tells us about Sexual
Relationships, Penguin, New York, NY.
Open Rights Group (2019), “Analysis of BBFC age verification certificate standard”, available at: www.
openrightsgroup.org/assets/files/reports/report_pdfs/AV_Security_Standard_Analysis_2.pdf
(accessed 4 July 2019).
Oswell, D. (2008), “Media and Communications Regulation and child protection: an overview of the
field”, in Livingstone, S. and Drotner, K. (Eds), International Handbook of Children, Media and
Culture, Sage, London, pp. 469-486.
Paasonen, S., Nikunen, K. and Saarenmaa, L. (2007), “Pornification and the education of desire”, in
Paasonen, S., Nikunen, K. and Saarenmaa, L. (Eds), Pornification: Sex and Sexuality in Media
Culture, Berg, Oxford, pp. 1-20.
Pinhas, L., Toner, B.B., Ali, A., Garfinkel, P.E. and Stuckless, N. (1999), “The effects of the ideal of
female beauty on mood and body satisfaction”, International Journal of Eating Disorders, Vol. 25
No. 2, pp. 223-226.
Przybylski, A.K. and Nash, V. (2018), “Internet filtering and adolescent exposure to online sexual
material”, Cyberpsychology, Behavior, and Social Networking, Vol. 21 No. 7, pp. 405-410.
Romano, A. (2018), “Tumblr is banning adult content. It’s about so much more than porn”, available at:
www.vox.com/2018/12/4/18124120/tumblr-porn-adult-content-ban-user-backlash (accessed
2 July 2019).
Sabina, C., Wolak, J. and Finkelhor, D. (2008), “The nature and dynamics of Internet pornography
exposure for youth”, Cyberpsychology & Behavior, Vol. 11 No. 6, pp. 691-693.
Sinković, M., Štulhofer, A. and Božić, J. (2013), “Revisiting the association between pornography use
and risky sexual behaviors: the role of early exposure to pornography and sexual sensation
seeking”, Journal of Sex Research, Vol. 50 No. 7, pp. 633-641.
Smith, R.S. (1995), “Giving credit where credit is due: Dorothy Swaine Thomas and the ‘Thomas
theorem’”, The American Sociologist, Vol. 26 No. 4, pp. 9-28.
Thomsen, S. (2015), “Extramarital affair website Ashley Madison has been hacked and attackers are
threatening to leak data online”, available at: www.businessinsider.com/cheating-affair-website-
ashley-madison-hacked-user-data-leaked-2015-7?r=US&IR=T (accessed 3 July 2019).
Thomsen, S.R., Weber, M.M. and Brown, L.B. (2002), “The relationship between reading beauty and
fashion magazines and the use of pathogenic dieting methods among adolescent females”,
Adolescence, Vol. 37 No. 145, pp. 1-18.
Vanderhoven, E., Schellens, T., Valcke, M. and Montrieux, H. (2015), “Son, are you on Facebook? The
impact of parental involvement in school interventions about E-safety”, paper presented at the
American Educational Research Association Conference, Chicago, IL, 16-20 April.
VRA (1984), “Video Recordings Act 1984”, available at: www.legislation.gov.uk/ukpga/1984/39/
contents (accessed 10 May 2019).
Wall, D.S. (2013), “Enemies within: redefining the insider threat in organizational security policy”,
Security Journal, Vol. 26 No. 2, pp. 107-124.
Wallmyr, G. and Welin, C. (2006), “Young people, pornography, and sexuality: sources and attitudes”,
The Journal of School Nursing, Vol. 22 No. 5, pp. 290-295.
Waterson, J. (2019), “UK’s porn age-verification rules can be circumvented in minutes”,
available at: www.theguardian.com/society/2019/apr/19/uks-porn-age-verification-rules-can-be-
circumvented-in-minutes (accessed 2 July 2019).
Yar, M. (2018), “A failure to regulate? The demands and dilemmas of tackling illegal content and
behaviour on social media”, International Journal of Cybersecurity Intelligence & Cybercrime,
Vol. 1 No. 1, pp. 5-20.
Yar, M. and Steinmetz, K.F. (2019), Cybercrime and Society, 3rd ed., Sage, London.
Yoti (2019), “How does Yoti work?”, available at: www.yoti.com/business/how-does-yoti-work/
(accessed 30 June 2019).
Zillmann, D. (2000), “Influence of unrestrained access to erotica on adolescents’ and young adults’
dispositions toward sexuality”, Journal of Adolescent Health, Vol. 27 No. 2, pp. 41-44.

Further reading
Phippen, A. (2016), Children’s Online Behaviour and Safety: Policy and Rights Challenges, Springer, London.

Corresponding author
Majid Yar can be contacted at: m.yar2@lancaster.ac.uk
