
Response to the draft Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021

Reset Australia
Dec 2021

Contents

About Reset Australia & this submission
Context & background
The issues
The solutions
Our response to the Bill
The requirement for platforms to consider children’s best interests
The requirement for platforms to verify the age of users
Assuring age for age restricted services: differences
Assuring age to enforce minimum age requirements: differences
The requirement for platforms to obtain parental consent
The scope of the Bill
Concerns about the Code Developers
Recognising children’s right to participate in the Code co-development process
Recommendations

About Reset Australia & this submission

Reset Australia is an independent, non-partisan organisation committed to driving public policy advocacy, research, and civic engagement to strengthen our democracy within the context of technology. We are the Australian affiliate of Reset, the global initiative working to counter digital threats to democracy.

This submission reflects Reset Australia’s response to the draft Privacy Legislation
Amendment (Enhancing Online Privacy and Other Measures) Bill 2021. It documents our
policy position, and builds on our work advocating for improved data protections for children
and the expertise that has been shared with us in this process.

We are indebted to our partners in the campaign for a Children’s Data Code, who helped
shape the thinking and policy approaches that underpin this response — particularly their set
of policy asks. More information about this initiative and partners can be found online1, and
many of these partners are also submitting their own responses informed by these policy
asks.

We are also extremely grateful to have supported a coalition of mental health organisations
to develop a submission from their vital and unique perspective, providing technical insights
and expertise to inform their thinking. These highly focussed conversations were hugely
insightful and helped challenge our thinking.

Reset has also prepared a submission reflecting the views of children and young people we
spoke to in the process of our work. Their thoughts and ideas shaped and reshaped much of
our thinking. We have submitted those separately to provide a medium for youth voices, and
hope these are as insightful and useful to the Attorney General’s Department as we found
them.

This submission builds on all of these conversations and initiatives, and reflects Reset
Australia’s broader policy position.

1. Campaign for a Children’s Data Code 2021 https://www.childrensdatacode.org.au/

Context & background

The issues

The role of data in shaping childhood, and children’s future lives, is unprecedented. Children
grow up in an environment so rich in data that it would have been impossible to even
imagine 33 years ago, when Australia's existing privacy legislation was adopted.

Children are now ‘datafied’ from conception, as both consumers and subjects of data. As
subjects, their data is harvested and collected before they take their first breath — pregnancy
apps, heartbeat monitors and ultrasounds shared on social media harvest data in utero. This
harvesting continues right through childhood; from AI enabled baby monitors to connected
toys to educational apps and programmes in the classroom.

One estimate suggested that in Advertising Tech alone, over 72 million data points are
collected about children by the time they reach 13.2 And at 13, they are allowed to join social
media platforms where, before their 18th birthday, they will add around 70,000 social media
posts to their data footprint3. Each post provides a ream of data to harvest, such as GPS
location, IP address, data about sentiment analysis, data about interactions with the post and
photos potentially scanned with facial recognition software4. The amount of data that is now
held about the next generation is truly staggering.

As data consumers, all of this data means digital services and platforms know a lot about
children and young people. This includes things that we used to consider private and
intimate, like their vulnerabilities, fears and what brings them joy. This intimate knowledge is
too often used to target and reach children in ways that don’t match community
expectations.

Social media companies, for example, use children’s data to develop ‘sticky’ platforms that
serve up a personalised stream of content designed to keep young people hooked. This starts
a vicious cycle; more time on a platform means more data can be collected, which is in turn
used to create an even ‘stickier’ experience. This data, and the fact that children are ‘stuck on’
these platforms, allows companies to serve them up a steady stream of personalised
advertising.

2. In Donell Holloway 2019 ‘Surveillance Capitalism and Children’s Data: The Internet of Toys and Things for Children’ Media International Australia 170(1), pp. 27-36 doi.org/10.1177/1329878X19828205
3. Children’s Commissioner of England and Wales 2018 Who Knows What About Me? https://www.childrenscommissioner.gov.uk/digital/who-knows-what-about-me
4. Dylan Williams et al 2021 Did We Consent to This? https://au.reset.tech/news/did-we-really-consent-to-this-terms-and-conditions-youngpeople-s-data/; Darren Davidson 2017 ‘Facebook targets insecure young people’ The Australian https://www.theaustralian.com.au/business/media/facebook-targets-insecure-young-people-to-sell-ads/news-story/a89949ad016eee7d7a61c3c30c909fa6 and Sigal Samuel 2019 ‘Facebook will finally ask permission before using facial recognition on you’ Vox https://www.vox.com/future-perfect/2019/9/4/20849307/facebook-facialrecognition-privacy-zuckerberg#

Both this content and advertising create risks for young people. Research has shown that
social media platforms promote content that is ‘engaging’5, which often means risky or
extreme content6 is prioritised in young people’s feeds. Personalised advertising also generates risks of commercial manipulation. For example, young girls whose data identifies
them as interested in weight loss can be served up advertising about diet teas7, and data that
suggests when kids are feeling vulnerable can be used to sell them products that ‘give them
a boost’8. It’s hard to see how this vicious cycle, and the risks it inherently creates, is good for
children.

But this vicious cycle is not an accident nor an unintended consequence of the way social
media platforms operate. Children and young people are the ultimate subjects and
consumers in this business model. By extracting a lifetime’s worth of data with one hand, platforms can use the other to serve them products with unparalleled marketing accuracy for years to come. Evidence suggests that social media platforms seek
out younger consumers in particular to drive growth9. This is a commercial imperative:
targeted advertising to children is a billion dollar industry10 and children are vital future
customers11.

This means that children are not protected from the harms of these data practices and the
digital worlds they create. Rather, they appear to be deliberately sought out and pulled into
these risky environments.

This raises both philosophical and moral questions about the type of childhood we believe
young people deserve. How much risk is acceptable for young people to face? Are children
entitled to privacy as they grow up? Should they experience freedom from datafication? Do
they have the right to be left alone, and freedom of thought to develop ideas and tastes away
from behavioural advertising and persuasive recommender systems? Should children be
treated differently to adults in this new data economy, or is it acceptable for their data to be
harvested and used in the same ways? Or should we find ways to regulate and create digital
environments that advance children’s rights in the first instance?

5. For example YouTube’s algorithm as described in Paul Covington, Jay Adams, Emre Sargin 2016 ‘Deep Neural Networks for YouTube Recommendations’ RecSys '16: Proceedings of the 10th ACM Conference on Recommender Systems doi.org/10.1145/2959100.2959190
6. For example, ‘angry’ and inflammatory content is more likely to be engaging (Luke Munn 2020 ‘Angry by design: toxic communication and technical architectures’ Humanities and Social Sciences Communications www.nature.com/articles/s41599-020-00550-7), and the prioritisation of ‘engaging’ content has been shown to promote eating disorder material (Ysabel Gerrard 2018 ‘Beyond the hashtag: Circumventing content moderation on social media’ New Media & Society 20(12):4492-4511. doi:10.1177/1461444818776611)
7. Dylan Williams, Alex McIntosh & Rys Farthing 2021 Profiling Children for Advertising Reset Australia https://au.reset.tech/uploads/resettechaustralia_profiling-children-for-advertising-1.pdf
8. Darren Davidson 2017 ‘Facebook targets insecure young people’ The Australian www.theaustralian.com.au/business/media/facebook-targets-insecure-young-people-to-sell-ads/news-story/a89949ad016eee7d7a61c3c30c909fa6
9. David Swan 2021 ‘Aussie teens dump Facebook’ The Australian www.theaustralian.com.au/business/technology/aussie-teens-dump-facebook-instagram-leaked-internal-research-reveals/news-story/3e0a464b67bd5f45ee0fa00b14bfe551 and Alex Heath 2021 ‘Facebook’s lost generation’ The Verge www.theverge.com/22743744/facebook-teen-usage-decline-frances-haugen-leaks
10. Duncan McCann 2021 iSpy: The Billion Dollar Business of Surveillance Advertising to Kids New Economics Foundation / Global Action Plan neweconomics.org/uploads/files/i-Spy__NEF.pdf
11. Alex Heath 2021 ‘Facebook’s lost generation’ The Verge www.theverge.com/22743744/facebook-teen-usage-decline-frances-haugen-leaks

Reset Australia believes that children and young people have fundamental rights including
the right to privacy, freedom of thought, protection and access to information. While we
ultimately believe that no Australians’ data should be exploited nor used in ways that create
undue risks (including individual and social risks), recognising children’s additional rights
places extra obligations on legislators and regulators to urgently protect them.

We are not alone in this belief. This data hungry environment has been recognised as posing
real risk to children and young people’s rights. The recently published General Comment
from the Committee on the Rights of the Child states that:

The digital environment includes businesses that rely financially on processing personal data to target revenue-generating or paid-for content, and such processes intentionally and unintentionally affect the digital experiences of children. ... The processing of personal data that may result in violations or abuses of children’s rights, including through advertising design features that anticipate and guide a child’s actions towards more extreme content, automated notifications that can interrupt sleep or the use of a child’s personal information or location to target potentially harmful commercially driven content12.

Beyond the business models of platforms, widespread personal datafication and surveillance creates broader risks for children’s rights. UNICEF13 recently categorised and described these
as:

● Surveillance culture threatens children’s freedom and privacy. There is something innately disturbing about the thought of growing up in a panopticon, and children’s
right to privacy is fundamental to their development. A lack of privacy, and open
surveillance can have a ‘chilling effect on children at a key development stage’14.
Experimentation and trying new things in safe, low-consequence settings is central to
how children and young people learn and develop. From as simple as being able to
learn a musical instrument (and be awful alone in your bedroom at first), to things as
profound as trying out a new political belief (without attracting a life-long label), youth
is a time of much positive, and some negative, experimentation. Privacy affords young
people the ability to experiment with fewer consequences, and therefore creates an
environment that provides freedom to develop.

Affording young people privacy therefore has significant individual and social
consequences. Véliz warns that open surveillance can diminish the little-p and capital-P political beliefs and practices of future generations15.

● Poor data protection paves the way for even more surveillance, and for data to be
used in unanticipated and harmful ways. This creates additional long term risks. Data
has a particular problem with permanence; unlike paper it does not fade away unless
it is actively deleted. Data storage costs are now so low that it can be cheaper to store
information than to process and delete it. This means a lot of personal data can be gathered about children, and could hang around for years to come. This poses risks for children who (hopefully) have a long time to live with this expansive data footprint.

12. UN Committee on the Rights of the Child 2021 ‘General Comment 25 on Children’s Rights in Relation to the Digital Environment’ www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx
13. UNICEF 2020 The Case for better governance of children’s data: A manifesto UNICEF www.unicef.org/globalinsight/media/1741/file/UNICEF%20Global%20Insight%20Data%20Governance%20Manifesto.pdf
14. UNICEF 2020 The Case for better governance of children’s data: A manifesto UNICEF www.unicef.org/globalinsight/media/1741/file/UNICEF%20Global%20Insight%20Data%20Governance%20Manifesto.pdf
15. Carissa Véliz 2020 Privacy is Power Penguin Publishing, Oxford

Already, we see how significant life events for Gen X and Millennials are shaped by
data that is collected about them. Large employers use AI to scan CVs and banks use
as much data as they can to run credit checks and approve mortgages and loans. But
going forward, carrying around a whole lifetime’s worth of data could mean that
events in your past continue to shape your future in ways we cannot even begin to
imagine. For example, data about a mental health issue experienced at age 15 could
be used to deny access to medical insurance in your 60s, or data about what you read
when you were 12 could still be shaping the news you are delivered in your 40s.

For the younger generations, a whole lifetime of data could be used to shape,
determine or limit their eligibility for future opportunities in as yet unforeseen ways.
This implies a need for caution.

● Predictive analytics can amplify existing discrimination and bias. Data and algorithms
are already used to determine a range of childhood experiences, from converting a
NSW HSC score to an ATAR16 to qualify for university, to determining child support
payments17. But this sort of data use and algorithmic reliance comes with inherent
risks and can amplify bias18.

We saw in the UK how a ‘mutant algorithm’ was deployed to downgrade the high
school marks of children from low income areas19. We also saw how the use of
automated debt assessment and recovery — using data from Centrelink and the Australian Taxation Office — drove many Australian families with children into debt20. But
these are only some of the issues that have come to light. Inequalities can be
entrenched in more hidden ways with data because it is difficult to see how these
algorithms and calculations work.

As the world, including public administration and the private sector, moves towards more ‘data driven’ practices, the experience of growing up will be increasingly shaped by data collection and use in ways that aren’t always clear, transparent or fair.

● Children’s data can be used to manipulate and influence their behaviour. The ways in
which data is used can affect children’s freedom of thought and their inner sanctum.
For example, targeted and behavioural advertising can be ‘used to manipulate
children’s consumption patterns and behaviours, thus infringing on their freedom of

choice and expression’21. Likewise, social media algorithms can shape political beliefs22, which also affects children’s freedom of thought.

16. UAC 2015 Calculating the Australian Tertiary Rank in New South Wales www.uac.edu.au/assets/documents/atar/atar-technical-report.pdf
17. Child Support (Assessment) Act 1989 (Cth) Section 12A www.legislation.gov.au/Details/C2016C00954
18. UNICEF 2020 The Case for better governance of children’s data: A manifesto UNICEF www.unicef.org/globalinsight/media/1741/file/UNICEF%20Global%20Insight%20Data%20Governance%20Manifesto.pdf
19. Sean Coughlan 2020 ‘A Levels and GCSEs: Boris Johnson blames mutant algorithm for exam fiasco’ www.bbc.com/news/education-53923279
20. Rebecca Turner 2021 ‘Robodebt Condemned’ ABC www.abc.net.au/news/2021-06-11/robodebt-condemned-by-federal-court-judge-as-shameful-chapter/100207674

UNICEF’s report23 also notes that there are many challenges in creating rights-realising privacy frameworks for children. For example, they note that legal frameworks generally overlook the risks posed by group data profiling; that factoring in children’s evolving capacities and different experiences is difficult; that regulation adequately addressing consent, child protection and representation is lacking; and that balancing conflicting rights is challenging. The Enhancing Online Privacy Bill is therefore vital, but its task is by no means straightforward.

The Bill is a welcome step towards realising young people’s rights in the digital environment.
As the General Comment on Children’s Rights in Relation to the Digital Environment24 points
out, state parties including Australia should take:

Legislative, administrative and other measures to ensure that children’s privacy is respected and protected by all organizations and in all environments that process their data. Legislation should include strong safeguards, transparency, independent oversight and access to remedy. States parties should require the integration of privacy-by-design into digital products and services that affect children.

Advancing children’s rights in the digital world is critical. While there are many risks, Reset
also believes that the digital world can be fantastic for young people, and that children have
the right to access a full and rich digital environment. The digital world, driven by data
collection and use, can be and do great things for young people. From delivering information,
enabling personalised learning apps, supporting friendships and providing opportunities for
play and leisure, good technology can advance rights.

It is a matter of putting in place the right rules and regulations to create a digital world that
advances children and young people’s rights. In 2015, Australia passed some of the world’s
earliest legislation around online safety for children (which has recently been updated and
expanded). We have an opportunity here to create privacy regulations that are world leading,
and help create a digital environment that is brilliant for young people.

21. UNICEF 2020 The Case for better governance of children’s data: A manifesto UNICEF www.unicef.org/globalinsight/media/1741/file/UNICEF%20Global%20Insight%20Data%20Governance%20Manifesto.pdf
22. Ujué Agudo & Helen Matute 2021 ‘The influence of algorithms on political and dating decisions’ Plos One doi.org/10.1371/journal.pone.0249454
23. UNICEF 2020 The Case for better governance of children’s data: A manifesto UNICEF www.unicef.org/globalinsight/media/1741/file/UNICEF%20Global%20Insight%20Data%20Governance%20Manifesto.pdf
24. Para 70, UN Committee on the Rights of the Child 2021 ‘General Comment 25 on Children’s Rights in Relation to the Digital Environment’ www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx

The solutions

Reset Australia has previously outlined25 what we believe will be the most effective policy
directions for future tech regulations in Australia. These include:

1. Moving towards a ‘Black Letter Law’ by default approach: Big tech poses big risks.
Future tech regulation needs to start from the premise that self- and co-regulation
will not be sufficient.

Reset Australia believes self- and co-regulation have a role to play in the Australian regulatory landscape at large. However, the risks posed by the digital environment are high impact and significant, and there are systemic compliance issues, which warrants a pivot towards primary and subordinate legislation and regulation for the sector. This is especially true when it comes to children and young people.

International developments indicate a shift in this direction too, with both the US and
EU increasingly looking to pass primary legislation, empower existing regulators and
develop new regulatory bodies.

2. Requiring increased transparency: Part of the issue in understanding how to best solve the problems of the digital world is that legislators, regulators, researchers and
civil society simply do not know enough about the specific mechanics of how
platforms work. Requiring transparency through, for example, algorithmic audits and
impact statements will help provide a better digital environment for Australians.

3. Pivoting towards regulations that target systems and processes: Approaches centred
around increased content moderation (which focus on takedown/deletion of harmful
and illegal content) are not systemic enough, nor are they commensurate with the
scale of the problem at hand. They doom regulators to a perpetual and futile game of
‘content whack-a-mole’, so more upstream policy approaches must also be adopted.

Regulations should focus on the role of platform design, features, systems and
algorithms that manufacture and amplify risks, rather than removing individual
pieces of content that cause harm.

Again, international developments indicate a shift in this direction, with both the EU and UK increasingly looking to pass online safety regulations that focus on the systems driving risks and harms26. Notably, this is also true when it comes to
regulations for children and young people; multiple countries have implemented
legislation, regulations or guidance around systems design and data use for young
people27.

Our response to the Bill is informed by this thinking and these global developments.

25. Jessica Hubert, Dhakshayini Sooriyakumaran, Rys Farthing & Elena Yi-Ching Ho 2021 Is Australia’s Election Under Threat Reset Australia https://au.reset.tech/uploads/reset_facebook_policy_memo.pdf
26. The EU’s proposed Digital Services Act & the UK’s draft Online Safety Bill are shifting from content moderation and takedown approaches towards regulating the systems and processes of platforms
27. See for example, the UK’s Age Appropriate Design Code 2020, Sweden’s Children and Young People’s Rights on Digital Platforms, France’s Eight recommendations to strengthen the protection of minors online 2021, The Netherlands’ Code for Children’s Rights 2021, Ireland’s draft Fundamentals for a Child Oriented Approach to Data Protection 2020 and the US’ proposed PRIVCY Bill

Our response to the Bill

The requirement for platforms to consider children’s best interests in data processing

Section 26KC 6E-F of the Bill states that the Code must require social media platforms to:

(e) in collecting, using or disclosing personal information of children, ensure that such
collection, use or disclosure is fair and reasonable in the circumstances;
(f) in determining what is fair and reasonable for the purposes of paragraph (e), have
the best interests of the child as the primary consideration.

Reset warmly welcomes this requirement. Children’s ‘best interests’ is a well developed concept within the children’s rights canon, stemming from Article 3 of the Convention on the Rights of the Child28. How this principle applies to the digital world is described in General Comment No. 25 (2021) on Children’s Rights in Relation to the Digital Environment, which
notes that:

The best interests of the child is a dynamic concept that requires an assessment
appropriate to the specific context. The digital environment was not originally
designed for children, yet it plays a significant role in children’s lives. States parties
should ensure that, in all actions regarding the provision, regulation, design,
management and use of the digital environment, the best interests of every child is a
primary consideration29.

The principle is a broad and expansive concept, and requires considerable balancing across children’s rights to be realised. What the best interests principle makes clear, though, is that this ‘balancing act’ must be undertaken to ensure the maximum realisation of the full gamut of children’s rights, not to balance children’s rights against commercial interests.

Given this complexity, to be effectively realised, the Code will need to be very clear about what the best interests principle looks like in practice. The Code developer should be provided with a list of necessary expectations to ensure this.

Recommendation 1: The Code must clearly outline how the best interests principle
should operate in practice, and Code Developers must be given a clear set of necessary
expectations.

We note that this will not necessarily require any amendments to the Bill itself, but the

Attorney General’s Department and/or Office of the Information Commissioner should issue the Code developers with a clear list of ‘necessary expectations’ that they must include and go beyond. This outcomes-based approach should provide a floor that ensures the Code delivers on the best interests in practice.

28. UN Office of the Human Rights Commissioner 1989 Convention on the Rights of the Child www.ohchr.org/en/professionalinterest/pages/crc.aspx
29. Paragraph 13, UN Committee on the Rights of the Child 2021 ‘General Comment 25 on Children’s Rights in Relation to the Digital Environment’ www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx

Australia has the opportunity to lead the world by regulating the systems that drive most
platforms. Ensuring that these systems operate in children’s best interests could transform
the digital world. Systems Reset would like to see covered by the best interests requirement
include:

● Recommender systems and algorithms for children — which could have the capacity
to greatly minimise content risks
● Digital marketing and commercial profiling — which could have the capacity to
significantly curb surveillance advertising and minimise commercial risks for children
● Automated decision making — which could have the capacity to reduce short and
long term risks for children, including risks from bias and discrimination
● Testing for ‘persuasive’ design and sticky features, where children’s data is used in A/B
tests for instance — which could have the capacity to reduce overuse and addictive
features.

Recommendation 2: The Code must clearly outline that the best interests principle
applies to uses of children’s data that are central to the functioning of platforms. Code
Developers must be given a clear set of necessary systems to address.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Australian Information Commissioner
should issue the Code developers with a clear list of ‘necessary applications’, and systems
and functions, the Code must address and go beyond.

Further, it must be made clear that the Code places obligations on platforms to consider children’s best interests at multiple stages across the product cycle. What works best for children needs to be considered:

● Before any data is processed — using Data Protection Impact Assessments
● While data processing is happening — with appropriate ongoing reviews and audits
● In the outcomes of data processing — giving particular focus to any unintended outcomes for children; the role of algorithmic audits is particularly important here, to monitor the impact of algorithms in perpetuating inequality and causing harm

Recommendation 3: The Code must ensure that it creates requirements for children’s best interests to be considered ‘pre-emptively’ as well as in the ‘outcomes’ from platforms’ collection and use of data.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Australian Information Commissioner
should inform Code developers that they expect that best interests will be considered
across the product development cycle. This includes for example, data protection impact
assessments and algorithmic audits.
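
To make these two checkpoints concrete, the sketch below shows one way a platform might record a pre-emptive children’s data protection impact assessment and a follow-up algorithmic audit as artefacts that could be produced for the regulator on request. It is a minimal sketch only; the field names are our own illustrative assumptions and are not drawn from the Bill or the Code.

```python
# Illustrative sketch only: hypothetical record structures for the two checkpoints
# described above (a pre-emptive children's DPIA and a post-deployment algorithmic
# audit). Field names are assumptions for illustration, not defined by the Bill.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class ChildrensDPIA:
    """Completed before any data about under-18 users is processed."""
    processing_activity: str          # e.g. "feed ranking", "ad targeting"
    data_collected: List[str]         # what personal information is involved
    best_interests_assessment: str    # how children's best interests were weighed
    risks_identified: List[str]
    mitigations: List[str]
    approved: bool
    completed_on: date


@dataclass
class AlgorithmicAudit:
    """Completed periodically while processing is live, to monitor outcomes."""
    system_audited: str               # e.g. "recommender system"
    period_reviewed: str
    unintended_outcomes: List[str] = field(default_factory=list)
    remediation_actions: List[str] = field(default_factory=list)


def ready_to_process(dpia: ChildrensDPIA) -> bool:
    # A platform would hold processing until the pre-emptive assessment is done.
    return dpia.approved and bool(dpia.mitigations)
```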

To meet recommendations one, two and three, the necessary expectations, functions and processes that Code Developers must include and exceed could, for example, be:

● The requirement to clearly codify children’s data rights, and set minimum
expectations that the Code:
○ Requires active, expressed consent — Data should only be processed where
children have meaningfully consented and know about it (except where it is in
their best interests). Meaningful consent requires that, among other things,
Terms of Service and Privacy notices must be age-appropriate. This means a
Code must stipulate that:
■ Terms be published in plain speak, accessible to the youngest users (as
per 26KCC in the Bill)
■ Terms must be enforced. Providers should live up to their terms of
service and privacy policies, and children should have a right of redress
if they do not
■ There must be a clear process to ‘make things right’ where terms are breached. Children should be able to exercise their rights easily
○ Requires transparency around data processing — Children should be informed
about, and have consented to every instance of, data processing (except where
it is in their best interests). This requires purpose limitation.
○ Requires data minimisation and restricted data sharing — Only strictly
necessary data should be collected, and it should not be circulated (except
where it is in their best interests)
○ Requires accessible ways for children to request, access, correct and delete
their data and request that it not be used (as per 26KC 4A in the Bill)
○ Requires default settings for under-18 accounts to be high privacy and low risk, and make clear any risks associated with changing them (see the sketch after this list)
● Make explicit that children’s best interests must be the primary consideration in data
processing around:
○ Recommender systems and algorithms
○ Automated decision making and profiling
○ Digital marketing and commercial profiling
○ Testing and training for ‘persuasive’ design
○ Geolocation data
● Ensure children are considered in advance by requiring children’s data protection impact assessments to be undertaken before any data about under 18 year olds is processed. These must be made available to the OAIC and public on request.
● Ensure that the outcomes from the collection and use of children’s data are
adequately monitored. For example, this may require algorithmic audits. These audits
must be made available to the OAIC and public on request.
● Ensure that children and young people are respected as digital citizens. This means
services can’t shut them out, or downgrade their service because it’s ‘too hard’ to
meet their needs
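
As a purely illustrative example of the ‘high privacy by default’ expectation referenced in the default settings item above, the sketch below shows how under-18 accounts might start on the most protective settings, with any relaxation surfacing a plain-language risk note. The setting names and risk wording are hypothetical assumptions, not requirements drawn from the Bill or the Code.

```python
# Minimal sketch, assuming hypothetical setting names: under-18 accounts start on
# the most protective defaults, and relaxing a setting surfaces the associated risk.
HIGH_PRIVACY_DEFAULTS = {
    "profile_visibility": "friends_only",
    "precise_geolocation": False,
    "personalised_advertising": False,
    "contact_from_unknown_adults": False,
}

SETTING_RISK_NOTES = {
    "profile_visibility": "A public profile can be seen by anyone, including strangers.",
    "precise_geolocation": "Sharing precise location can reveal where you live or go to school.",
    "personalised_advertising": "Ads will be targeted using data collected about you.",
    "contact_from_unknown_adults": "Adults you don't know will be able to message you.",
}


def create_account_settings(age: int) -> dict:
    """New under-18 accounts get high-privacy, low-risk defaults."""
    if age < 18:
        return dict(HIGH_PRIVACY_DEFAULTS)
    return {}  # adult defaults are out of scope for this sketch


def change_setting(settings: dict, name: str, value) -> str:
    """Relaxing a default returns the plain-language risk note to show the user."""
    settings[name] = value
    return SETTING_RISK_NOTES.get(name, "")
```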

These outcomes build on the Children’s Data Code Campaign30, the UK’s Age Appropriate
Design Code31 and Ireland’s draft Fundamentals32, and align with the rules that 16 & 17 year
olds described when polled in April 202133.

These also align with other recommendations in the draft Bill, notably:

● The requirements in section 26KC G to make privacy notices clear and understandable, current and provided in a timely manner. We agree with these requirements, and would like to highlight the need for considerations around clarity and comprehension for the youngest platform users; and
● The requirements in section 26KC 4A to ensure an OP organisation must respond to a request to not use, or to not further use, personal information within a reasonable period. We also agree with these requirements, and would like to highlight the need for children and young people to be able to exercise these data rights easily as well.

If effectively realised and enforced, the implementation of the best interests principle has the capacity to create upstream changes to the way platforms operate that could improve the digital world for children.

Children’s best interests must be central to the Code. The additional measures the Bill proposes for children (age assurance and parental consent) are important steps towards a digital environment that could achieve children’s rights, if implemented in ways that reflect their best interests.

The requirement for platforms to take reasonable steps to verify the age of users

Section 26KC 6A of the Bill states that the Code must require social media platforms to ‘take
all reasonable steps to verify the age of individuals to whom the OP organisation provides an
electronic service’.

Accurately knowing the age of users is important to ensuring that children’s data can be
handled in age-appropriate ways. The Bill does not stipulate what fair and reasonable steps
towards verifying age might be, but some discussion of possible steps is helpful to
contextualise our response to this requirement.

30. Campaign for a Children’s Data Code 2021 https://www.childrensdatacode.org.au/
31. Age Appropriate Design Code, UK 2020 https://ico.org.uk/for-organisations/
32. Data Protection Commission (Ireland) 2021 Fundamentals for a Child Oriented Approach to Data Processing www.dataprotection.ie/sites/default/files/uploads/2020-12/Fundamentals%20for%20a%20Child-Oriented%20Approach%20to%20Data%20Processing_Draft%20Version%20for%20Consultation_EN.pdf
33. Dylan Williams, Alex McIntosh & Rys Farthing 2021 Keep it to a Limit Reset Australia https://au.reset.tech/uploads/resettechaustralia_policymemo_pollingreport_final-oct.pdf

A note on definitions and language used by Reset Australia 34

Age verification (AV): A system that relies on hard identifiers and/or verified sources of
identification, which provide a high degree of certainty in determining the age of a user.
These are often government backed forms of ID.

Age estimation (AE): A process that establishes that a user is likely to be of a certain age or fall within an age range.

Age assurance: An umbrella term for both age verification and age estimation solutions.

There are many fair and reasonable steps that platforms could be required to take in order to
assure the age of users. 5Rights Foundation recently released a categorisation of the
techniques currently available35:

Method: Self-declaration. An age estimation technique.
Example: A user enters their date of birth when signing up to a platform.
Considerations: The most commonly used technique at the moment. This could also include measures which discourage false declarations of age, eg:
● If a user enters a date of birth that indicates they are below the minimum age, platforms could block repeated attempts from the same IP address
● Using language that elicits a more truthful age declaration, for example, “enter your date of birth” rather than “confirm that you are over 13”
● Checking a user’s date of birth twice, i.e. when a user logs in the second time, ask them to confirm their date of birth. Children who gave a false date of birth on registration may not remember the date of birth they gave, which could flag them for moderation
These place the burden on the child to self-declare their age correctly, and are highly spoofable.

Method: Hard identifiers, such as government backed IDs collected directly by a platform. An age verification technique.
Example: A user is asked to upload a copy of their passport or Medicare number to check against official records when they open an account. This is checked to verify age.
Considerations: Hard identifiers are most commonly used for age assurance by services that are restricted to users over 18. The emphasis is on proving users are adults. The use of hard identifiers offers a high level of assurance but presents risks of privacy violations and potential exclusion. ID documents can be reviewed by the platform or a third party provider for the platform (see below). It is unclear if this is fair and reasonable to prove you are a child so your data can be handled safely.

Method: Biometrics, such as facial analysis. An age estimation technique.
Example: A user has their photo run through an AI system to estimate their age.
Considerations: Facial analysis is a widely used form of biometric estimation for age and does not — in principle — recognise nor identify the individual. Facial analysis compares the user’s facial features against large datasets that have been used to train the technology through machine learning to estimate their age range. Facial analysis is inclusive of those who may not be able to present a valid ID document. It can also be used in privacy preserving ways if services discard the facial image once it has estimated a user’s age. Caution is needed to ensure that the data of facial features is created in privacy preserving and inclusive ways, and that images scanned are truly discarded.

Method: Profiling and inference models, such as noting that those who watch unboxing videos may be children. An age estimation technique.
Example: Already collected user data, such as what each user ‘liked’, who their friends are etc., is scanned and inferences about their age calculated.
Considerations: Profiling and inference are already commonly used in commercial settings, including to estimate the age range of users. This creates significant tension for children’s right to privacy, and there is a need to ensure that inferences are inclusive and accurate.

Method: Capacity testing, such as asking a user to complete a puzzle. An age estimation technique.
Example: A user may be asked to complete a language test, solve a puzzle or undertake a task that gives an indication of their age range.
Considerations: Capacity testing allows a service to estimate a user’s age based on an assessment of their aptitude or capacity. Services can use capacity testing to assure age without collecting personal data from children. These are not commonly in use, can be easily spoofed, and capacity is not always aligned with age.

Method: Cross-account authorisation, where a child uses an existing account to gain access to a new product or service. Can be either an age estimation or age verification technique.
Example: A user may be asked to ‘log in’ to their Apple account to download an age restricted app.
Considerations: Authorising accounts are often with large companies such as Apple, Facebook, Google or Twitter. The method is dependent on the age assurance used by the authentication account providers (e.g. Apple or Facebook), and it is unclear if these providers are able to assure a range of ages. Raises concerns around data sharing practices, and entrenching the role of large companies into the architecture of the digital world.

Method: Account holder confirmation, such as asking a parent to confirm the age of a child. An age estimation technique.
Example: When a child’s profile is created (e.g. on Disney+), the account holder who pays for the service is asked to input the child’s age.
Considerations: A child’s age or age range can be confirmed by an adult, often a parent or carer. What is accessible to children in ‘children’s profiles’ is decided by the device provider. Requires children to have a caregiver to provide confirmation, which can be a barrier for children in alternative care arrangements. These place the burden on parents to declare the child’s age correctly, and are spoofable.

Method: Device/operating system controls, which use a device’s ‘parental control settings’ to assure age. An age estimation technique.
Example: When a device is formatted and connected to a family account (e.g. Google Family), the account holder who pays for the service is asked to input the child’s age.
Considerations: A child’s age or age range can be confirmed by an adult, often a parent or carer. What is accessible to children on ‘children’s devices’ is decided by the device provider. Requires children to have a caregiver to provide confirmation, which can be a barrier for children in alternative care arrangements. These place the burden on parents to declare the child’s age correctly, and are spoofable.

Method: Flagging, where other users are enabled to ‘flag’ accounts that seem to be underage for platform moderators to review. An age estimation technique.
Example: When any user comes across an account they believe to be ‘too young’ for the platform, they are able to flag this account to moderators to review.
Considerations: Places the onus on platform users to identify and report underage accounts. Only works after an underage child has created an account. It is unclear how moderators assure the age of the child once their account is flagged.

Method: Digital identities through third party providers. Third party providers collect ID documents or credentials (e.g. passports, or facial scans) and store them as digital ‘wallets’. Can be age estimation, but often age verification.
Example: A user creates a digital identity with a known provider (e.g. Yoti), and then uses their Yoti ID to prove their identity to a platform.
Considerations: Can allow users to share only the attributes required to prove their identity or age. The use of digital identities can reduce the need for users to repeatedly provide documents or other official sources of information. It has the potential to minimise data sharing whilst providing a robust measure of age. These depend on third party companies that also create privacy and security risks. The level of assurance depends on what is collected by the third party.

Method: Age tokens through third party providers. Third party providers collect ID documents or credentials (e.g. passports, or facial scans) and create digital age tokens. Can be age estimation, but often age verification.
Example: A user creates an account with a known provider, and then uses their account to prove their age to a platform.
Considerations: Age tokens contain only the information relating to the specific age or age range of a user, allowing platforms to establish if a user meets their age requirements without collecting other personal information. Age tokens may not give a user’s actual age and only provide confirmation that a user has passed or failed the service’s required age (e.g. are they 16 or over). Age tokens can be generated from a digital ID. These depend on third party companies that also create privacy and security risks. The level of assurance depends on what is collected by the third party.

Method: B2B age assurance, through third party providers. Third party organisations check the age or identity of users. Can be age estimation, but often age verification.
Example: A user creates an account on a platform, triggering a third party company to do a background age check.
Considerations: These often follow the same process as the hard identifier scheme, but undertaken by a third party provider. Many commercial entities offer background identity or age checks. It is unclear how often users know a third party is involved in the assurance process. This also creates data sharing risks.

34. From 5Rights Foundation 2021 But How do they Know it’s a Child? 5Rights Foundation https://5rightsfoundation.com/uploads/But_How_Do_They_Know_It_is_a_Child.pdf
35. 5Rights Foundation 2021 But How do they Know it’s a Child? 5Rights Foundation https://5rightsfoundation.com/uploads/But_How_Do_They_Know_It_is_a_Child.pdf
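
As an illustration of the ‘checking a user’s date of birth twice’ measure described in the self-declaration row above, the sketch below flags an account for moderator review when the date of birth re-entered at a later login does not match the one given at registration. This is a simplified sketch of one possible friction measure under our own assumptions, not a description of any platform’s actual implementation.

```python
# Simplified sketch of the "ask for the date of birth twice" friction measure
# described in the self-declaration row above. Children who entered a false date
# of birth at sign-up may not repeat it consistently, which flags the account
# for human review. This is illustrative only.
from datetime import date


def check_reconfirmed_dob(registered_dob: date, reconfirmed_dob: date) -> str:
    if registered_dob == reconfirmed_dob:
        return "ok"
    # A mismatch does not prove the user is underage; it only routes the
    # account to moderation for a closer look.
    return "flag_for_moderation"


# Example: the user registered with one date but later re-enters another.
print(check_reconfirmed_dob(date(2005, 3, 14), date(2005, 3, 14)))  # "ok"
print(check_reconfirmed_dob(date(2005, 3, 14), date(2004, 7, 2)))   # "flag_for_moderation"
```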

We would expect that the Code would outline a range of reasonable steps platforms could take to assure young users’ ages, and that a combination of approaches may be necessary for each platform to implement. This is in keeping with the approach advocated by the Office of the eSafety Commissioner to verify ages for access to pornography, where the Commissioner argued the need for a ‘combination and layering of technological solutions’36.

However, the ‘bar’ for proving that someone is a child so that their data can be handled in
safer and more precautionary ways should be low. It would be a perverse outcome if a child’s
data were to be processed in high risk or dangerous ways because they lacked the data to
adequately convince a platform they were a child.

Assuring age for age restricted services: differences

What is considered fair and reasonable steps for age assurance must depend on why the
assurance is needed. The age verification requirements to access Restricted Access
Services, and those addressed by the Protecting the Age of Innocence37 report, will
presumably be stronger and aim to reduce ‘false positives’ (i.e. keep children out of adult
services). For the basis of ensuring data is handled in age-appropriate and precautionary
ways, fair and reasonable steps must aim to reduce ‘false negatives’ (i.e keep children in
safer data practices). That is, where there is doubt, data must be treated carefully as if it
were children’s. The burden of proof for age-appropriate data handling must be lower than
for age restricted services.

Assuring age to enable better enforcement of minimum age requirements: differences

Assuring age for the purposes of ‘enforcing minimum age requirements’ is also different to
assuring age to ‘enable age-appropriate, precautionary data protection practices’. One
intent speaks to a need to minimise false positives (i.e. keep children out of a service) and
the other false negatives (i.e keep children in safer data practices). The burden of proof for
age-appropriate data handling must be lower than for enforcing minimum ages.
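
The difference between these two purposes can be expressed as a difference in which error a system is tuned to avoid. The sketch below uses invented confidence thresholds to show how the same age estimate might be used with different error tolerances: data handling defaults to treating the user as a child whenever real doubt remains (minimising false negatives), while an 18+ access gate demands much stronger assurance of adulthood (minimising false positives). The thresholds and function names are illustrative assumptions only.

```python
# Illustrative sketch only: the thresholds are invented, to show how the same age
# estimate can be used with different error tolerances depending on the purpose.
def treat_as_child_for_data_handling(estimated_age: float, confidence: float) -> bool:
    """Precautionary: if there is real doubt, handle the data as a child's
    (minimise false negatives)."""
    return estimated_age < 18 or confidence < 0.9


def admit_to_18_plus_service(estimated_age: float, confidence: float) -> bool:
    """Restrictive: only admit when adulthood is established with high assurance
    (minimise false positives)."""
    return estimated_age >= 18 and confidence >= 0.99


# A roughly-17 estimate with middling confidence: the data is handled as a child's,
# and the user is not admitted to an age-restricted service.
print(treat_as_child_for_data_handling(17.5, 0.6))  # True
print(admit_to_18_plus_service(17.5, 0.6))          # False
```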

However, we recognise that requiring platforms to take steps to assure age could reduce
the number of underage users on platforms through two mechanisms:

● They could discourage or prevent underage users joining platforms. While the process
of assuring age is technically complex, requiring platforms to take reasonable steps to
assure age would create some friction in what is currently a ‘friction free’ joining
process for young people. This ‘lack of friction’ enables children under minimum age
requirements to join platforms very easily, with almost no checks beyond self
declaration38. Currently anyone, including a child, can join TikTok in around one minute
and 5 clicks39. This is an easy and friction free user experience by design. Adding a
degree of friction to the process, involving multiple methods of age assurance, may

discourage or prevent some underage users from signing up to a platform in the first instance.

36. Office of the eSafety Commissioner 2019 ‘Submission to the House of Representatives Standing Committee on Social Policy and Legal Affairs inquiry into age verification for online wagering and online pornography to prevent children and young people from accessing harmful products’
37. House of Representatives Standing Committee on Social Policy and Legal Affairs 2020 Protecting the Age of Innocence Commonwealth of Australia https://parlinfo.aph.gov.au/parlInfo/download/committees/reportrep/024436/toc_pdf/Protectingtheageofinnocence.pdf;fileType=application%2Fpdf
38. This friction free process can also mean users are unaware that they are agreeing to data processing. Terms of service are often hidden, or attached to unconnected actions (e.g. clicking ‘next’ or ‘sign up’ is taken as an assumption that you agree with the terms of service).
39. Counting one click to submit a date of birth, a second to enter a mobile number or email to which a confirmation code is sent, a third click to enter this code, a fourth to undertake a Captcha test and the last to create a password

● It could enable platforms to better enforce their minimum age requirements. If platforms are better able to assure the ages of their users, they can in theory take more effective steps to remove underage users from their services.

Using a service that is designed for older people can be risky for children, and is not an insignificant issue. Research from the eSafety Commissioner outlines that in the year to June 2017 the majority of Australian children aged 8-12 years old had used a social media platform40.

Platform | % of Australian children 8-12 who have used the platform | Minimum age for platform (according to their terms of service)
YouTube | 80% | 13
Facebook | 26% | 13
Instagram | 24% | 13
Snapchat | 26% | 13
Google+ | 23% | 13
Twitter | 7% | 13
Musical.ly (now merged with TikTok) | 15% | 13

While the minimum age for these platforms appears to stem from a desire to comply with
US privacy legislation41 — rather than any attempts to design age-appropriate products for
13 year olds, or research documenting children’s capabilities to thrive with social media use
at 13 — some evidence suggests that social media products pose particular risks for the
under 13s.

The same research from the eSafety Commissioner compared steps taken to actively
manage ‘online safety’ between children 8-12 versus 13-17 years old, such as changing
privacy settings, filters or blocking people42. They found that 48% of children between the
ages of 8 and 12 had not taken any active steps, compared to 13% of 13-17 year olds. We do not believe that safety or privacy should be left to individual children of any age to actively manage; platforms should bear the burden of responsibility. However,
until platforms improve the safety and privacy of their services this suggests that children
under 13 will face risks they are less equipped to mitigate.

40. Office of the eSafety Commissioner 2018 State of Play - Youth, Kids and Digital Dangers www.esafety.gov.au/sites/default/files/2019-10/State%20of%20Play%20-%20Youth%20kids%20and%20digital%20dangers.pdf
41. The Children’s Online Privacy Protection Act (COPPA), which places more stringent requirements on online services collecting data from under 13 year olds. Companies can avoid being bound by these requirements if they have no actual knowledge of under 13 year olds using a platform
42. Office of the eSafety Commissioner 2018 State of Play - Youth, Kids and Digital Dangers www.esafety.gov.au/sites/default/files/2019-10/State%20of%20Play%20-%20Youth%20kids%20and%20digital%20dangers.pdf

And there is evidence that these risks can lead to harm. The eSafety Commissioner’s research also documents that 24% of children aged 8-12 have negative online experiences
(compared to 42% of 13-17 year olds). Likewise, US research suggests that a worrying 19% of
children aged 9-12 experience an online sexual interaction with someone they believe to
be an adult (compared to 29% of 13-17 year olds)43.

Both discouraging underage people from joining platforms and removing them when
they are found on platforms would help reduce the risks and harms faced by children
under 13. But they are not sufficient alone, and the focus must remain on requiring
services to improve their platforms in the first instance. Children over 13 continue to face
these risks and harms too.

Recommendation 4: The Code must ensure that reasonable steps to assure age for the purpose of ensuring age-appropriate data handling practices are precautionary. If in doubt, it must be assumed that data could belong to a child, and the data handled accordingly.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Information Commissioner should
issue the Code developers with guidance.

Ensuring that children’s data is treated in a safe and age appropriate way should not create
unnecessary risks for the realisation of other rights.

While it is unlikely that a Code would specify that hard identifiers alone are the only fair and
reasonable step, it is worth noting that this could have implications for children’s right to
access the digital environment. This may not always be in children’s best interests.

Specifically, the requirement for hard identifiers could create disproportionate barriers for
young people from low socio-economic households, or otherwise disadvantaged
backgrounds. Birth certificates and passports — by and large the two forms of government
backed ID available to under 16 year olds — are expensive documents to purchase, keep safe
and replace. Further, not all young people have access to these documents, especially young
people with irregular immigration status. Without access to these, a requirement for hard
identifiers may create new forms of digital disadvantage for already disadvantaged young
people.

Recommendation 5: The Code must ensure that reasonable steps to assure age do not produce barriers for children from low socio-economic households, or otherwise disadvantaged young people.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Information Commissioner should
issue the Code developers with guidance.

43. Thorn 2021 Responding to Online Threats Thorn https://info.thorn.org/hubfs/Research/Responding%20to%20Online%20Threats_2021-Full-Report.pdf

Multiple techniques, including hard identifiers and digital identities, also have implications
for children’s right to privacy and security. Even where copies of identity documents are not
kept, sharing and processing highly sensitive ID documents creates privacy and security risks.
Likewise, many biometric methods create other privacy concerns. The Code’s description of
fair and reasonable steps must factor in privacy and security concerns.

Recommendation 6: The Code must ensure that reasonable steps to assure age consider
children’s privacy and security.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Information Commissioner should
issue the Code developers with guidance.

We note that the UK is currently exploring options to develop a statutory code providing guidance around age assurance44. This outlines 11 minimum requirements for age assurance systems, indicating that they must:

1. Protect the privacy of users in accordance with applicable laws
2. Be proportionate with regard to the risks arising from the product or service and to
the purpose of the age assurance system
3. Offer functionality appropriate to the capacity and age of a child who might use the
service
4. Be secure, not expose users or their data to unauthorised disclosure or security breaches, and not use data gathered for the purposes of the age assurance system for any other purpose
5. Provide appropriate mechanisms and remedies for users to challenge or change
decisions if their age is wrongly identified
6. Be accessible and inclusive to users with protected characteristics (e.g. disabilities,
ethnic diversity etc)
7. Not unduly restrict access of children to services to which they should reasonably have
access, for example, news, health and education services
8. Provide sufficient and meaningful information for a user to understand its operation,
in a format and language that they can be reasonably expected to understand,
including if they are a child
9. Be effective in assuring the actual age or age range of a user as required
10. Not rely solely on users to provide accurate information
11. Be compatible with the Data Protection Act 2018, the Age Appropriate Design Code,
the Human Rights Act 1998, the Equality Act 2010, and the United Nations Convention
on the Rights of the Child and General Comment No. 25 (2021) on Children’s Rights in
Relation to the Digital Environment.

These requirements may be useful in developing any guidance for Code developers.

44. Age Assurance Minimum Standards Bill 2021, UK https://bills.parliament.uk/publications/41683/documents/325

The requirement for platforms to obtain parental consent, and take
reasonable steps to verify this consent

Section 26KC 6B-E of the Bill states that the Code must require social media platforms to:

(b) obtain the consent of a parent or guardian of a child who has not reached 16 years
before collecting, using or disclosing personal information of the child
(c) if the OP organisation becomes aware after it collects, uses or discloses personal
information of an individual that the individual is a child who has not reached 16
years, obtain the consent of a parent or guardian of the child as soon as
practicable after becoming so aware
(d) take all reasonable steps to verify the consent obtained for the purposes of
paragraph (b) or (c)

Current Australian guidelines stipulate that — where individual capacity to consent cannot be assessed — parental consent should be obtained for under 15 year olds45. However, these guidelines are routinely overlooked by social media platforms. Requirements 26KC 6B-E would increase the age, and place a responsibility on platforms to comply.

Currently, parental consent is most often obtained through self declaration46. For example,
children are asked for their guardians' details when they join a service, or children's accounts
are linked to their guardians' accounts when they are created. Parents therefore self declare
that they are a child's guardian, and then consent or decline to data processing. Like age
assurance techniques, there are no technically perfect or foolproof methods, but a range of
approaches are possible.
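For illustration only, the sketch below shows one simplified, hypothetical version of the self-declaration flow described above: a guardian asserts their relationship to the child and records a consent decision, which the platform stores for later review. The function and field names are our own assumptions, not anything specified in the draft Bill or used by any particular platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ParentalConsentRecord:
    """Hypothetical record of a self-declared guardian consent decision."""
    child_account_id: str
    guardian_contact: str         # e.g. an email address supplied at sign-up
    guardian_self_declared: bool  # the guardian asserts, unverified, that they are a guardian
    consent_given: bool           # the guardian may consent or decline
    recorded_at: datetime

def record_self_declared_consent(child_account_id: str,
                                 guardian_contact: str,
                                 consent_given: bool) -> ParentalConsentRecord:
    # Self-declaration: the platform takes the guardian's assertion at face value,
    # which is why no method of this kind is technically foolproof.
    return ParentalConsentRecord(
        child_account_id=child_account_id,
        guardian_contact=guardian_contact,
        guardian_self_declared=True,
        consent_given=consent_given,
        recorded_at=datetime.now(timezone.utc),
    )

# Example: a guardian linked at account creation declines data processing.
record = record_self_declared_consent("child-123", "guardian@example.com", consent_given=False)
```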

Requirements around parental and guardian consent can be complex to implement in ways
that advance children’s rights, and require caution. Parental consent can be a blunt
instrument that could reduce the realisation of a range of children’s rights. Specifically, it can
interfere with young people’s right to access the digital world. Many platforms that fall into
the scope of the Bill (or could fall under the Code in future) provide services that children
should be able to access, such as news, information and preventative services.

Requirements to obtain parental consent must also consider children's evolving capacities
and reasonable expectations of privacy and independence from their parents. These evolve
as a child ages, and what may be appropriate for a 9 year old on Messenger Kids may not be
appropriate for a 15 year old on Instagram. However, as currently drafted, the Bill may treat
both identically.

To appropriately balance children's rights and realise their best interests, section 26KC 6 (b)
needs to be significantly revised: it should require platforms to take fair and reasonable steps
to obtain the consent of a parent or guardian of a child, with clauses added to ensure that, in
determining what is fair and reasonable, consideration must be given to children's best
interests, evolving capacity and how their other rights — including and especially the right to
access the digital world — may be affected.

45
Office of the Australian Information Commissioner 2019 Australian Privacy Principles Chapter B,
Clause B.58 www.oaic.gov.au/privacy/australian-privacy-principles-guidelines/chapter-b-key-concepts
46
Simone van der Hof & Sanne Ouburg 2021 Methods for Obtaining Parental Consent and Maintaining
Children’s Rights euConsent
https://euconsent.eu/download/methods-for-obtaining-parental-consent-and-maintaining-children-rig
hts/

Recommendation 7: The requirement for parental consent should be significantly revised
to ‘take fair and reasonable steps’ to obtain parental consent, and that, in determining
what is considered fair and reasonable, consideration must be given to children’s best
interests, their evolving capacity and the need not to unduly restrict children’s access to
services they should reasonably be able to access.

Section 26KC (b) of the draft Bill should be amended to reflect this47.

Any method of obtaining parental consent would also need to consider the needs of young
people living in kinship or alternative care arrangements, or in households and families where
consent may not be straightforward to obtain.

Recommendation 8: The Code must ensure that the requirement to obtain parental
consent does not present disproportionate barriers to children in alternative care
arrangements.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Information Commissioner should
issue the Code developers with guidance.

Parental consent methods must also be privacy preserving. Following guidance from the
euConsent project, this means where possible avoiding “verification methods that use
sensitive personal data or automated profiling, unless it is demonstrably in the best interests
of the child and age appropriate safeguards are in place”48.

Recommendation 9: The Code must ensure that the requirement to obtain parental
consent considers privacy and security.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and/or Office of the Information Commissioner should
issue the Code developers with guidance.

47
This would also allow flexibility for platforms that provide the ‘privacy preserving’ option of not having
to create an account to view content. Where platforms provide access to content without requiring a
login or account creation, the necessity to obtain parental consent may create barriers for all users (child
and adult) who do not have an account. Provided such content is not a restricted access service, it
would be an unhelpful outcome if child and adult users had to hand over more data to create accounts
to comply with the parental consent requirement. Any data obtained from users without accounts
would therefore need to be treated in low risk ways in case it is, and as if it were, children’s data.
48
Simone van der Hof & Sanne Ouburg 2021 Methods for Obtaining Parental Consent and Maintaining
Children’s Rights euConsent
https://euconsent.eu/download/methods-for-obtaining-parental-consent-and-maintaining-children-rig
hts/

Requiring parental consent is not a sufficient step and would not significantly improve the
digital environment that children experience. Instead, it places responsibility on parents to
decide if platforms can or cannot process their children’s data. As a matter of principle,
parents should not be asked to consent to risky data processing, nor data processing that is
not in their child’s best interests, in the first instance. The focus must remain on ensuring that
all data processing is in children’s best interests.

The scope of the Bill, and ensuring that all organisations that
undertake ‘risky’ processing of children’s data are covered

As currently drafted, the Bill proposes applying the best interests principle only to social
media platforms. While we agree that social media platforms pose significant risks to
children and their data, so too do data brokers and large online platforms.

For example, data brokers hold reams of information about families, including children, and
about children themselves49. A quick online search found one Australian data brokerage
selling access to families based on:

● Details about 150,000 children’s dates of birth, gender and postcode, and mothers’
contact details50. This data was initially harvested from a ‘baby gift pack scheme’,
where mothers collect a baby gift pack from a leading national retailer when their
baby is between 2 and 4 months old.
● Details about nearly 15,000 children’s year of birth and postcode, and parents’ contact
details,51 harvested when parents responded to a mail offer for a personalised baby
product. The product was described as a ‘special celebration memento announcing
the baby’s birth details such as; name, birth date, birth time, weight and length.... The
memento is the ideal and special keepsake’

It is unclear what safeguards are in place to ensure that children’s data is protected when it is
sold or rented, nor how a trade in their date of birth reflects their best interests.

Many large digital platforms also present risks. For example:

● AdTech platforms collect and use significant data about children and it is unclear if
this is in children’s best interests (see case study below). While ads are often placed in
social media feeds this is not always the case, and they are also placed in other
platforms and websites. AdTech itself would therefore benefit from being covered by
the best interests principle.

49
For example, there are international cases of data brokers selling information by school grade. Eg
Christian Hetrick 2021 ‘NJ data broker tries to sell information on a million kids’ Philadelphia Inquirer
www.inquirer.com/business/technology/alc-princeton-data-broker-personal-info-million-kids-vermont-l
aw-20190319.html
50
The List Group, nd ‘Baby Gift Pack’ https://thelistgroup.com.au/wp-dynamic/list-detail.php?card-id=11
51
The List Group, nd ‘Baby product buyers’
https://thelistgroup.com.au/wp-dynamic/list-detail.php?card-id=12

Case study: is micro targeted commercial advertising in children’s best interests?

Micro targeted (or surveillance) advertising is the practice of targeting advertising to those
most vulnerable to an ad. It differs substantially from traditional and contextual advertising.
Young people’s online behaviour is tracked, and their data is fed into powerful algorithms
that calculate the likelihood of each child ‘interacting’ with each ad. Ads with a high
likelihood of interaction are then promoted into a child’s feed. Given the AdTech
environment, these ads will often ‘follow’ young people across different platforms. This is
very different to billboard advertising products to young people — this is a billboard that
uses machine learning techniques to decide who to ‘stand in front of’ and then follows
them around town.
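For concreteness, here is a minimal, hypothetical sketch of the ranking step described above: tracked behavioural signals feed a predicted probability of interaction, and the ads with the highest predicted probability are promoted into the feed. The scoring function is a deliberately simplified stand-in for the far more complex models and data that real AdTech systems use.

```python
from dataclasses import dataclass

@dataclass
class Ad:
    ad_id: str
    topic: str

def predict_interaction_probability(user_signals: dict, ad: Ad) -> float:
    # Hypothetical stand-in for a learned model: the more a user's tracked
    # behaviour matches the ad's topic, the higher the predicted probability.
    return min(1.0, user_signals.get(ad.topic, 0.0))

def rank_ads_for_feed(user_signals: dict, ads: list, top_n: int = 2) -> list:
    # Ads with the highest predicted interaction probability are promoted into the feed.
    return sorted(ads, key=lambda ad: predict_interaction_probability(user_signals, ad),
                  reverse=True)[:top_n]

# Example: tracked behaviour suggests strong engagement with diet-related content,
# so diet-related ads rank first and 'follow' the user into their feed.
signals = {"diet": 0.9, "gaming": 0.4, "music": 0.1}
ads = [Ad("a1", "music"), Ad("a2", "diet"), Ad("a3", "gaming")]
print([ad.ad_id for ad in rank_ads_for_feed(signals, ads)])  # ['a2', 'a3']
```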

The power asymmetry, between a young person and this AI enabled advertising practice,
produces a range of issues:

● Young people and families are troubled by this. 82% of 16 & 17 year olds have come
across ads that are so targeted they felt uncomfortable52 and 65% of parents were
uncomfortable with businesses targeting ads to children based on information they
have obtained by tracking a child online53
● Children struggle to distinguish between advertising and information in general,
but this is exacerbated by these sorts of digital advertising techniques54
● Advertising and increased commercial pressures on children can lead to
consumerism55, disappointment, frustration and parent-child conflict56

‘It concerns me how accurately advertisers can target me from things I was unaware they could
collect data from’ - Young man 17, South Australia

● EdTech platforms, which develop data-heavy products specifically targeting children.
Many of these are prescribed by state funded schools for children’s use.

Education data presents risks for children, far beyond grades and report cards. School
administration often includes sensitive behavioural, health, social service and other
childhood ‘outcomes’ data. UNICEF notes that “Ed Tech software may be used to

52
Dylan Williams, Alex McIntosh & Rys Farthing 2021 Keep it to a Limit Reset Australia
https://au.reset.tech/uploads/resettechaustralia_policymemo_pollingreport_final-oct.pdf
53
Office of the Australian Information Commissioner 2020 Australian Community Attitudes to Privacy
https://www.oaic.gov.au/__data/assets/pdf_file/0015/2373/australian-community-attitudes-to-privacy-surv
ey-2020.pdf
54
Laura Owen, Charlie Lewis, Susan Auty, Moniek Buijzen 2012 ‘Is children’s understanding of
non–traditional advertising comparable to their understanding of television advertising?’ Journal Public
Policy Mark. 32(2):195–206 doi.org/10.1509/jppm.09.003
55
Helga Dittmar, Rod Bond, Megan Hurst & Tim Kasser 2014’The relationship between materialism and
personal well-being: A meta-analysis’ Journal of Personality & Social Psychology, 107, 879 924
doi.org/10.1037/a0037409
56
Matthew Lapierre, Frances Fleming-Milici, Esther Rozendaal, Anna R McAlister, Jessica Castonguay
2017 ‘The Effect of Advertising on Children and Adolescents’ Pediatrics Nov;140(Suppl 2):S152-S156.
doi:10.1542/peds.2016-1758V
Sandra Calvert 2008 ‘Children as Consumers: Advertising and Marketing’ Future Child Spring
18(1):205-34. doi: 10.1353/foc.0.0001
Moniek Buijzen, Patti Valkenburg 2003 ‘The effects of television advertising on materialism, parent–child
conflict, and unhappiness: a review of research’. Journal Applied Developmental Psychology
24(4):437–456 doi.org/10.1016/S0193-3973(03)00072-8

support school administration, to enable virtual classrooms, or to monitor student
behaviour. Data collected through this technology is used to predict outcomes for
individual students and schools as well as for child protection and security through
applications that block certain websites or flag students who are deemed to be at risk
of engaging in what are considered negative behaviours57.” Data collected in schools is
often extremely sensitive.

While some of this data remains in the hands of states and educational authorities,
the pandemic dramatically increased the use of commercial EdTech products in the
classroom. The scale of access this creates warrants regulation: the Australian EdTech
sector doubled in the two years before the pandemic58, and every indication suggests
an explosion since then. While Australian data is still being collected, internationally
Google Classroom, for example, reported an increase from 40 million users in 2020
to 150 million users in 202159.

Much of this data harvesting happening within schools is not, and should not be,
managed by schools themselves. Schools often do not have the legal or technical
skills to understand data risks60, which leaves them less effective at holding EdTech
providers to account.

These risks also cannot be managed by children and families. Given the collective
nature of classroom teaching, children can rarely opt out of EdTech products without
consequences for their educational experience.

On the other hand, EdTech providers are often skilled in interpreting the law, and their
scale means they are able to collect significant amounts of children’s data61. To
address these risks and power imbalances, EdTech platforms must be required to
process data in ways that prioritise children’s best interests.

● Health and Wellbeing platforms. Many commercial apps collect extremely sensitive
data about users, sometimes exclusively attracting vulnerable young people. The
sensitivity of this data means it can pose significant risks for young people.

● Game developers and mobile gaming platforms may be developing products that use
children’s data in risky ways. For example, children’s data can be used to develop
games that are extremely sticky or potentially addictive. Extending game play
presents significant commercial opportunities, often through in-app game purchases
or loot boxes, but it is unclear if this is in children’s best interests.

57
Lindsay Barrett 2020 Issue Brief on Student Data Governance UNICEF Office of Global Insight and
Policy www.unicef.org/globalinsight/reports/governance-student-data
58
Deloitte 2019 The Australian Ed Tech Market Census Deloitte
www2.deloitte.com/content/dam/Deloitte/au/Documents/public-sector/deloitte-au-ps-australian-edtech
-market-census-210720.pdf
59
Ben Williamson 2021 ‘Google’s plans to bring AI to education make its dominance in classrooms more
alarming’ Fast Company www.fastcompany.com/90641049/google-education-classroom-ai
60
Emma Day 2021 Governance of data for children in UK State Schools Digital Futures
Commission/5Rights Foundation
https://digitalfuturescommission.org.uk/wp-content/uploads/2021/06/Governance-of-data-for-children-l
earning-Final.pdf
61
Emma Day 2021 Governance of data for children in UK State Schools Digital Futures
Commission/5Rights Foundation
https://digitalfuturescommission.org.uk/wp-content/uploads/2021/06/Governance-of-data-for-children-l
earning-Final.pdf

Where data brokers or large online platforms deal with significant amounts of children’s data
they must be bound by the best interests principle too. Section 26KC 6(e) & (f) — the best
interests requirement — must be applied to OP organisations described in section 6W 1
through 8.

Undertaking fair and reasonable steps to assure users’ age would be necessary to ensure
platforms could establish that children’s data was being handled in their best interests.

Given the capacity of larger online platforms to function in children’s best interests, such as
providing news, information, educational material or counselling and preventative services,
requirements for parental consent should not be applied to this group. This would ensure
that children could access vital services without needing to make disclosures to their parents.

Recommendation 10: Requirements around the best interests principle and age assurance
must be applied to larger online platforms and data brokers.

Section 26KC (e) & (f) of the draft Bill should be amended to reflect this.

If this recommendation is adopted, the definition of large online platforms would need to be
expanded to account for children in order to be effective. The definition should be revised to
2.5m end users, or 560k end users under 18, the latter reflecting roughly 10% of Australia’s
under-18 population62.
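As a rough arithmetic check — taking the under-18 population to be approximately 5.6 million, which is consistent with the ABS estimates cited at footnote 62 — the proposed child-specific threshold works out as:

```latex
\[
560{,}000 \;\approx\; 10\% \times 5.6\ \text{million Australians under 18}
\]
```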

Recommendation 11: The definition of Large Online Platforms be expanded to include a
proportional provision for under 18 year olds.

Section 6W4A of the draft Bill should be amended to reflect this.

62
ABS 2020 National State and Territory Population
https://www.abs.gov.au/statistics/people/population/national-state-and-territory-population/latest-releas
e#data-downloads-data-cubes

Concerns about the Code Developers

The Code developer must hold the necessary expertise to advance children’s rights. While we
respect the role of co-regulation in the Australian regulatory environment more broadly, we
have significant concerns about the effectiveness of co-regulation in the tech sector. This
includes concerns around the expertise of the social media sector to draft a Code that realises
children’s best interests.

We do not believe that the sector has demonstrated appropriate expertise around children’s
best interests for three key reasons:

1. The Social Media Sector has systemic compliance issues with existing children’s
privacy regulations

Platforms have been repeatedly accused of, and fined for, breaching existing children’s
data protection and privacy laws. This does not suggest strong expertise in
advancing children’s privacy and improving data practices. Indeed, it suggests that their
practical expertise already lags behind existing regulation. For example:

● Earlier this year, the Texas Attorney General accused Google of monopolistic
practices in deliberately stalling efforts to strengthen children’s online privacy
laws in the US, and documented Google executives ‘bragging’ about stalling
EU attempts at improving consumer privacy63
● TikTok are currently facing a £1b plus lawsuit led by the UK’s former Children’s
Commissioner for excessive data collection practices64
● YouTube are also facing a £2b ‘class action’ for unlawfully tracking and
collecting children’s data65
● In 2019, the FTC settled cases with:
○ Musical.ly (now TikTok) for using children’s data without the necessary
parental consent, for $5.7m USD66
○ Google and their subsidiary YouTube for using children’s data without
necessary parental consent, for $170m USD67

2. The Social Media Sector engages in data practices that are demonstrably not in
children’s best interests

Platforms also process data in a range of worrying ways, which we would hope the Code
would specifically prohibit. There is no reason platforms must use children’s data in risky

63
Leah Nylen 2021 ‘Google sought fellow tech giants help in stalling kids privacy protections’ Politico
www.politico.com/news/2021/10/22/google-kids-privacy-protections-tech-giants-516834
64
BBC 2021 ‘TikTok sued for billions over use of children’s data’ BBC
www.bbc.co.uk/news/technology-56815480
65
YouTube Data Claim 2020 ‘YouTube Data Claim’ www.youtubedataclaim.co.uk/
66
FTC 2019 ‘Musical.ly agrees to settle’
www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ft
c
67
FTC 2019 ‘Google and YouTube will Pay Record $170m for Alleged Violations of Children’s Privacy Law’
www.ftc.gov/news-events/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-vio
lations

or privacy-reducing ways; these are business decisions made by each platform reflecting
a prioritisation of commercial interests over children’s best interests. This again indicates
a lack of understanding about children’s best interests and how to advance their privacy.
For example:

● A range of social media platforms — including Instagram, Twitch, Twitter,
Reddit, TikTok and Spotify — deploy deceptive dark patterns when presenting
their privacy policies and terms of service68. These are design choices that
platforms have made to present their policies in ways that are deliberately
unclear and difficult to understand, contrary to the intent of the Bill.

● The collection of geolocation data is used to drive unsafe practices. We would
hope the Code would make clear that geolocation data should be ‘off’ by
default for under 18 year olds.

Case study: SnapMaps and Geolocation data

Snapchat has a feature called “Snap Map” that broadcasts a user's location in real time to anybody in
their contact list. All Snapchat users are also able to browse this map and click on any location to see
popular videos and photos by users in that area, even those posted by people who are not in their
contact lists.

Using Snap Map we were able to navigate to a local high school and view content that had been
broadcast from inside a classroom, including videos of school children.

1. Left: A screenshot from Snap Map. The cyan ‘heat zone’ is located over the entrance to a high
school, signaling that there is lots of content to look at if you click on the school gates.
Clicking on it takes you straight into the content.
2. Right: A screenshot of some video content that is publicly available from inside a school, in
this case a bunsen burner in a science lab. For ethical reasons, we are highlighting the least
identifying image we found publicly available.

68
Dylan Williams, Alex McIntosh & Rys Farthing 2021 Did we really consent to this?
https://au.reset.tech/news/did-we-really-consent-to-this-terms-and-conditions-young-people-s-data/

● Many platforms use children’s data to drive recommender systems that
promote worrying content to children’s accounts. For example, TikTok’s
follower recommendations will ‘learn’ that users are interested in accounts
associated with eating disorders and suggest similar accounts for them to
follow69.​

A screenshot from a research account on TikTok70, which is being recommended eight
accounts to follow, all of which appear to be related to eating disorder content (TW is
short for trigger warning, ED eating disorder, Ana anorexia, DNI Do Not Interact).

69
Suku Sukunesan 2021 ‘Sexual-predators online are targeting teens wanting to lose weight platforms
are looking the other way’ The Conversation
https://theconversation.com/anorexia-coach-sexual-predators-online-are-targeting-teens-wanting-to-lo
se-weight-platforms-are-looking-the-other-way-162938. Note: This is not an inevitable function of
recommender systems. TikTok appear to have been able to train their content recommender systems
against promoting weight loss content in their For You Page, but appear not to have done so for
suggested accounts to follow (See Dylan Williams, Alex McIntosh & Rys Farthing 2021 Surveilling young
people online https://au.reset.tech/uploads/resettechaustralia_policymemo_tiktok_final_online.pdf)
70
Suku Sukunesan 2021 ‘Sexual-predators online are targeting teens wanting to lose weight platforms
are looking the other way’ The Conversation
https://theconversation.com/anorexia-coach-sexual-predators-online-are-targeting-teens-wanting-to-lo
se-weight-platforms-are-looking-the-other-way-162938.

3. The Social Media Sector does not have sufficient expertise when it comes to
developing effective Codes

Recent experience of the social media sector’s involvement in Code development does
not suggest they are well equipped for the task. The mis- and disinformation code,
developed by Digi, has been roundly criticised as weak. The ACMA themselves outlined
that the Code ‘failed to meet expectations’71. Given the strength of community
expectations around children’s online safety and privacy, and the scale of the risks
children face, it would be worrying if the same industry representative were involved in
developing the Online Privacy Code.

Our preference would be for the Information Commissioner to draft the Code in the first
instance. However, we understand that the process as currently legislated requires the
Information Commissioner to look for a Code developer from industry first.

In identifying an appropriate Code developer, the Information Commissioner must decide if
an entity has the appropriate expertise to develop a Code. The track record of many of these
companies and their representative bodies clearly demonstrates that they do not have the
appropriate expertise when it comes to children’s best interests.

Recommendation 12: Serious consideration needs to be given to the expertise of the
social media sector and their industry representatives to develop an adequate code. Their
track record around data use, compliance with existing children’s privacy regulations and
code drafting demonstrates that they do not have sufficient expertise in children’s best
interests.

We note that this will not necessarily require any amendments to the Bill itself, but the
Information Commissioner must ensure that these considerations are centralised in the
Code development process.

Allowing the social media sector to draft the Code would not be in children’s best interests.
We do not believe that allowing the sector to draft the Code in the first instance would be a
case of ‘poacher turned gamekeeper’. It would simply be a case of ‘poacher writes Code’.
Keeping in mind that the ‘game’ in this metaphor is children and their data, this is deeply
worrying.

The social media sector’s practical expertise would be more appropriately harnessed through
consultation. Alongside this industry consultation, we would like to see comparable
engagement with civil society, children’s advocates and the Children’s Commissioner to
ensure the full range of expertise and experience necessary to develop this Code is
harnessed.

71
Zoe Samios & Lisa Visentin 2020 ‘ACMA: Tech giants' code to handle fake news fails to meet
expectations’ Sydney Morning Herald
www.smh.com.au/politics/federal/acma-tech-giants-code-to-handle-fake-news-fails-to-meet-expectatio
ns-20201026-p568oq.html

Recommendation 13: Civil society, children’s advocates and the Children’s Commissioner
should be included in the process of Code co-development.

We note that this will not necessarily require any amendments to the Bill itself, but the
Information Commissioner must ensure that this is centralised in the Code development
process.

Recognising children’s right to participate in the Code co-development process

The strength of a co-regulatory approach rests on the ability to lean into all forms of expertise.
That has to include children and young people’s expertise. This is also an essential part of
realising their best interests and right to participate.

We welcome the commitments already made by the Attorney General’s Office and Office of
the Australian Information Commissioner to meet and engage with young people directly,
and would like to see a more extensive consultation with children undertaken.

Recommendation 14: Children and young people must be involved in the co-development of
the Code.

We note that this will not necessarily require any amendments to the Bill itself, but the
Attorney General’s Department and Information Commissioner must ensure that this is
centralised in the Code development process.

Recommendations
1. The Code must clearly outline how the best interests principle should operate in
practice, and Code Developers must be given a clear set of necessary
expectations. This will not necessarily require any amendments to the Bill itself, but
the Attorney General’s Department and/or Office of the Information Commissioner
should issue the Code developers with a clear list of ‘necessary expectations’ that they
must include and go beyond. This outcomes-based approach should provide a floor
that ensures the Code delivers on the best interests in practice.

2. The Code must clearly outline that the best interests principle applies to uses of
children’s data that are central to the functioning of platforms. Code Developers
must be given a clear set of necessary systems to address. This will not necessarily
require any amendments to the Bill itself, but the Attorney General’s Department
and/or Office of the Australian Information Commissioner should issue the Code
developers with a clear list of ‘necessary applications’, and systems and functions, the
Code must address and go beyond.

3. The Code must ensure that it creates requirements for children’s best interests to
be considered ‘pre-emptively’ as well as in the ‘outcomes’ from platforms
collection and use of data. This will not necessarily require any amendments to the
Bill itself, but the Attorney General’s Department and/or Office of the Australian
Information Commissioner should inform Code developers that they expect that best
interests will be considered across the product development cycle. This includes for
example, data protection impact assessments and algorithmic audits.

To meet recommendations one, two and three, the necessary expectations, applications
and processes that Code Developers must include and exceed could, for example, be:

● The requirement to clearly codify children’s data rights, and set minimum expectations that
the Code:
○ Requires active, expressed consent — Data should only be processed where children
(and parents for younger children) have meaningfully consented and know about it
(except where it is in their best interests, e.g. medical emergencies). Meaningful
consent requires that, among other things, Terms of Service and Privacy notices must
be age-appropriate. This means a Code must stipulate that
■ Terms be published in plain speak, accessible to the youngest users (as per
26KCC in the Bill)
■ Terms must be enforced. Providers should live up to their terms of service
and privacy policies, and children should have a right of redress if they do not
■ There must be a clear process to ‘make things right’ where terms are
breached. Children should be able to exercise their rights easily
○ Requires transparency around data processing — Children should be informed about
and have consented to every instance of data processing (except where it is in their
best interests). This requires purpose limitation.
○ Requires data minimisation and restricted data sharing — Only strictly necessary
data should be collected, and it should not be circulated (except where it is in their
best interests)
○ Requires accessible ways for children to request, access, correct and delete their data,
and request that it not be used (as per 26KC 4a in the Bill)
○ Requires default settings for under 18s’ accounts to be high privacy and low risk, and
to make clear any risks associated with changing them (a minimal illustrative sketch
appears below this list)

● Make explicit that children’s best interests must be the primary consideration in data
processing around:
○ Recommender systems and algorithms
○ Automated decision making and profiling
○ Digital marketing and commercial profiling
○ Testing and training for ‘persuasive’ design
○ Geolocation data
● Ensure children are considered in advance by requiring children’s data protection impact
assessments to be undertaken before any data about under 18 year olds is processed.
These must be made available to the OAIC and public on request.
● Ensure that the outcomes from the collection and use of children’s data are adequately
monitored. For example, this may require algorithmic audits. These audits must be made
available to the OAIC and public on request.
● Ensure that children and young people are respected as digital citizens. This means services
can’t shut them out, or downgrade their service because it’s ‘too hard’ to meet their needs

These outcomes build on the Children’s Data Code Campaign and align with the rules that
16 & 17 year olds described when polled in April 2021.
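To illustrate the ‘high privacy and low risk by default’ expectation above — using an entirely hypothetical settings model of our own, not any platform’s actual implementation — the sketch below shows under-18 accounts starting from restrictive defaults, with any change away from those defaults accompanied by a plain-language note about the associated risk.

```python
from dataclasses import dataclass, field

# Hypothetical settings and risk notes; real platforms expose far more options.
HIGH_PRIVACY_DEFAULTS = {
    "profile_visibility": "private",
    "geolocation_sharing": False,
    "personalised_advertising": False,
    "data_sharing_with_third_parties": False,
}

RISK_NOTES = {
    "profile_visibility": "A public profile can be seen and contacted by strangers.",
    "geolocation_sharing": "Sharing your location can reveal where you live or go to school.",
    "personalised_advertising": "Personalised ads are based on tracking what you do online.",
    "data_sharing_with_third_parties": "Shared data may be used in ways you cannot see or control.",
}

@dataclass
class AccountSettings:
    age: int
    settings: dict = field(default_factory=dict)

    def __post_init__(self):
        if self.age < 18:
            # Under-18 accounts start from high privacy, low risk defaults.
            self.settings = dict(HIGH_PRIVACY_DEFAULTS)

    def change_setting(self, name: str, value) -> str:
        # Any move away from a default returns a plain-language risk note
        # that the interface would be expected to show the user.
        self.settings[name] = value
        return RISK_NOTES.get(name, "")

# Example: a 15 year old's account defaults to geolocation off; switching it on
# surfaces an explanation of the risk involved.
account = AccountSettings(age=15)
print(account.settings["geolocation_sharing"])            # False
print(account.change_setting("geolocation_sharing", True))
```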

4. The Code must ensure that reasonable steps to assure age for the purpose of
ensuring age-appropriate data handling practices are precautionary. If in doubt, it
must be assumed that data could belong to a child, and the data must be handled
accordingly. This
will not necessarily require any amendments to the Bill itself, but the Attorney
General’s Department and/or Office of the Information Commissioner should issue
the Code developers with guidance.

5. The Code must ensure that reasonable steps to assure age do not produce
barriers to children from low socio-economic households or otherwise
disadvantaged young people. This will not necessarily require any amendments to
the Bill itself, but the Attorney General’s Department and Office of the Information
Commissioner should issue the Code developers with guidance.

6. The Code must ensure that reasonable steps to assure age consider
children’s privacy and security. This will not necessarily require any amendments to
the Bill itself, but the Attorney General’s Department and/or Office of the Information
Commissioner should issue the Code developers with guidance.

The UK’s draft Age Assurance Minimum Standards Bill72 may be helpful in developing any
guidance for Code developers to achieve recommendations four, five and six.

7. The requirement for parental consent should be significantly revised to ‘take fair
and reasonable steps’ to obtain parental consent, and that, in determining what is
considered fair and reasonable, consideration must be given to children’s best
interests, their evolving capacity and the need not to unduly restrict children’s access
to services they should reasonably be able to access. Section 26KC (b) of the draft Bill
should be amended to reflect this.

72
Age Assurance Minimum Standards Bill 2021, UK
https://bills.parliament.uk/publications/41683/documents/325

8. The Code must ensure that the requirement to obtain parental consent does not
present disproportionate barriers to children in alternative care arrangements.
This will not necessarily require any amendments to the Bill itself, but the Attorney
General’s Department and/or Office of the Information Commissioner should issue
the Code developers with guidance.

9. The Code must ensure that the requirement to obtain parental consent considers
privacy and security. This will not necessarily require any amendments to the Bill
itself, but the Attorney General’s Department and/or Office of the Information
Commissioner should issue the Code developers with guidance.

euConsent’s framework for obtaining parental consent in ways that maintain children's
rights73 may be helpful in developing any guidance for Code developers to achieve
recommendations eight and nine.

10. Requirements around the best interests principle and age assurance must be
applied to larger online platforms and data brokers. Section 26KC (e) & (f) of the
draft Bill should be amended to reflect this.

11. The definition of Large Online Platforms be expanded to include a proportional
provision for under 18 year olds. Section 6W4A of the draft Bill should be amended
to reflect this.

12. Serious consideration needs to be given to the expertise of the social media
sector and their industry representatives to develop an adequate Code. Their
track record around data use, compliance with children’s privacy regulations and
code drafting demonstrates that they do not have sufficient expertise in
children’s best interests. This will not necessarily require any amendments to the Bill
itself, but the Information Commissioner must ensure that these considerations are
centralised in the Code development process.

13. Civil society, children’s advocates and the Children’s Commissioner should be
included in the process of Code co-development. This will not necessarily require
any amendments to the Bill itself, but the Information Commissioner must ensure
that this is centralised in the Code development process.

14. Children and young people must be involved in the co-development of the Code.
This will not necessarily require any amendments to the Bill itself, but the Attorney
General's Department and Information Commissioner must ensure that this is
centralised in the Code development process.

73
Simone van der Hof & Sanne Ouburg 2021 Methods for Obtaining Parental Consent and Maintaining
Children’s Rights euConsent
https://euconsent.eu/download/methods-for-obtaining-parental-consent-and-maintaining-children-rig
hts/
