3 Social Media Platforms as Public Health Arbiters
3.1 Introduction
3.1.1 Suicide Prevention and the Use of Technology
Suicide is a growing, global public health problem.1 No corner of the world remains untouched by the increasing incidence and prevalence of suicidal behaviour across the lifespan. Every forty seconds, an individual takes their own life somewhere in the world. Suicide is the second leading cause of death among adolescents and young adults aged 15–26,2 and the suicide rates among this age group have been on the rise over the past decade.3 Recently, there has also been an alarming trend of increased suicidal behaviour and suicide among children as young as five years old.4
1 This is an extended version of an article published in the Journal of Law and the Biosciences
where we argued that Facebook’s practices in this area should be subject to well-established
protocols such as an ethical review process and the explicit approval of its users in the form
of informed consent. We also proposed a fiduciary framework with the assistance of a
panel of external impartial experts from various fields and diverse backgrounds. See KL
Celedonia and others, ‘Legal, Ethical, and Wider Implications of Suicide Risk Detection
Systems in Social Media Platforms’ (2021) 8(1) Journal of Law and the Biosciences, https://
doi.org/10.1093/jlb/lsab021, accessed 23 May 2021.
2 P Rodríguez Herrero, A de la Herrán Gascón and V de Miguel Yubero, ‘The Inclusion of
Death in the Curriculum of the Spanish Regions’ (2020) Compare: A Journal of Comparative
and International Education, 1–19.
3 MF Hogan, ‘Suicide Prevention: Rising Rates and New Evidence Shape Policy Options’ in
The Palgrave Handbook of American Mental Health Policy (Springer 2020) 229–57.
4 AH Sheftall and others, ‘Suicide in Elementary School-Aged Children and Early Adolescents’
(October 2016) 138(4) Pediatrics e20160436, https://doi.org/10.1542/peds.2016-0436, accessed
23 May 2021.
illnesses are more accepted, stigma still exists, and as such, many individuals with mental illness choose not to voluntarily disclose their disorders. Non-health care entities like social media platforms revealing someone’s mental state to public officials could therefore be construed as a violation of an individual’s right to disability non-disclosure.
Compounding these issues of stigma and involuntary disclosure of mental illness is the possibility that a suicide detection algorithm will produce false positives for suicide risk. As with mobile health apps for suicide prevention, these detection algorithms were not properly researched before their implementation, and they still have not been. Data on the algorithms’ accuracy are not publicly available, and anecdotal accounts of the algorithms flagging suicide risk where none was present are starting to surface. In countries where mental illnesses are more accepted, such mistakes may mean little more than serious inconvenience and unnecessary resource expenditure, but in countries where mental illnesses remain misunderstood and severely stigmatised, a mistake of this kind could ruin an individual’s life.
With over 2.6 billion users worldwide and an emerging trend of users live-streaming suicide attempts, one might argue that social media platforms like Facebook have a moral obligation to develop suicide detection algorithms to protect their users. To refrain from doing anything to prevent suicide among its users could itself be viewed as unethical. However,
given that research is revealing that mental health issues like depression,
which is a risk factor for suicide, are associated with social media use,18
one may wonder if suicide detection algorithms on social media platforms
are nothing more than these platforms trying to fix a problem that they
themselves created or had a role in exacerbating. In addition to possibly
contributing to mental illnesses that put individuals at risk for suicide,
the trend of live-streaming suicide attempts on social media creates the
possibility of behavioural contagion in which viewers of these videos may
mimic the behaviour they witness.19 Furthermore, studies show that there
is a significant positive, predictive association between social media use
and loneliness: the more time an individual spends using social media,
18 MG Hunt and others, ‘No More FOMO: Limiting Social Media Decreases Loneliness and Depression’ (2018) 37(10) Journal of Social and Clinical Psychology, 751–68, https://guilfordjournals.com/doi/abs/10.1521/jscp.2018.37.10.751, accessed 23 May 2021.
19 M Westerlund, G Hadlaczky and D Wasserman, ‘Case Study of Posts before and after a
Suicide on a Swedish Internet Forum’ (December 2015) 207(6) British Journal of Psychiatry,
476–82, https://doi.org/10.1192/bjp.bp.114.154484, accessed 23 May 2021.
the more lonely they report feeling;20 loneliness is often associated with
suicidal behaviour.21
20 M Savci and F Aysan, ‘Relationship between Impulsivity, Social Media Usage and
Loneliness’ (March 2016) 5(2) Educational Process: International Journal, 106–15, https://
doi.org/10.12973/edupij.2016.52.2, accessed 23 May 2021.
21 A Stravynski and R Boyer, ‘Loneliness in Relation to Suicide Ideation and Parasuicide: A Population-Wide Study’ (2001) 31(1) Suicide and Life-Threatening Behavior, 32–40, https://guilfordjournals.com/doi/abs/10.1521/suli.31.1.32.21312, accessed 23 May 2021.
3.5 Conclusion
Suicide detection algorithms implemented on social media platforms have the potential to prevent death by suicide. With billions of individuals across the globe using social media platforms daily, the reach of such
a public health intervention is unprecedented. However, the usefulness, and indeed the appropriateness, of suicide detection algorithms on social media platforms must be assessed against established ethical principles and within the cultural context in which the algorithms are deployed. Because social media platforms that deploy this technology act as de facto health care providers, collecting and using personal health information through the suicide detection algorithm, they should be held to the same ethical standards as legitimate health care providers.
Acknowledgements
The research for this chapter was supported by a Novo Nordisk Foundation
grant for a scientifically independent Collaborative Research Program in
Biomedical Innovation Law (grant agreement number NNF17SA0027784)
and by the Alexander von Humboldt-Stiftung, Bonn, Germany.