
Ethics in Technology for Clinical Psychology

Thomas D Parsons, Computational Neuropsychology & Simulation (CNS), University of North Texas, Denton, TX, United States
© 2022 Elsevier Ltd. All rights reserved.

Introduction
Ethical and Legal Issues
Legal Issues Related to Technology Use in Clinical Psychology
Ethical Principles for Clinical Psychologists Using Technologies
Beneficence and Nonmaleficence
Privacy and Confidentiality
Boundaries
Special Populations
Assessment Technologies
Considerations for Ethical Decision Making
Technologies Extending Cognition
Ethical Considerations for Technologies Extending Cognition
Future Research Directions
Clinical Applications and Recommendations
Telepsychology and eTherapy
Neuroethics of Enhancements
Conclusion
References

Introduction

Innovations in information and communication technologies now allow clinical psychologists to communicate (visually and verbally) with patients in most parts of the world. Communication is important for therapeutic alliances, and progress in communication technologies has significantly affected clinical psychology. Clinical psychologists are demonstrating increased attention to technologies that may aid delivery of services and practice management (Norcross et al., 2002, 2013). With these technological advances come ethical challenges that may hinder the process, efficacy, and even security of psychotherapy. Of particular concern is the noteworthy rise in online therapy delivery, professional social networking sites for therapists, continuing education, and therapist search engines. For clinical psychologists this may result in professional and personal overlap with patients. While there are steps that clinical psychologists can take to separate their private lives from their clients, nonprofessional online activities may still occur in the online space shared with their clients. This shared space can lead to various ethical dilemmas. While professional ethical guidelines have been developed that offer direction for offline client-therapist interactions, guidelines for online interactions will need to continually develop in line with the rapid growth and proliferation of technologies.
Most clinical psychologists received limited training in information technologies and may not be prepared for technology-
related ethical challenges (e.g., privacy, electronic security, legal implications). Professional ethical guidelines provided by the American Psychological Association (2013a,b,c) and International Society for Mental Health Online (2009) may need to be updated as
new technologies emerge in new sociocultural contexts. Moreover, continuing education for clinical psychologists is needed because
technologies update and change often. Practice guidelines developed for offline (face-to-face) therapy may have limited generalizability to clinical practices online. This chapter considers potential ethical concerns for clinical psychologists and their interactions
with clients in the digital era. This will include considerations about whether a client’s disposition and/or situation call for eTherapy.
First, legal and ethical issues related to privacy (e.g., confidentiality), electronic security, and boundaries are discussed. Here, the
relevance and application of ethical codes and guidelines are emphasized. Next, ethical challenges related to technologies that
may extend cognitive, affective, and social processes will be considered (Parsons, 2019a,b).

Ethical and Legal Issues


Legal Issues Related to Technology Use in Clinical Psychology
There are existing regulatory structures that clinical psychologists should consider when using technologies in their research and
practice. The first is the Health Insurance Portability and Accountability Act (HIPAA; 1996). This regulatory framework requires clinical psychologists to utilize sufficient protections to safeguard their patients' digital medical privacy. For instance, clinical psychologists desiring to deliver telepsychological assessments and interventions should adhere to HIPAA guidelines
when evaluating the risks associated with various software, hardware, and network platforms. The second regulatory structure is
the Health Information Technology for Economic and Clinical Health Act (HITECH Act, 2009). The HITECH Act focuses on the
privacy and security risks associated with electronic transmission of health information. Clinical psychologists may need to consult
the HITECH Act (2009) when they are responsible for signing business associate agreements. For example, a clinical psychologist may need to enter data into third-party cloud storage services. A third regulatory structure is the Family Educational Rights and Privacy Act (FERPA) of 1974. FERPA (also known as the Buckley Amendment) regulates access to information (e.g., educational records and data)
by public entities such as prospective employers, publicly financed educational organizations, and foreign governments. Clinical
psychologists working with college students will want to consider FERPA.
Clinical psychologists need to garner and maintain an understanding of American Psychological Association guidelines, codes,
and licensure jurisdiction of use. This includes a thorough reading and understanding of HIPAA, HITECH, and FERPA.
Moreover, clinical psychologists can attend technology-oriented workshops and continuing education programs that focus on legal
considerations. In some cases, psychologists will need to consult with attorneys who specialize in healthcare policy and privacy.
Finally, psychologists need to be able to describe the legal protections that patients/participants have when data are created, as
well as detail what regulations apply to psychological work.

Ethical Principles for Clinical Psychologists Using Technologies


For most clinical psychologists, training in ethical issues typically involved a course (or perhaps a handful of courses) highlighting particular cases and the APA ethics code, which is developed from four principles established by Beauchamp and Childress (2001): autonomy, beneficence, nonmaleficence, and justice (see Table 1).
Typically, clinical psychologists in training are taught about the Nuremberg Code (Allied Control Council, 1949), the World Medical Association's Declaration of Helsinki (1964), and the United States National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (1978). From the Ethical Principles and Guidelines for the Protection of Human Subjects
Research (i.e., Belmont report; Department of Health, 2014), clinical psychologists are informed of three principles that serve as the
foundation for several contemporary ethical guidelines applied to clinical psychologists’ research and practice: respect for persons,
beneficence, and justice (Office for Human Research Protections [OHRP], 1979). Although some terminological variants can be
found in these codes and guidelines, all contain the ethical principles (see Table 1) of autonomy (i.e., free will or agency); beneficence (i.e., mercy, kindness, and charity); nonmaleficence (i.e., do no harm); and justice (i.e., fair distribution of benefits and burdens).
Practice guidelines have been developed by various professional organizations. Examples include the American Psychological Association (2013a,b,c), American Telemedicine Association (2009), American Counseling Association (1999), International Society for Mental Health Online (2009), and National Board for Certified Counselors (1997). For clinical psychologists, emphasis is likely placed upon the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct (2002), which includes the following five principles: (1) beneficence and nonmaleficence (consistently consider costs and benefits; protect from harm; and aim to optimize goodness); (2) fidelity and responsibility (this includes professionalism and consistent mindfulness of obligations to society); (3) integrity (conscientious truthfulness); (4) justice (consistently treat persons in a fair manner); and (5) respect for patients' rights and dignity (protect rights like privacy and confidentiality). Regardless of the chosen set of guidelines, each intends to offer applicable standards for the use of technologies in the provision of mental health services. Furthermore, each set of guidelines underscores the knowledge of the technical facets of the technology necessary for safeguarding patients (e.g., privacy settings and encryption). Moreover, there is the challenge of boundary uncertainty when using social media (e.g., following on Twitter; Facebook "friending") and unrealistic expectations for email communications.

Beneficence and Nonmaleficence


Clinical psychologists may encounter ethical challenges when conducting eTherapy with a patient who lives in a geographically isolated area. During eTherapy (i.e., telepsychology) the clinical psychologist typically provides services via electronic media (Campbell et al., 2018; Maheu et al., 2005). For the clinical psychologist, there is the dilemma of taking the patient on without first having a face-to-face assessment. In doing this, the clinical psychologist may fail to notice likely risk factors that contraindicate distance therapy (nonmaleficence). On the other hand, if the clinical psychologist does not take the patient on, then the psychologist may not meet the obligation of choosing the well-being of the patient (beneficence). This reflects a tension between the leading ethical principles of beneficence (the clinical psychologist's emphasis on patient benefit) and nonmaleficence (the clinical psychologist's avoidance of harm).

Table 1  Ethical principles pertaining to clinical psychology

Principle | Rights | Moral concern | Practice considerations
Autonomy (Voluntas aegroti suprema lex) | Right of patient to choose or refuse treatment | Agency; free will | Informed consent; competence to consent
Beneficence (Salus aegroti suprema lex) | Act in the best interest of the patient | Mercy, kindness, and charity | Benefits: effectiveness
Nonmaleficence (Primum nil nocere) | First, do no harm | Do no harm | Risks: side effects
Justice (Iustitia) | Justice, fairness, and equality | Fair distribution of benefits and burdens | Rationing and prioritizing
According to Gupta et al. (2016), the therapist should first meet with a patient face-to-face and use four criteria to inform whether the
patient would benefit from video teleconference-based therapy: (1) nature and severity of patient’s distress; (2) patient’s access to
and comfort with technologies; (3) patient’s access to a private and confidential space; and (4) patient’s capacity for online
communication.
These four points can help clinical psychologists balance patient benefits while minimizing harm. It is important to note that even if a patient has access to and is at ease with technologies (in a private space) used for eTherapy, the nature and severity of the patient's presentation may still contraindicate eTherapy. Patients who present with severe personality disorder, psychotic disorder, suicidality, and/or homicidality may not be good candidates for eTherapy (Ragusea and VandeCreek, 2003). Finally, even if all of the above criteria are met, there is still the issue of the client's communication abilities. Video teleconferencing and other forms of eTherapy tend to rely a great deal on verbal expression. As a result, there is a risk of missing nonverbal communication cues.

Privacy and Confidentiality


Telepsychology (e.g., eTherapy; online research) may limit privacy and confidentiality (Lustgarten, 2015; Lustgarten and Colbow, 2017). The extensive array of privacy policies and terms of service found among third-party providers (e.g., Facebook, Twitter, Apple, Google, Microsoft) complicates these protections. Information technology vulnerabilities are evident in electronic communication records, electronic patient data transfers, informational notices, and patient waivers. Professional organizations normally consign culpability to the service providers (see, for example, American Psychological Association, 2017; Force and Initiative, 2013). It is important that the clinical psychologist use HIPAA-compliant platforms. Moreover, clinical psychologists are responsible for ensuring that patients are informed of the limitations of technologies used in the therapeutic process and the limits to patient confidentiality that may occur when information is transmitted electronically.
Moreover, securing electronic data transmissions from third-party interception, without consent by the user, requires therapists to encrypt data transmissions (Elhai and Hall, 2016). Psychologists also need to ensure that physical electronic devices are password-protected. This will safeguard the patient's metadata (e.g., email addresses, phone numbers) and confidential information
(voicemails and other communications; Elhai and Hall, 2016). Parsons et al. (2017) offer practice parameters to allay potential
confidentiality breaches. In addition to explaining software and hardware platforms that can impact telepsychological practices,
they delineate optimal procedures that developers and clinical psychologists can use to reduce error when using technologies. It
is imperative that clinical psychologists use platforms from developers who make available bench test results for their software’s
performance on various devices and minimum specifications (documented in manuals).
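As an illustration of the kind of safeguard discussed above, the following is a minimal sketch (assuming Python and the widely used third-party cryptography package, neither of which is prescribed by the guidelines cited here) of symmetric encryption applied to a patient note before it is stored or transmitted:

```python
# Minimal sketch: encrypting a patient note before storage or transmission.
# Assumes the third-party "cryptography" package (pip install cryptography);
# key management details (secure storage, rotation) are omitted for brevity.
from cryptography.fernet import Fernet

# In practice the key would be generated once and kept in a secure key store,
# never alongside the encrypted records themselves.
key = Fernet.generate_key()
cipher = Fernet(key)

note = "Session 4: patient reports improved sleep and reduced rumination."
token = cipher.encrypt(note.encode("utf-8"))   # ciphertext safe to store or transmit
print(token)

# Decryption is only possible with the same key.
print(cipher.decrypt(token).decode("utf-8"))
```

This sketch illustrates the principle only; platform choice, key custody, and transport security would still need to satisfy HIPAA and HITECH requirements.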

Boundaries
Clinical psychologists should take into account the potential technology-related boundary violations. Telepsychology and eTherapy
occur in a shared online space (e.g., social media; social networking sites; Internet) where a patient may choose to "friend" a therapist on Facebook or "follow" on Twitter. Clinical psychologists and/or patients may search for information about each other online
(Kolmes, 2012; Sabin and Harland, 2017; Zur, 2008). For example, Facebook’s graph search (Facebook, 2014) allows users to
search for any publicly available data (e.g., comments on public content posted by third parties) about a Facebook member. While
these services can be important tools for research, they also bring increased potential for boundary violations (Gamble and Morris,
2014; Sabin and Harland, 2017).
Social networking ethics are especially important for psychologists providing services in rural settings. Clinical psychologists
working with patients in rural areas need to balance transparency and disclosure while maintaining boundaries. Psychologists practicing in the digital age should think about the world as being smaller and their patients as more local. This means that psychologists need to investigate their privacy settings to prevent requests and follows from patients (Lannin and Scott, 2013). Moreover, psychologists should consider using a policy that delineates expectations for social media contact.

Special Populations
Ethical telepsychology research and eTherapy interventions with children, older-aged participants, and some clinical populations are of great significance to clinical psychologists who work with these cohorts. Clinical psychologists must be ethically vigilant when
using technologies to work with persons having limited understanding (e.g., children, young people, older adults, and some clinical
participants). Two important ethical issues for working with special populations and technology are: informed consent and protec-
tion. Clinical psychologists must consider whether they should obtain parental/legal guardian/caregiver consent only, or should
participants be asked for their consent? Some psychologists contend that children should be invited to be research participants
(Woodhead and Faulkner, 2008). The APA Code of Ethics asserts that persons who are legally incapable of giving informed consent
(e.g., children, older adults) should still be asked to give their assent (APA, 2010). Another issue is protection of special populations.
It is crucial that work with persons from these groups be ethical, sensitive, and respectful, and that these persons be protected. Moreover, they should have the same rights of withdrawal from participation as other participants. They should also have the same rights related to the research data they provide. They should
also have the same degree of confidentiality and privacy that others experience, with the added provision that clinical psychologists
will need to handle issues of disclosure related to harm as and when they arise (see Parsons, 2019a,b).

Research is needed to see how various clinical cohorts will react to virtual reality exposure therapy (VRET). For some patient
cohorts it may not add anything to the much less expensive face-to-face therapy. For example, a veteran experiencing combat stress
symptoms may have a therapeutic misconception that VRET will be better than traditional interventions simply as a result of its uniqueness. A randomized clinical trial suggests that these expectations may be misguided. The trial was completed to assess the usefulness of VRET through comparison to prolonged exposure therapy (i.e., traditional talk therapy) for the treatment of posttraumatic stress disorder in a large cohort (N = 162) of active-duty soldiers with combat-related trauma (Reger et al., 2016). Results revealed that VRET was in fact inferior to talk therapy using prolonged exposure. Prolonged exposure-based talk therapy was significantly superior at alleviating symptoms at 3- and 6-month follow-up. Given the possibilities of therapeutic misconceptions, researchers and clinicians using virtual reality should be aware of proven practices for neutralizing therapeutic misconceptions. More research is needed in this area.

Assessment Technologies
With precipitous technological innovations (e.g. virtual reality; tablets; iPhones) and proliferation, there is a consistent demand for
identifying optimal specifications while minimizing potential sources of error (Bauer et al., 2012; Cernich et al., 2007; Parsons et al.,
2018a,b). Ethical concerns have been raised about the suitability of computerized assessments. Research is needed to evaluate
whether the patient’s perception of and responses to the computer-generated stimuli are significantly similar to perceptions of
and responses to traditional paper-and-pencil measures. Moreover, patients have differing levels of familiarity with computer interfaces (e.g., mouse, keyboard). The concern is whether a computerized assessment will produce a different result or assess a dissimilar
capacity than its paper-and-pencil equivalent. Each clinical psychologist must weigh the advantages and disadvantages of adding
computerized assessments for their patients (Parsons, 2016).
Virtual reality-based neuropsychological assessment devices have heightened computational abilities for efficient administration
of assessments and treatments: stimulus presentation, automated response logging, and data analytic processing (Parsons, 2015,
2016; Parsons and Kane, 2017). Head-mounted displays do pose an ethical concern for use with some patients (especially older-aged populations) because adverse effects (e.g., cybersickness) can occur. Important ethical principles include beneficent optimization of virtual environment benefits and nonmaleficent attention to aspects of virtual environments that could cause harms and risks for users. Cybersickness is a potential harm. Endeavors to enhance beneficence involve minimizing cybersickness. For certain populations, it may be better to present simulations via desktop platforms.
There is also the tension between beneficence and autonomy in applied ethics. Virtual reality can impact the brain and manipulate psychological experiences (see Parsons, 2017). While it has potential for helping some cohorts via VRET (beneficence), it can also threaten autonomy via changes to the person's brain. Aardema et al. (2010) contend that virtual reality affects the user in a manner comparable to symptoms found in dissociative disorders (depersonalization and derealization).

Considerations for Ethical Decision Making


An important consideration for telepsychology is the development of frameworks for the ethical use of technologies. Torous and
Roberts (2017) outline a series of steps that can be used to enhance ethical decision making in telepsychology. First, question
whether technology use provides a benefit to the patient. If there are benefits, then consider potential risks. For example, the technology may be contraindicated with patients having chronic mental health conditions, high risk of relapse or recurrence, and potentially limited insight or judgment. Some patients may not have the capacity needed for providing informed consent to use the
technology. Other patients may have decisional capacity to consent, but also a history of limited impulse control. Torous and Roberts suggest a spectrum of vulnerability to harm in the therapeutic relationship that may change over time.
Next, Torous and Roberts consider the ethical tensions in confidentiality between mental health service providers and corporations providing technologies. This requires a dialog between the psychologist and the patient about confidentiality risks with
various social media platforms. Companies (e.g., Facebook, Apple, Microsoft) often collect user data, archive it, share it, and
even sell it to other companies. Psychologists should advise patients to review the terms of service for their media applications.
Finally, the psychologist is directed to both introduce and sustain an ongoing discussion with the patient about whether the technologies align with treatment goals and expectations.
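Purely as an illustration, the stepwise reasoning described by Torous and Roberts could be organized as a simple checklist; the sketch below is a hypothetical Python rendering of the questions in the order described above, with criteria names and the case dictionary invented for the example rather than drawn from their framework:

```python
# Hypothetical sketch of the stepwise questions described by Torous and Roberts (2017).
# The criteria names and the flagging rules are illustrative, not a validated instrument.
def review_technology_use(case: dict) -> list:
    """Return open ethical questions to discuss before and during technology use."""
    issues = []
    if not case.get("benefits_patient"):
        issues.append("No clear patient benefit identified; reconsider technology use.")
    if case.get("chronic_condition") or case.get("high_relapse_risk") or case.get("limited_insight"):
        issues.append("Potential contraindication; weigh risks against expected benefit.")
    if not case.get("capacity_to_consent"):
        issues.append("Assess decisional capacity and consider surrogate consent/assent.")
    if case.get("limited_impulse_control"):
        issues.append("Discuss vulnerability to harm; revisit as the relationship evolves.")
    issues.append("Discuss confidentiality risks of third-party platforms and terms of service.")
    issues.append("Revisit whether the technology still aligns with treatment goals.")
    return issues

# Example use with a hypothetical case description:
print(review_technology_use({"benefits_patient": True, "capacity_to_consent": True}))
```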
Judgments and decision making include controlled and automatic processes. Controlled processes (like those described by Torous and Roberts) are cognitive processes associated with conscious awareness that require effortful control and intention, as well as a capacity for inhibition. An example might be learning to read English as a second language. Automatic processes (overlearned behaviors such as reading English text as a native English speaker) are not necessarily in conscious awareness and occur spontaneously. These two systems have been described as the controlled C-system (the "c" in reflective) and the automatic X-system (the "x" in reflexive; Lieberman, 2007).
Learning to drive an automobile is an example of dual processing. Learning to drive requires effortful use of controlled processes to consciously attend to operating the steering wheel, accelerator, and brakes. Moreover, controlled conscious attention is deliberately focused on the road, traffic signals, and other people (in and out of cars). During this learning period, the attentional demands
involve even greater controlled processing when navigating a new city, when other cars and pedestrians are near, when traffic signals
change, and when others are in the car. After becoming an experienced driver, many aspects of driving become automated. The experienced driver can navigate, react to changes in traffic conditions, and even carry on a conversation without consciously processing
many of the tasks. When traffic conditions change dramatically, even the experienced driver may be forced to consciously focus
attention once again using controlled cognitive processes.

Moral judgments and decision making are being studied using virtual reality simulations of moral dilemmas. Participants
immersed in virtual reality-based moral scenarios experience significant alterations in subjective experiences, behaviors, and psychophysiological responding. Slater et al. (2006) used a virtual reality-based replication of Milgram's (1963) obedience experiment. Results revealed that participants who saw and heard the virtual human female tended to respond (at subjective, behavioral, and physiological levels) as if the virtual human and experiment were real. There are also virtual reality simulations of the classic Trolley and Footbridge dilemmas (see Parsons, 2019a; Navarrete et al., 2012; Pan and Slater, 2011; Patil et al., 2014; Skulmowski et al., 2014). Participants are immersed in a virtual environment in which a runaway trolley is heading for five immobile people on its tracks. If the participant does not pull a lever to move the trolley to another track (where there is only one person), it will kill the five virtual humans. If the participant pulls the lever, then they are choosing for a person to die. Results suggest that participants immersed in these virtual reality scenarios find it challenging to decide that one answer (killing the one person to save the five others) is better than the other (saving the one person but letting the five die). A dual-process theory
has been proposed to describe the processes involved in resolving trolley dilemmas. Both controlled cognitive responses and automatic affective responses perform essential roles in moral decision making (see also Greene et al., 2004, 2008). While judgments of
correct actions when immersed in these virtual environments tend to involve controlled (e.g., cold) cognitive processes, the decision
to apply direct physical force triggers automatic (e.g., hot) affective responses.
Stanovich (2012) proposes a tripartite model of cognitive processing that involves autonomous (automatic) processing that
entails mostly rapid and non-conscious heuristic utilization. Stanovich differentiated two systems of controlled processing (slow
and effortful) by delineating two components: (1) reflective processing, which characterizes cognitive processing of goal-relevant beliefs and optimizes action choices; and (2) algorithmic processing, which contains rules, strategies, and procedures that can be retrieved from
memory to aid problem solving. The tripartite model would be enhanced via increased emphasis upon somatic markers that weight
outcomes to bias future choices of action. It is important to note that emotions are involved in the model, but greater inclusion of
the somatic marker model would be helpful (Bechara and Damasio, 2005). Moreover, the tripartite model includes an autonomous
processor that automatically induces action and a reflective processor that can inhibit the autonomous processor so that the algorithmic processor can compute the ultimate reaction after weighing the benefit of delayed gratification. Greater inclusion of somatic
markers would enhance explanations of the valuation and weighing of benefits. During decision making, emotions are experienced
(e.g., gut feeling; hunch) that result in a somatic marker that weights outcomes to bias future choices of action. Therefore, the
somatic marker is believed to play a role in decision making via its biasing of available response selections. During decision making
the person experiences somatic sensations before actual outcomes of various possible alternatives.
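To make the weighting idea concrete, a minimal illustrative sketch (hypothetical, and not a formalization drawn from Bechara and Damasio's work) might represent each response option's value as a cognitive estimate adjusted by a somatic bias term learned from prior emotional experience:

```python
# Illustrative sketch only: somatic markers as learned biases that weight
# response options during decision making. Variable names, the weighting
# rule, and the update rule are hypothetical.
def option_value(expected_outcome: float, somatic_marker: float, weight: float = 0.5) -> float:
    """Value of a response option = cognitive estimate + emotionally learned bias."""
    return expected_outcome + weight * somatic_marker

def update_marker(somatic_marker: float, experienced_outcome: float, rate: float = 0.2) -> float:
    """After acting, the 'gut feeling' for that option drifts toward the felt outcome."""
    return somatic_marker + rate * (experienced_outcome - somatic_marker)

# Example: two options with equal cognitive estimates but different emotional histories.
risky_marker, safe_marker = -0.8, 0.3   # prior bodily reactions to similar choices
print(option_value(1.0, risky_marker))  # 0.6  -> biased away from the risky option
print(option_value(1.0, safe_marker))   # 1.15 -> biased toward the safe option
```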

Technologies Extending Cognition

Social media technologies and algorithmic devices (e.g., smartphones) have the potential to extend our cognitive, affective, and
social processes beyond the wetware of our brains. An important component for our understanding of the cognitive, affective,
and social processes found in telepsychology is the notion that technology is an extension of our cognitive processes (Parsons,
2019a, b). For example, smartphones help us recall information, compare new information to old, calculate, navigate, and translate; they allow us to access an abundance of information and guidance. Some of this information is accessible publicly via internet
sites that can inform psychologists of everything from the addresses and menus at area restaurants to answers arising from debates
during dinner conversations. Other information may include more personal information, such as contacts, emails, text messages,
posts, calendar appointments, and even logs of activities (purchases, articles and books read, films viewed, number of steps taken on
a given day, calories, and so forth).
According to Daniel Dennett (1996, pp. 134–135), our remarkable evolutionary success is less a factor of our large frontal lobes than of our capacity for extending our cognitive processes into the environment with which we interact. Hence, our enhanced intelligence is due to our habit of offloading as much as possible of our cognitive tasks into the environment itself: extruding our minds (that is, our mental projects and activities) into the surrounding world, where a host of peripheral devices we construct can store, process and re-represent our meanings, streamlining, enhancing, and protecting the processes of transformation that are our thinking. This widespread practice of off-loading releases us from the limitations of our animal brains.
This idea is known as “extended mind” (also known as “extended cognition”) and characterizes human cognizing as comprising
complex feedback (as well as feedforward and feed-around) loops among brain, body, and the peripheral world (see Clark and
Chalmers, 1998; Clark, 2008; see also Menary, 2010). With the "extended mind" theory, a clinical psychologist may consider cognitive, affective, and social processes as occurring beyond wetware (i.e., one's brain) to software and hardware used with brains.
Furthermore, cognition (as well as affect, social comportment) can be regarded as processes conducted by a system that is coupled
with the environment (Clark, 2008; Clark and Chalmers, 1998). For example, smartphones and the Internet form an extended
cognitive system that accomplishes operations that would otherwise be realized via internal brain-based processes. The extended
mind hypothesis employs a “parity principle” as follows:
If, as we confront some task, a part of the world functions as a process which, were it to go on in the head, we would have no
hesitation in recognizing as part of the cognitive process, then that part of the world is (so we claim) part of the cognitive process
(Clark and Chalmers, 1998, p. 8).
An early example of the parity principle involved fictional characters (Inga and Otto) who must find their way to a museum on 53rd Street. Inga is able to facilely access her internal memory stored in her brain to remember the appropriate directions to the museum. Otto has limited recall of directions due to Alzheimer's disease. Otto needs to depend on directions stored in a notebook that functions as an external support for his internal brain-based memory processes. Inga and Otto both reach the museum, irrespective of the fact that for Inga the memory was stored internally in her brain, and for Otto externally in his notebook. The information-processing loops extend beyond the neural sphere to incorporate components of our social and technological environments. Clark and Chalmers proffer four "trust and glue" criteria for objects that may act as candidate extenders of cognition:
1. Constancy. Otto’s notebook is readily available when he wants it.
2. Facility. Otto’s effort and time to recover information from the notebook are negligible.
3. Trust. Otto’s trust of information written in his notebook is automatic.
4. Prior endorsement. Otto has, in the past, endorsed information found in the notebook. This is apparent in the fact that he
recorded the information in the notebook for future use.
None of the four criteria are required to hold unconditionally. For example, “constancy” does not demand that Otto’s notebook be
on hand in all situations. Instead, the notebook should be accessible when Otto needs it. Furthermore, “facility” signifies a close
coupling of the user to an external aid.
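Purely as an illustration (the field names below are shorthand of my own, not Clark and Chalmers' terminology), the four criteria can be treated as a simple checklist applied to a candidate cognitive extender:

```python
# Illustrative sketch: Clark and Chalmers' four "trust and glue" criteria as a checklist.
# Field names and the "candidate extender" rule are hypothetical shorthand.
from dataclasses import dataclass

@dataclass
class CandidateExtender:
    name: str
    constancy: bool          # readily available when the user wants it
    facility: bool           # effort and time to retrieve information are negligible
    trust: bool              # information is accepted more or less automatically
    prior_endorsement: bool  # the user endorsed the information in the past

    def is_candidate(self) -> bool:
        # None of the criteria must hold unconditionally, but jointly they
        # indicate a close coupling between the user and the external aid.
        return all([self.constancy, self.facility, self.trust, self.prior_endorsement])

notebook = CandidateExtender("Otto's notebook", True, True, True, True)
print(notebook.is_candidate())  # True
```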
Developing technologies can, in certain situations, be understood as part of a user’s cognitive functioning. Novel technologies
representing the extended-mind thesis include the Internet, smartphones (e.g., iPhone; Android), iPads, tablets, smart watches,
Google Glass, and many others. In a foreword to Andy Clark's (2008) Supersizing the Mind, David Chalmers describes his iPhone as follows:
A month ago, I bought an iPhone. The iPhone has already taken over some of the central functions of my brain. It has replaced
part of my memory, storing phone numbers and addresses that I once would have taxed my brain with. It harbors my desires: I call
up a memo with the names of my favorite dishes when I need to order at a local restaurant. I use it to calculate, when I need to figure
out bills and tips. It is a tremendous resource in an argument, with Google ever present to help settle disputes. I make plans with it,
using its calendar to help determine what I can and can’t do in the coming months. I even daydream on the iPhone, idly calling up
words and images when my concentration slips (Chalmers, 2008, p.1)
Smart technologies may, in some situations, realize a user’s cognitive states and beliefs external to the physical boundaries of the
brain (Clark, 2008). This extension of cognitive processes external to the brain necessitates that cognitive processes not be fully
reduced to brain processes.
The extended mind theory can be applied to the sociotechnical context of the Web (i.e., the Internet) and eTherapy. A "Web-extended mind" perspective views the Internet as a mechanism for coupling mental states and processes with the Internet (Smart, 2012). The Internet-based extension of our cognitive processes is apparent with mobile internet technologies. The Internet's immense information base is a simple click or utterance away. While the early metaphors (e.g., Otto's notebook) gave emphasis to external memory storage, internet-enabled iPads, tablets, and smartphones go beyond memory aids to robust mobile computation devices. Internet-empowered mobile technologies make available opportunities for investigating the interactions (cognitive, affective, and social) of clinical psychologists and patients as they participate with a global workspace and connected knowledge bases. According to the parity principle, smart technologies that help clinical psychologists conduct eTherapy and telepsychology practices, or that perform controlled mathematical operations faster than our brains alone, should be considered part of our cognitive processes. Chalmers (2008) describes it this way:
The dispositional beliefs, cognitive processes, perceptual mechanisms, and moods considered above all extend beyond the
borders of consciousness, and it is plausible that it is precisely the nonconscious part of them that is extended. I think there is
no principled reason why the physical basis of consciousness could not be extended in a similar way. It is probably so extended
in some possible worlds: one could imagine that some of the neural correlates of consciousness are replaced by a module on
one’s belt, for example (p. xiv).
The extension of cognitive processes beyond the brain means that cognitive processes cannot be fully reduced to brain processes
(Levy, 2007). People do not rely solely on their brains to perform activities of daily living. Instead, they are extending cognitive
processes using algorithmic devices that act as technologies of the extended mind (Fitz and Reiner, 2016; Nagel, 2016; Reiner and Nagel, 2017; Nagel and Reiner, 2018; Parsons and Duffield, 2019). It is important to note that not every algorithmic function carried out by technologies external to the brain qualifies as a technology of the extended mind
(TEM). Instead, there is a comparatively continuous interface between brain and algorithm such that the person experiences the
algorithmic technology as an extension of mind.
It is not the case that every algorithmic function carried out by devices external to the brain qualifies them as a TEM, but rather
that there is a relatively seamless interaction between brain and algorithm such that a person perceives of the algorithm as being
a bona fide extension of a person’s mind. This raises the bar for inclusion into the category of algorithms that might be considered
TEMs. It is also the case that algorithmic functions that do not qualify as TEMs today may do so at some future point in time and vice
versa (Reiner and Nagel, 2017, p. 110).
Reiner and Nagel (2017) ask the reader to imagine an inexperienced Uber driver (i.e., a driver for a company that allows non-professionals to act as chauffeurs using their own automobiles) who utilizes a global positioning system (GPS) to navigate New York City. Although
the driver’s GPS is performing the computations external to the driver’s brain, it is not so far a technology of the extended mind.
They argue that it will not be a technology of the extended mind until the algorithmic computations and the driver’s dependence
upon them are effortlessly integrated with the driver’s cognitive processes. This enhances the parity principle (found in the work of
Clark and Chalmers) by specifying the features (i.e., automatic processing of integrated algorithms) needed for a technology to be
an extension of a person’s mind.
Fig. 1 presents a framework for understanding technologies of the extended mind (for a fuller description, see Parsons and Duffield, 2019).
For the technologies of the extended mind framework, algorithmic processes can (over time) develop into an automated and
algorithmic coupling of mind (brain processes for cognitive, affective, and social functioning) and technology (e.g., smartphone
application). When a person first begins utilizing a new algorithmic device, there is a phase in which the person depends on
controlled processes to inhibit and override automatic processes initiated by the device (see reflective and algorithmic control of
technology in Fig. 1). After using the technology for a period of time the operations become overlearned and more or less
automatic.
Inga’s trip to the museum can be updated using a GPS as a technology of the extended mind. Once at the museum, Inga is given
a museum app for her smartphone that is integrated with her GPS. Inga is told that she can search for museum exhibits with the
mobile app and it will communicate the best route to exhibits. After arriving at an exhibit, Inga can use the augmented reality-
enabled mobile app to discover facts about each exhibit from a virtual docent. This virtual docent application is especially advantageous as it permits Inga to navigate without getting lost, as many of the exhibits are in areas of the museum with which she is unfamiliar. At first Inga is somewhat dubious of the mobile app as she is not very familiar with such technologies. As a self-
professed luddite, Inga seldom utilizes her smartphone for activities other than conversing. As a result, Inga remains alert (see
controlled/reflective processing in Fig. 1) to her surroundings so that she can ensure making it to the museum exhibits without
problems.
Eventually, Inga starts to trust the smartphone application and rarely stops herself from automatically following the application’s
guidance (see inhibition and override of technology using reflective and algorithmic control of technology in Fig. 1). Is Inga’s
mobile app and virtual docent functioning as a technology of the extended mind? One can argue that it is performing computations
external to Inga's brain. However, Inga's smartphone application (with the virtual docent) is probably better deemed a cognitive aid because neither the mobile application's calculations nor Inga's use of them are automated with Inga's cognitive processes (see algorithmic control of technology in Fig. 1). After using the smartphone application several times over a few months, Inga decides
to go to a different museum that also uses the same mobile app and virtual docent. The smartphone application never failed her in
its directions to exhibits or its information (e.g., artist, history, subtleties of the work) about the art at each exhibit. At the new
museum, she casually searches for exhibits using the smartphone application’s search interface and when the route is presented
on the smartphone screen, she automatically follows it to the destination and promptly listens to the virtual docent tell her about
the art. The smartphone application is starting to operate as a technology of the extended mind as Inga is coupled with the algorithmic device.
Extended cognition has been applied to psychotherapy (Shennan, 2016) and mental health services for patients with borderline personality disorder (Bray, 2008); attention deficit/hyperactivity disorder and autism (Sneddon, 2002); social anxiety disorder (Carter and Palermos, 2016); dispositional affective states (Colombetti and Roberts, 2015); sexual dysfunction (Merritt, 2013); sex offenders (Ward, 2009; Ward and Casey, 2010); dementia (Clark and Chalmers, 1998; Clark, 2008; Drayson and Clark, 2020; Nelson, 2009; Wilson and Lenart, 2015); psychopathology (Drayson, 2009; Sneddon, 2002); and depression (Hoffman, 2016).

Figure 1 Technologies of the extended mind framework.

Ethical Considerations for Technologies Extending Cognition

If technology extends minds into the external world, should we apply the same ethical considerations that govern our everyday lives
to anything that results in extended mind loops? A prospective ethical consideration for Inga’s smartphone application and virtual
docent that extends her cognitive processes beyond her brain is that Inga's initial use of the smartphone application involved vigilant attention (see reflective control of technology in Fig. 1) to both the application and her surroundings to make sure that she
could trust the application (and not get lost). Again, during this controlled conscious processing, the smartphone application
was functioning as a computational assistant rather than as a technology of the extended mind (see algorithmic control of technology in Fig. 1).
After using the smartphone application for a few weeks, Inga has assimilated the smartphone application's algorithmic processing into the working of her mind (see automatic algorithmic processing by technology in Fig. 1) while navigating both museums.
Inga is an art appraiser and uses the application as she works on an assignment in the museum. The assignment requires Inga to
travel to a new area of the museum to appraise some new items. Prior to her lunch break she follows the smartphone’s GPS to
the appraisal area. On her way, she receives notifications from the smartphone as she passes a sign advertising the museum's constellation of eateries, and alerts chime again when the museum's eateries are just up ahead. An ethical issue here is that the algorithms
have learned Inga’s preferences and are attempting to influence her actions. Moreover, the smartphone algorithm may intensify its
degree of suggestion by “questioning” whether Inga would like to get something to eat. Inga needs to complete her assignment
(continue to the appraisal area), but reasons that little detriment would emanate from stopping to get something to eat. The technology is clearly influencing Inga and causing an alteration of her plan to complete her assignment. This may be a relatively trivial
example of undue influence, but it still reflects an autonomy violation, especially when one considers that the very same algorithm that has become an extension of Inga's mind was designed by a corporate entity that may be paid by vendors at the museum cafés for directing users to them. Likewise, an eTherapy app may include suggestions for various products and services. These potential conflicts of interest sully the ethical landscape when endeavoring to determine the magnitude of an algorithm's capacity for violating autonomy.

Future Research Directions

Clinical psychologists can benefit from clinical neuroscientific methodologies that are superseding traditional psychological theories emphasizing mental over biological treatments of mental illnesses (Parsons, 2015, 2017). For example, depression has received less acceptance as a brain disorder by the public, or even among clinicians. This slow acceptance of mental disorders (like depression) as biological may be because, unlike traditional neurological illnesses (e.g., post stroke; Parkinson's disease) with visible damage, the neuroimaging technology necessary for identifying mental disorders (like depression) has not been available. Progress in neuroimaging technologies is enhancing capabilities for mapping brain functions in depression. These technologies allow for the detection of aberrant activity in specific brain areas and/or disrupted communications among brain regions that function together as circuits to perform normal mental operations. Malfunctioning neural circuits may underlie many mental disorders.
Depression offers an example. Persons with depression are disposed to decreased energy, low mood, slowed reaction times, memory difficulties, and disinhibition. This makes it appear that some brain areas are underactive. Neuroimaging of areas impacted by depression reveals imbalances in Brodmann area 25, a hub for a depression circuit. Brodmann area 25 has direct connections to brain areas that mediate fear and anxiety (amygdala) and those involved in stress responses (hypothalamus). These brain regions are connected to brain areas involved in memory processing (hippocampus) and processing of sensory perceptions and emotions. A smaller than normal Brodmann area 25 and a gene variant that inhibits serotonin processing increase the risk of depression. Researchers have targeted a number of additional brain areas involved in depression. Subsequently, several areas are now being targeted for treating depression.
Brodmann area 25 (subgenual cingulate cortex) is very active in depression. Successful treatment (medication, psychotherapy) is related to decreased activity in this circuit. Moreover, Mayberg et al. (2005, 2009) have found that deep brain stimulation of the subgenual cingulate cortex (i.e., the Brodmann area 25 depression circuit) reduces activity and can decrease depression in persons who had treatment-refractory depression. It is unclear how deep brain stimulation diminishes depression. Some contend that the electrical pulses act to "reset" the malfunctioning brain area, returning brain circuits to normal functioning. These extended cognitive and affective circuits include both the wetware-based brain circuit and the hardware-based electrodes and battery. Together, they extend the patient's cognitive and affective processes.
What are the ethical implications for clinical psychologists working with patients receiving deep brain stimulation? First, there are the potential side effects. Deep brain stimulation shares risks found with other stereotactic neurosurgical procedures: intracranial bleeding and hardware-related complications (e.g., dislocation, lead fracture, and infection). Moreover, there are stimulation-induced side effects that are related to stimulation electrode location. These include mania, depression, laughter, penile erection,
and aggression. Deep brain stimulation is an invasive procedure and ethicists have considered the ethical implications of the potential side effects (Clausen, 2010; Schermer, 2010). Furthermore, the financial expenses of deep brain stimulation have resulted in
questions of its cost-effectiveness (McIntosh, 2011).
The risks and benefits associated with deep brain stimulation must be balanced to maintain respect for the patient and the autonomous desires of the patient. The principle of nonmaleficence calls for minimization of the risks and potential side effects (physical
and psychological) of deep brain stimulation. It also entails an evaluation of the possible effects of deep brain stimulation on
patient identity and brain development. The principle of beneficence calls for optimizing deep brain stimulation treatment both throughout surgery and during ensuing psychosocial maintenance. The principle of justice involves the optimum regulation and prioritizing of deep brain stimulation treatment. There is also the principle of autonomy and respecting the patient's well-informed choice. Autonomy involves informed consent and the necessary competence to consent, with supplementary questions when patients are children. Autonomy also entails best practices for handling unrealistic expectations and patient desperation. Consideration of patient counseling regarding deep brain stimulation also includes the principles of subsidiarity ("select the minimal burden option") and proportionality of risks and benefits. These are significant for safeguarding optimum patient selection for deep brain stimulation.

Clinical Applications and Recommendations

The extended mind approach can be applied to psychiatric practice. Hoffman (2016) argues that the extended mind approach may
have significant benefits for diagnostic decision making, interventions, shifting research priorities, and the ways in which patients
with psychiatric symptoms consider themselves. Hoffman presents Iain and Emma, two individuals with depression and diminished feelings of self-worth. Iain regularly experiences uncertainty about his self-worth. Iain checks his internal brain-based memory of a list he developed years before that praises his virtues and underscores his worth. Likewise, Emma experiences consistent uncertainty about her worth. As a substitute for an internalized list in her brain, Emma reads an old letter that she had written years ago
that emphasizes her virtues and worth. The letter is kept religiously by her side. Emma’s consultation of the letter supports her
virtues and feeling of worth. Iain and Emma have resources attesting to their worth.

Telepsychology and eTherapy


This idea can be useful for considering digital technologies that deliver pervasive Internet connectivity. Smartphone technologies
afford global, cost-effective and evidence-based mental health services on demand and in real time (Aboujaoude et al., 2015; Firth
et al., 2017). Imagine Otto is experiencing depression related to his Alzheimer's disease. Otto takes part in eTherapy to deal with
challenging life setbacks. In between face-to-face video teleconferencing sessions with a clinical psychologist, he uses a smartphone
application. During eTherapy, his clinical psychologist focuses on replacing depressed cognitions with non-depressed (healthy)
cognitions aimed at enhancing feelings of self-worth. These healthy cognitions associated with his self-worth are logged and stored
in a database. Also logged and stored are voice recordings from his eTherapy sessions that are analyzed using speech emotion recognition (El Ayadi et al., 2011; Wu et al., 2010a,b; Zeng et al., 2009). In between sessions, Otto consults a smartphone application that
links to the database and delivers Otto suggestions, inspirational notifications, and support for his self-worth. Otto’s clinical
psychologist uses a back-end system to send short text messages to Otto via a messaging system, similar to a Short Message Service
(SMS). The messaging system allows the clinical psychologist to deliver personalized messages of encouragement as well as weekly
general educational messages. When Otto receives the messages, he inputs a rating into the smartphone application and the algorithmic device learns which messages work best for Otto. The smartphone application monitors Otto's smartphone usage patterns
to classify mood-based metrics (number of incoming/outgoing calls; duration of incoming/outgoing calls; outgoing text messages,
application usage; LiKamWa et al., 2013; Faurholt-Jepsen et al., 2016). Otto also wears a wristband that includes physiological
sensors (such as heart rate, breathing, skin conduction, physical activity) and allows the application to access them. This allows
for identification of Otto’s arousal (categorized as emotion states) using physiological signals (Calvo and D’Mello, 2010; Jerritta
et al., 2011; Sun et al., 2010). After Otto has used the system for a period of time, the algorithmic device develops algorithms
that automate the messages from the database.
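As a purely illustrative sketch (the feature names, fields, and flagging rule below are hypothetical, not drawn from the cited studies), mood-related usage metrics of the kind described above could be summarized from a day's smartphone log as follows:

```python
# Hypothetical sketch of summarizing smartphone usage into mood-related metrics,
# in the spirit of the passive sensing studies cited above (e.g., LiKamWa et al., 2013).
# Field names, features, and the flagging rule are illustrative only.
from dataclasses import dataclass

@dataclass
class DayLog:
    incoming_calls: int
    outgoing_calls: int
    call_minutes: float
    outgoing_texts: int
    app_minutes: float

def mood_metrics(log: DayLog) -> dict:
    """Summarize one day's usage into simple features a clinician might review."""
    return {
        "social_initiations": log.outgoing_calls + log.outgoing_texts,
        "total_call_minutes": log.call_minutes,
        "app_minutes": log.app_minutes,
        # Illustrative flag: very low outgoing contact may warrant a check-in.
        "low_social_activity": (log.outgoing_calls + log.outgoing_texts) < 2,
    }

print(mood_metrics(DayLog(incoming_calls=3, outgoing_calls=0,
                          call_minutes=12.5, outgoing_texts=1, app_minutes=95.0)))
```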
Clinical psychologists considering the ethical challenges of Otto’s eTherapy and smartphone applications will want to make sure
that privacy is maintained. This would include ensuring that all Internet (including the clinical psychologist’s back-end system) and
smartphone activities (including Otto's smartphone application) are secured by means of Secure Sockets Layer (SSL) encryption. Otto's extended cognitive and affective processes coupled with the smartphone application and the algorithms call for further attention. Although the smartphone application may aid in restoring and maintaining Otto's self-worth, this may sacrifice some of his autonomy. Otto's smartphone application use may result in a reliance, comparable to gambling, that could interfere with his activities of daily living. Internet addiction is not an official diagnosis, but studies have reported addiction symptomology in persons who overuse smartphones. This includes distorted perceptions of time spent on the smartphone, smartphone preoccupation, and withdrawal (e.g., Kwon et al., 2013; Lanaj et al., 2014; Lin et al., 2015). Smartphones with passive and active data
analytics described above may result in personalized algorithms aimed at helping Otto feel better. Furthermore, the smartphone
application may increase vulnerability to other smartphone applications and social media platforms designed to compel users
to check for message notifications. Otto may find himself tapping and scrolling for hours. Designers of social media applications
often include subtle psychological reinforcements that may result in habit formation. The intermittent reinforcements (e.g., chimes, emojis, notifications) that Otto receives may result in a craving for additional Internet use.

Neuroethics of Enhancements
A significant distinction between therapy and enhancement needs to be kept in mind when considering neuroethical use of technologies. This distinction is especially useful in deliberations about the appropriate and inappropriate uses of various technologies.
While eTherapy involves treatment of problematic cognitions, affects, and/or behaviors, enhancement usually entails augmenting
or enriching non-clinical aspects to a status that is better than normal. Following the Presidential Commission for the Study of
Bioethical Issues (2014), therapeutic use of technologies involves treating patients with known diseases, disabilities, or impairments,
in an effort to restore them to a normal state of health and fitness. Using technologies for enhancement entails augmenting the
“normal” bodily and cognitive processes, to enhance native capacities and performances.
As technologies move from therapeutic interventions to enhancement, ethical concerns will arise related to the provision of mental health services, the meaning of "natural," human dignity, and personhood. Those supportive of enhancement view drawing a strict line between therapy and enhancement as untenable (Bostrom and Savulescu, 2009; Harris, 2010). This perspective is bolstered by the equivocal borders demarcating these conceptions (Lin and Allhoff, 2008). Those in the more bioconservative camp argue that therapy is ethically suitable and enhancement is ethically challenging (Buchanan et al., 2001; Daniels,
2008). Those attracted to transhuman coupling with technologies propose that even if a differentiation between therapy and
enhancement is established, there is no ethical difference between them (Bostrom, 2008; Bostrom and Savulescu, 2009; Harris,
2010).
Advances in neurotechnologies are apparent in the increased use (both therapeutically, and for enhancement) of methodologies
like transcranial magnetic stimulation, brain-computer interfaces, brain implants, and genetic engineering (Bostrom and Sandberg,
2006, 2009). These human computer interfaces extend from direct interactions between neural tissue and electronic transducers to
indirect interactions based on scalp-level electrical signals. Direct approaches (e.g., deep brain stimulation) are neurosurgical treatments for various neurological (Parkinson's disease, essential tremor, epilepsy) and psychiatric issues (depression, obsessive compulsive disorder, posttraumatic stress disorder). There are also non-invasive approaches including transcranial magnetic stimulation of targeted brain areas for therapy (depression and other psychopathology; Bermudes et al., 2017), transcranial direct current stimulation, brain-computer interfaces, and cognitive enhancement (Luber and Lisanby, 2014).
A good deal of the ethical deliberation around neuroenhancement has concerned pharmacological enhancement. Farah et al. (2004) explained that “In contrast to the other neurotechnologies mentioned earlier, whose potential use for enhancement is still hypothetical, pharmacological enhancement has already begun” (p. 421). As a result, considerable discussion has been dedicated to the ethical aspects of psychopharmaceuticals (Turner and Sahakian, 2006; Lynch et al., 2011). Pharmaceutical enhancement of normal cognition is noticeable across the lifespan; for example, stimulants are often used as study aids by high school and college students who do not have ADHD.
Arguments about pharmaceutical enhancement are generalizable to other enhancements such as brain-computer interfaces, transcranial direct current stimulation (tDCS), and related off-the-shelf enhancers. Moreover, there are several consumer-grade brain-computer interfaces that can be used for enhancing mood (2-channel Muse; 1-channel NeuroSky; 14-channel Emotiv), cognitive processing (14-channel Emotiv), athletic performance (Versus), and general brain monitoring (16-channel OpenBCI; 14-channel Emotiv). There are also passive enhancements coupling persons automatically and algorithmically to devices.
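To give a sense of the signal processing behind consumer-grade neurofeedback claims, the following Python sketch computes a relative alpha-band (8–12 Hz) power score from multichannel EEG samples. This is a generic illustration under stated assumptions: the sampling rate, band limits, and the interpretation of the score as “relaxation” are illustrative, do not reproduce any vendor’s proprietary algorithm, and a real device would deliver the raw samples through its own SDK.

```python
import numpy as np
from scipy.signal import welch


def relaxation_index(eeg: np.ndarray, fs: float = 256.0) -> float:
    """Crude 'relaxation' score: alpha (8-12 Hz) power relative to 1-30 Hz power.

    eeg: array of shape (n_channels, n_samples) of raw EEG samples.
    Band limits and the label 'relaxation' are illustrative assumptions.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2), axis=-1)
    alpha_band = (freqs >= 8) & (freqs <= 12)
    broad_band = (freqs >= 1) & (freqs <= 30)
    alpha_power = psd[:, alpha_band].sum(axis=-1)
    broad_power = psd[:, broad_band].sum(axis=-1)
    return float(np.mean(alpha_power / broad_power))


if __name__ == "__main__":
    fs, seconds, n_channels = 256.0, 10, 2  # e.g., a 2-channel consumer headset
    t = np.arange(int(fs * seconds)) / fs
    rng = np.random.default_rng(0)
    # Synthetic signal: a 10 Hz alpha rhythm plus broadband noise
    eeg = 20 * np.sin(2 * np.pi * 10 * t) + 5 * rng.standard_normal((n_channels, t.size))
    print(f"relaxation index: {relaxation_index(eeg, fs):.2f}")
```

Whatever the specific metric, the ethical questions that follow concern how such scores are validated, how the underlying data are stored and shared, and how users interpret feedback about their own brains.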
Ethical concerns about enhancers involve both the academic and industrial developers of enhancers and the mental health service providers who act as gatekeepers to them. Additionally, users (sometimes a parent or caregiver) must decide whether or not to use enhancers. As enhancers increase in availability, employers and educators will face ethical dilemmas in the management and evaluation of those who might be unenhanced, enhanced, or over-enhanced. Furthermore, regulatory agencies may need to be consulted when considering “lifestyle” risks and benefits.
Considerations of ethical enhancement challenges and potential societal responses (see Farah et al., 2004) include the following issues: safety, coercion, distributive justice, and personhood. Enhancements comprise multifaceted interventions that may be unsafe. For example, memory enhancers may interfere with one’s ability to comprehend the learned information; enhanced memory does not guarantee the capacity to relate recalled information to new learning or other knowledge. Enhancers may also make persons feel compelled to enhance their cognitive capacities. Inga may be a determined student in a competitive academic program who aims for a career in a highly competitive job with equally high demands. Academic and employer demands for enhanced attention and comprehension of curricula may introduce explicit coercion. This may be compounded by her desire to remain competitive with her peers, an implicit coercion that creates an incentive for Inga to use enhancers.
Enhancers will likely not be justly distributed. Some people will not be able to afford enhancers, which adds to disadvantages already faced by people of low socioeconomic status. Yet unequal access is unlikely to justify prohibiting enhancements, any more than it justifies prohibiting private tutoring or cosmetic surgery. Finally, there are concerns that enhancements may impact personhood. This spans direct enhancements (e.g., psychopharmaceuticals, transcranial magnetic stimulation, or deep brain stimulation) and indirect enhancements (e.g., smartphone applications, navigation systems, schedule reminders, social media, notes, automated logging of data, automated calculations, and computing) that may interfere with a user’s identity and capabilities. At what point in enhancement does a person’s identity begin to dissipate? For some, self-transformation through enhancement can be viewed as self-actualizing; for others, enhancers may threaten personal identity.

Conclusion

This chapter reviewed existing regulatory structures that clinical psychologists should consider when using technologies: HIPAA, the HITECH Act (2009), and FERPA (the Buckley Amendment). Clinical psychologists are encouraged to learn and follow American Psychological Association guidelines and codes, as well as the licensure requirements of the jurisdiction in which they practice. In addition to attending technology-oriented workshops and continuing education programs, clinical psychologists will at times need to consult with attorneys who specialize in healthcare policy and privacy. Lastly, clinical psychologists need to be able to communicate legal protections to patients and participants.
The chapter discussed four principles established by Beauchamp and Childress (2001): autonomy, beneficence, nonmaleficence, and justice. For clinical psychologists, emphasis is often placed upon the American Psychological Association’s Ethical Principles of Psychologists and Code of Conduct (2002), which includes five principles: beneficence and non-maleficence; fidelity and responsibility; integrity; justice; and respect for people’s rights and dignity. These principles offer applicable standards for the use of technologies to provide mental health services and underscore that clinical psychologists need knowledge of the technical facets of a technology (e.g., privacy settings and encryption) to safeguard their patients. The chapter also discussed boundary uncertainties when using social media (e.g., following on Twitter; Facebook “friending”) and unrealistic expectations for email communications.
Another area of consideration involved the ethical implications of working with children, older adults, and certain clinical populations. Clinical psychologists must be especially vigilant when using technologies with these special populations. Of primary import are informed consent and protection; it is crucial that work with persons from special populations be ethical, sensitive, respectful, and protected. Research is needed to see how various clinical cohorts will react to virtual reality for both assessment and intervention. For some patient cohorts, VR may not add anything to much less expensive face-to-face therapy (e.g., VRET for combat stress symptoms).
Another framework introduced herein relates to the ways in which social media technologies and algorithmic devices (e.g., smartphones) have the potential to extend our cognitive, affective, and social processes beyond the wetware of our brains. An important component of our understanding of the cognitive, affective, and social processes found in telepsychology is the notion that technology is an extension of our cognitive processes. Smartphones aid our recall of information, comparison of new information to old, calculation, navigation, and translation; they allow us to access an abundance of information and guidance. Our remarkable evolutionary success is less a product of our large frontal lobes than of our capacity for extending cognitive processes into the environment with which we interact.
This idea is known as “extended mind” and characterizes human cognizing as comprising complex feedback (as well as feedfor-
ward and feed-around) loops among brain, body, and the peripheral world. If technology extends minds into the external world, we
should apply the same ethical considerations that govern our everyday lives to anything that results in extended mind loops.
The chapter also considered the ethical implications for clinical psychologists working with patients receiving deep brain stimulation. These included potential side effects, risks shared with other stereotactic neurosurgical procedures (e.g., intracranial bleeding), and hardware-related complications (e.g., dislocation, lead fracture, and infection). Furthermore, there are stimulation-induced side effects that vary with electrode location. Deep brain stimulation is an invasive procedure, and ethicists have considered the ethical implications of its potential side effects. The risks and benefits associated with deep brain stimulation must be balanced to maintain respect for the patient and the patient’s autonomous desires.
The chapter also discussed ways in which the extended mind approach can be applied to psychiatric practice. This idea can be
useful for considering digital technologies that deliver pervasive Internet connectivity. Smartphone technologies afford global,
cost-effective and evidence-based mental health services on demand and in real time. Clinical psychologists considering the ethical
challenges of eTherapy and smartphone applications will want to make sure that privacy is maintained. A patient’s smartphone
application use may result in a reliance, comparable to gambling, that could interfere with activities of daily living. Internet addiction is not an official diagnosis, but studies have reported addiction symptomatology in persons who overuse smartphones. This includes distorted perceptions of time spent on the smartphone, smartphone preoccupation, and withdrawal. Smartphones with
passive and active data analytics described above may result in personalized algorithms aimed at helping the patient feel better.
Furthermore, the smartphone application may increase vulnerability to other smartphone applications and social media platforms
designed to compel users to check for message notifications.
The chapter also made a distinction between therapy and enhancement. While eTherapy aims to treat problematic cognitions, affects, and/or behaviors, technological enhancements often involve augmenting or enriching non-clinical aspects of the participant to a status that is better than normal. Ethical concerns arise as technologies move from therapeutic interventions (e.g., eTherapy for clinical populations) to nootropic enhancements (e.g., smart drugs for non-clinical participants). Of note are ethical concerns related to the provision of mental health services, the meaning of “natural,” human dignity, and personhood.
In summary, this chapter discussed several issues relevant to clinical psychologists’ practice in the digital era. These guidelines and deliberations should be judiciously considered when using technologies in clinical research and practice. The chapter also discussed the significance of appropriate technology use, pertinent legal and ethical issues, maintaining secure electronic communications, and strategies for maintaining boundaries, as well as the considerations needed before clinical psychologists start using technologies in practice and research. Clinical psychologists in the digital age need a comprehensive understanding of privacy standards, confidentiality, and security. This chapter also examined ethical issues from a technologies-of-the-extended-mind framework. Technologies can extend the self via direct (deep brain stimulation; pharmaceuticals) and indirect (smartphone applications, navigation systems,
schedule reminders, social media, notes, automated logging of data, automated calculations, and computing) connections to our
neural networks. Concerns related to the therapy versus enhancement divide were considered.

References

Aboujaoude, E., Salame, W., Naim, L., 2015. Telemental health: a status update. World Psychiatry 14 (2), 223–230.
Allied Control Council, 1949. Trials of War Criminals Before the Nuernberg Military Tribunals Under Control Council Law No. 10. US Government Printing Office, Washington, DC.
American Counseling Association, 1999. Ethical Standards for Internet Online Counseling.
American Psychological Association, 2013a. Guidelines for the Practice of Telepsychology. Retrieved from: http://www.apa.org/practice/guidelines/telepsychology.aspx.
American Psychological Association, 2013b. Telepsychology 50-State Review. Retrieved from: http://www.apapracticecentral.org/advocacy/state/telehealth-slides.pdf.
American Psychological Association, 2017. Ethical Principles of Psychologists and Code of Conduct. Retrieved from: https://www.apa.org/ethics/code/ethics-code-2017.pdf.
American Psychiatric Association, 2013c. Diagnostic and Statistical Manual of Mental Disorders, fifth ed. American Psychiatric Association, Washington, DC.
American Psychological Association, 2002. Ethical principles of psychologists and code of conduct. Am. Psychol. 57 (12), 1060–1073.
American Psychological Association, 2010. American Psychological Association Ethical Principles of Psychologists and Code of Conduct. Retrieved from: http://www.apa.org/ethics/
code/.
American Telemedicine Association, 2009. Practice Guidelines for Videoconferencing-Based Telemental Health. Retrieved from: http://www.americantelemed.org/files/public/
standards/PracticeGuidelinesforVideoconferencing-Based%20TelementalHealth.pdf.
Aardema, F., O’Connor, K., Côté, S., Taillon, A., 2010. Virtual reality induces dissociation and lowers sense of presence in objective reality. Cyberpsychol. Behav. Soc. Netw. 13 (4),
429–435.
Bauer, R.M., Iverson, G.L., Cernich, A.N., Binder, L.M., Ruff, R.M., Naugle, R.I., 2012. Computerized neuropsychological assessment devices: joint position paper of the American
academy of clinical neuropsychology and the national academy of neuropsychology. Arch. Clin. Neuropsychol. 27 (3), 362–373.
Beauchamp, T.L., Childress, J.F., 2001. Principles of Biomedical Ethics. Oxford University Press, USA.
Bechara, A., Damasio, A.R., 2005. The somatic marker hypothesis: a neural theory of economic decision. Game. Econ. Behav. 52, 336–372.
Bermudes, R.A., Lanocha, K.I., Janicak, P.G. (Eds.), 2017. Transcranial Magnetic Stimulation: Clinical Applications for Psychiatric Practice. American Psychiatric Pub.
Bostrom, N., 2008. Drugs can be used to treat more than disease. Nature 451 (7178), 520.
Bostrom, N., Sandberg, A., 2006. Converging cognitive enhancements. In: Bainbridge, W.S., Roco, M.C. (Eds.), Progress in Convergence: Technologies for Human Wellbeing. New York, pp. 201–227.
Bostrom, N., Sandberg, A., 2009. Cognitive enhancement: methods, ethics, regulatory challenges. Sci. Eng. Ethics 15 (3), 311–341.
Bostrom, N., Savulescu, J., 2009. Human enhancement ethics: the state of the debate. In: Savulescu, J., Bostrom, N. (Eds.), Human Enhancement. Oxford University Press, Oxford, pp. 1–22.
Bray, A., 2008. The extended mind and borderline personality disorder. Australas. Psychiatry 16, 8–12.
Buchanan, A., Brock, D., Daniels, N., Wikler, D., 2001. From Chance to Choice. Cambridge University Press.
Calvo, R.A., D’Mello, S., 2010. Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans. Affect. Comput. 1 (1), 18–37.
Campbell, L.F., Millán, F.A., Martin, J.N. (Eds.), 2018. A Telepsychology Casebook: Using Technology Ethically and Effectively in Your Professional Practice. American Psychological
Association.
Carter, J.A., Palermos, S.O., 2016. Is having your computer compromised a personal assault? The ethics of extended cognition. J. Am. Phil. Assoc. 2 (4), 542–560.
Cernich, A.N., Brennana, D.M., Barker, L.M., Bleiberg, J., 2007. Sources of error in computerized neuropsychological assessment. Arch. Clin. Neuropsychol. 22, 39–48.
Chalmers, D., 2008. Foreword. In: Clark, A., Supersizing the Mind: Embodiment, Action, and Cognitive Extension. Oxford University Press, pp. ix–xvi.
Clark, A., Chalmers, D., 1998. The extended mind. Analysis 58 (1), 7–19.
Clark, A., 2008. Supersizing the Mind: Embodiment, Action, and Cognitive Extension. Oxford University Press, USA.
Clausen, J., 2010. Ethical brain stimulation–neuroethics of deep brain stimulation in research and clinical practice. Eur. J. Neurosci. 32 (7), 1152–1162.
Colombetti, G., Roberts, T., 2015. Extending the extended mind: the case for extended affectivity. Phil. Stud. 172, 1243–1263.
Daniels, N., 2008. Just Health. Cambridge University Press, New York.
Dennett, D.C., 1996. Kinds of Minds. Basic Books, New York, NY.
Department of Health, Education, and Welfare, 2014. The Belmont Report. Ethical principles and guidelines for the protection of human subjects of research. J. Am. Coll. Dent. 81 (3), 4.
Drayson, Z., 2009. Embodied cognitive science and its implications for psychopathology. Philos. Psychiatry Psychol. 16, 329–340.
Drayson, Z., Clark, A., 2020. Cognitive disability and embodied, extended minds. In: Wasserman, D., Cureton, A. (Eds.), The Oxford Handbook of Philosophy and Disability. Oxford University Press, pp. 580–597.
El Ayadi, M., Kamel, M.S., Karray, F., 2011. Survey on speech emotion recognition: features, classification schemes, and databases. Pattern Recogn. 44 (3), 572–587.
Elhai, J.D., Hall, B.J., 2016. How secure is mental health providers’ electronic patient communication? An empirical investigation. Prof. Psychol. Res. Pract. 46 (6), 444.
Facebook, 2014. Introducing Graph Search. Retrieved from: https://www.facebook.com/about/graphsearch.
Family Educational Rights and Privacy Act. Pub L No. 93-380, 1974.
Farah, M.J., Illes, J., Cook-Deegan, R., Gardner, H., Kandel, E., King, P., et al., 2004. Neurocognitive enhancement: what can we do and what should we do? Nat. Rev. Neurosci. 5
(5), 421–425.
Faurholt-Jepsen, M., Vinberg, M., Frost, M., Debel, S., Margrethe Christensen, E., Bardram, J.E., Kessing, L.V., 2016. Behavioral activities collected through smartphones and the
association with illness activity in bipolar disorder. Int. J. Methods Psychiatr. Res. 25 (4), 309–323.
Firth, J., Torous, J., Nicholas, J., Carney, R., Pratap, A., Rosenbaum, S., Sarris, J., 2017. The efficacy of smartphone-based mental health interventions for depressive symptoms:
a meta-analysis of randomized controlled trials. World Psychiatry 16 (3), 287–298.
Fitz, N.S., Reiner, P.B., 2016. Time to expand the mind. Nature 531, S9.
Joint Task Force Transformation Initiative, 2013. Security and privacy controls for federal information systems and organizations. NIST Special Publication 800-53.
Gamble, N., Morris, Z.A., 2014. Ethical and competent practice in the online age. In Psych 36, 18–19. Retrieved from: http://www.psychology.org.au/Content.aspx?ID=5851.
Greene, J.D., Nystrom, L.E., Engell, A.D., Darley, J.M., Cohen, J.D., 2004. The neural bases of cognitive conflict and control in moral judgment. Neuron 44 (2), 389–400.
Greene, J.D., Morelli, S.A., Lowenberg, K., Nystrom, L.E., Cohen, J.D., 2008. Cognitive load selectively interferes with utilitarian moral judgment. Cognition 107, 1144–1154.
Gupta, K., Sinha, A., Bhola, P., 2016. Intersections between ethics and technology: online client–therapist interactions. In: Ethical Issues in Counselling and Psychotherapy Practice.
Springer, Singapore, pp. 169–186.
Harris, J., 2010. Enhancing Evolution: The Ethical Case for Making Better People. Princeton University Press.
Health Information Technology for Economic and Clinical Health Act, 2009, 1717 (2), 226–279.
Hoffman, G.A., 2016. Out of our skulls: how the extended mind thesis can extend psychiatry. Phil. Psychol. 29 (8), 1160–1174.
International Society of Mental Health Online, 2009. Suggested Principles for the Online Provision of Mental Health Services. Retrieved from: http://www.ismho.org/suggestions.asp.
Jerritta, S., Murugappan, M., Nagarajan, R., Wan, K., March, 2011. Physiological signals based human emotion recognition: a review. In: 2011 IEEE 7th International Colloquium on Signal Processing and its Applications (CSPA). IEEE, pp. 410–415.
Kolmes, K., 2012. Social media in the future of professional psychology. Prof. Psychol. Res. Pract. 43 (6), 606.
Kwon, M., Lee, J.Y., Won, W.Y., Park, J.W., Min, J.A., Hahn, C., Kim, D.J., 2013. Development and validation of a smartphone addiction scale (SAS). PLoS One 8 (2), e56936.
Lannin, D.G., Scott, N.A., 2013. Social networking ethics: developing best practices for the new small world. Prof. Psychol. Res. Pract. 44 (3), 135–141.
Lanaj, K., Johnson, R.E., Barnes, C.M., 2014. Beginning the workday yet already depleted? Consequences of late-night smartphone use and sleep. Organ. Behav. Hum. Decis. Process 124 (1), 11–23.
Levy, N., 2007. Rethinking neuroethics in the light of the extended mind thesis. Am. J. Bioeth. 7 (9), 3–11.
Lieberman, M.D., 2007. Social cognitive neuroscience: a review of core processes. Annu. Rev. Psychol. 58, 259–289.
LiKamWa, R., Liu, Y., Lane, N.D., Zhong, L., June, 2013. Moodscope: building a mood sensor from smartphone usage patterns. In: Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services. ACM, pp. 389–402.
Lin, P., Allhoff, F., 2008. Untangling the debate: the ethics of human enhancement. NanoEthics 2 (3), 251.
Lin, Y.H., Lin, Y.C., Lee, Y.H., Lin, P.H., Lin, S.H., Chang, L.R., Kuo, T.B., 2015. Time distortion associated with smartphone addiction: identifying smartphone addiction via a mobile
application (App). J. Psychiatr. Res. 65, 139–145.
Luber, B., Lisanby, S.H., 2014. Enhancement of human cognitive performance using transcranial magnetic stimulation (TMS). Neuroimage 85, 961–970.
Lustgarten, S.D., 2015. Emerging ethical threats to client privacy in cloud communication and data storage. Prof. Psychol. Res. Pract. 46 (3), 154.
Lustgarten, S.D., Colbow, A.J., 2017. Ethical concerns for telemental health therapy amidst governmental surveillance. Am. Psychol. 72 (2), 159.
Lynch, G., Palmer, L.C., Gall, C.M., 2011. The likelihood of cognitive enhancement. Pharmacol. Biochem. Behav. 99 (2), 116–129.
Maheu, M.M., Pulier, M.L., Wilhelm, F.H., McMenamin, J.P., Brown-Connolly, N.E., 2005. The Mental Health Professional and the New Technologies: A Handbook for Practice
Today. Routledge.
Mayberg, H.S., Lozano, A.M., Voon, V., McNeely, H.E., Seminowicz, D., Hamani, C., Kennedy, S.H., 2005. Deep brain stimulation for treatment-resistant depression. Neuron 45 (5),
651–660.
Mayberg, H.S., 2009. Targeted modulation of neural circuits: a new treatment strategy for depression. J. Clin. Invest. 119 (4), 717–725.
McIntosh, E.S., 2011. Perspective on the economic evaluation of deep brain stimulation. Front. Integr. Neurosci. 5, 19.
Menary, R. (Ed.), 2010. The Extended Mind. MIT Press, Cambridge, Mass.
Merritt, M., 2013. Instituting impairment: extended cognition and the construction of female sexual dysfunction. Cognit. Syst. Res. 25–26, 47–53.
Milgram, S., 1963. Behavioral study of obedience. J. Abnorm. Psychol. 67 (4), 371–378.
Nagel, S.K., Hrincu, V., Reiner, P.B., 2016. Algorithm anxiety: do decision-making algorithms pose a threat to autonomy? Presented at the 2016 IEEE International Symposium on Ethics in Engineering, Science and Technology, Vancouver, BC, May 13–14.
Nagel, S.K., Reiner, P.B., 2018. Skillful use of technologies of the extended mind illuminate practical paths towards an ethics of consciousness. Front. Psychol. 9, 1251.
National Board for Certified Counselors, 1997. Standards for the Ethical Practice of WebCounseling. Greensboro, NC. Retrieved from: http://www.nbcc.org/Assets/Ethics/
NBCCPolicyRegardingPracticeofDistanceCounselingBoard.pdf.
Navarrete, C.D., McDonald, M.M., Mott, M.L., Asher, B., 2012. Virtual morality: emotion and action in a simulated three-dimensional “trolley problem”. Emotion 12 (2), 364.
Nelson, J.L., 2009. Alzheimer’s disease and socially extended mentation. Metaphilosophy 40 (3–4), 462–474.
Norcross, J.C., Hedges, M., Prochaska, J.O., 2002. The face of 2010: a Delphi poll on the future of psychotherapy. Prof. Psychol. Res. Pract. 33 (3), 316–322.
Norcross, J.C., Pfund, R.A., Prochaska, J.O., 2013. Psychotherapy in 2022: a Delphi poll on its future. Prof. Psychol. Res. Pract. 44 (5), 363–370.
Office for Human Research Protections [OHRP], 1979. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Retrieved from:
http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.
Pan, X., Slater, M., July, 2011. Confronting a moral dilemma in virtual reality: a pilot study. In: Proceedings of HCI 2011 the 25th BCS Conference on Human Computer Interaction,
vol. 25, pp. 46–51.
Parsons, T.D., 2015. Virtual reality for enhanced ecological validity and experimental control in the clinical, affective, and social neurosciences. Front. Hum. Neurosci. 1–19.
Parsons, T.D., 2016. Clinical Neuropsychology and Technology: What’s New and How We Can Use It. Springer Press, New York.
Parsons, T.D., 2017. Cyberpsychology and the Brain: The Interaction of Neuroscience and Affective Computing. Cambridge University Press, Cambridge.
Parsons, T.D., Kane, R.L., 2017. Computational neuropsychology: current and future prospects for interfacing neuropsychology and technology. In: Kane, R., Parsons, T.D. (Eds.),
The Role of Technology in Clinical Neuropsychology. Oxford University Press, pp. 471–482.
Parsons, T.D., McMahan, T., Kane, R., 2018a. Practice parameters facilitating adoption of advanced technologies for enhancing neuropsychological assessment paradigms. Clin.
Neuropsychol. 32 (1), 16–41.
Parsons, T.D., Lin, L., Cockerham, D. (Eds.), 2018b. Mind, Brain and Technology: Learning in the Age of Emerging Technologies. Springer.
Parsons, T.D., Duffield, T., 2019. National institutes of health initiatives for advancing scientific developments in clinical neuropsychology. Clin. Neuropsychol. 33, 246–270.
Parsons, T.D., 2019a. Ethical Challenges in Digital Psychology and Cyberpsychology. Cambridge University Press, Cambridge.
Parsons, T.D., 2019b. Neuroethics in educational technology: keeping the brain in mind when developing frameworks for ethical decision-making. In: Parsons, T., Lin, L.,
Cockerham, D. (Eds.), Mind, Brain, and Technology: How People Learn in the Age of New Technologies. Springer-Verlag, New York, pp. 195–210.
Patil, I., Cogoni, C., Zangrando, N., Chittaro, L., Silani, G., 2014. Affective basis of judgment-behavior discrepancy in virtual experiences of moral dilemmas. Soc. Neurosci. 9 (1),
94–107.
Presidential Commission for the Study of Bioethical Issues, 2014. Gray Matters: Integrative Approaches for Neuroscience, Ethics, and Society. In: Presidential Commission for the
Study of Bioethical Issues, vol. 1. http://www.bioethics.gov/sites/default/files/Gray%20Matters%20Vol%201.pdf.
Ragusea, A.S., VandeCreek, L., 2003. Suggestions for the ethical practice of online psychotherapy. Psychother. Theory Res. Pract. Train. 40 (1–2), 94.
Reger, G.M., Koenen-Woods, P., Zetocha, K., Smolenski, D.J., Holloway, K.M., Rothbaum, B.O., et al., 2016. Randomized controlled trial of prolonged exposure using imaginal
exposure vs. virtual reality exposure in active duty soldiers with deployment-related posttraumatic stress disorder (PTSD). J. Consult. Clin. Psychol. 84 (11), 946.
Reiner, P.B., Nagel, S.K., 2017. Technologies of the extended mind: defining the issues. In: Illes, J., Hossain, S. (Eds.), Neuroethics: Anticipating the Future. Oxford University Press,
Oxford, pp. 108–122.
Sabin, J.E., Harland, J.C., 2017. Professional ethics for digital age psychiatry: boundaries, privacy, and communication. Curr. Psychiatr. Rep. 19 (9), 55.
Schermer, M., 2010. Ethical issues in deep brain stimulation. Front. Integr. Neurosci. 5, 17.
Shennan, G., 2016. Extended mind, extended person, extended therapy? InterAction J. Solut. Focus Organ. 8 (1), 7–30.
Skulmowski, A., Bunge, A., Kaspar, K., Pipa, G., 2014. Forced-choice decision-making in modified trolley dilemma situations: a virtual reality and eye tracking study. Front. Behav.
Neurosci. 8, 426.
Slater, M., Antley, A., Davison, A., Swapp, D., Guger, C., Barker, C., et al., 2006. A virtual reprise of the Stanley Milgram obedience experiments. PLoS One 1 (1), e39.
Smart, P.R., 2012. The Web-extended mind. Metaphilosophy. 43 (4), 446–463.
Sneddon, A., 2002. Towards externalist psychopathology. Phil. Psychol. 15, 297–316.
Stanovich, K.E., 2012. On the distinction between rationality and intelligence: implications for understanding individual differences in reasoning. In: The Oxford Handbook of Thinking
and Reasoning, pp. 343–365.
Sun, F.T., Kuo, C., Cheng, H.T., Buthpitiya, S., Collins, P., Griss, M., October, 2010. Activity-aware mental stress detection using physiological sensors. In: International Conference
on Mobile Computing, Applications, and Services. Springer, Berlin, Heidelberg, pp. 282–301.
Torous, J., Roberts, L.W., 2017. The ethical use of mobile health technology in clinical psychiatry. J. Nerv. Ment. Dis. 205 (1), 4–8.
Turner, D.C., Sahakian, B.J., 2006. Neuroethics of cognitive enhancements. BioSocieties 1 (1), 113–123.
United States National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, 1978. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research, vol. 2. The Commission.
Ward, T., Casey, A., 2010. Extending the mind into the world: a new theory of cognitive distortions in sex offenders. Aggress. Violent Behav. 15 (1), 49–58.
Ward, T., 2009. The extended mind theory of cognitive distortions in sex offenders. J. Sex. Aggress. 15 (3), 247–259.
Wilson, R., Lenart, B., 2015. Extended mind and identity. In: Clausen, J., Levy, N. (Eds.), Handbook of Neuroethics. Springer, New York, NY, pp. 423–439.
Woodhead, M., Faulkner, D., 2008. Subjects, objects or participants? Dilemmas of psychological research with children. In: Research with Children. Routledge, pp. 26–55.
World Medical Association, 1964. World Medical Association Declaration of Helsinki - Ethical Principles for Medical Research Involving Human Subjects. World Medical Association,
Ferney-Voltaire, France.
Wu, D., Parsons, T.D., Mower, E., Narayanan, S., July, 2010a. Speech emotion estimation in 3D space. In: 2010 IEEE International Conference on Multimedia and Expo (ICME). IEEE, pp. 737–742.
Wu, D., Parsons, T.D., Narayanan, S.S., 2010b. Acoustic feature analysis in speech emotion primitives estimation. In: Proceedings of InterSpeech, Makuhari, Japan, September 26–
30, 2010.
Zeng, Z., Pantic, M., Roisman, G.I., Huang, T.S., 2009. A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell.
31 (1), 39–58.
Zur, O., 2008. The Google factor: therapists’ self-disclosure in the age of the Internet. Indepen. Pract. 28 (2), 83–85.