G Model

JFTR 2145 No. of Pages 9

Futures xxx (2016) xxx–xxx

Contents lists available at ScienceDirect

Futures
journal homepage: www.elsevier.com/locate/futures

Envisioning inclusive futures: Technology-based assistive sensory and action substitution

Helen T. Sullivan a,*, Shrirang Sahasrabudhe b

a Rider University, Lawrenceville, NJ, United States
b University of North Carolina, Greensboro, NC, United States

ARTICLE INFO

Article history:
Received 4 February 2016
Received in revised form 8 June 2016
Accepted 21 June 2016
Available online xxx

Keywords:
Technology
Disabilities
Sensory substitution
Inclusion
Accessibility

ABSTRACT

Mobile devices, tablets, smart phones, and now wearable technologies have changed our world, including how we perceive and interact with others and our environment. This change has potential for both positive and negative outcomes, especially for people with sensory, cognitive, and physical impairments. This work looks at some possible eventualities, with a focus on how these technologies may change the future for people with disabilities.

© 2016 Elsevier Ltd. All rights reserved.

1. Introduction

One thing we can count on is that technology is in constant change; past iterations arrived less rapidly than today's regular stream of new technological products and services available to consumers. For example, it took
75 years for television, from inception, to reach 50 million viewers. In contrast, a single app, Angry Birds, reached 50 million
users in 35 days (Aeppel, 2015). Though the comparison involves two vastly different technology landscapes, it nonetheless
signals that adoption rates of software-based applications can be extremely rapid. Moore’s Law describes ongoing
improvements in computational power and capacity, and consumer products embedding computing technology have made
the benefits tangible to the population at large. These changes have also influenced how we interact with technology, with a
shift to mobile devices that are more adaptable and personalized, that can be used anywhere and at any time, and away from
fixed location technologies such as traditional television and desktop computers (Bouwman & Van Der Duin, 2007). The
United Nations' International Telecommunication Union (ITU, 2015) reports that as of 2015 there were 7.1 billion mobile phone subscriptions in the world, effectively a parity of one phone per person on the planet and more than three times the number reported in 2005. Smartphones are now in the hands of approximately 2 billion individuals, and that
number is expected to continue to grow, as users shift away from the current, globally dominant feature phone. Thriving app
stores for the major smartphone platforms offer consumers broad options for social media, education, gaming,
entertainment, news, and productivity. While there has been past disagreement, many now suggest the smartphone
may be the only computer an individual will ever need (Bonnington, 2015; Economist, 2015), and for the developing world,
this will be especially true. Much as some countries skipped copper wire, dedicated land line telephony infrastructures and

* Corresponding author.
E-mail address: hsullivan@rider.edu (H.T. Sullivan).

http://dx.doi.org/10.1016/j.futures.2016.06.006
0016-3287/© 2016 Elsevier Ltd. All rights reserved.

Please cite this article in press as: H.T. Sullivan, S. Sahasrabudhe, Envisioning inclusive futures: Technology-based assistive
sensory and action substitution, Futures (2016), http://dx.doi.org/10.1016/j.futures.2016.06.006

jumped directly to mobile phone networks, many individuals now have the opportunity to skip the PC entirely and have as their first computer a smartphone that, in capability, surpasses a high-end PC from just 10 years ago.
With the increased portability and personalization these devices offer the consumer have also come other fundamental
changes. In many cases, end-user products have reached a point where continual improvement of software-based features can occur on a daily basis without user intervention; consider the notion of "evergreen" web browsers, for example, that are always up to date with the latest standards (Techopedia, 2015). Rather than requiring users to actively purchase new products to upgrade their technology, a potentially cumbersome process, operating systems, apps, and web browsers can now "silently" receive updates, for example to patch security vulnerabilities (Duebendorfer & Frei,
2009). Even modern automobiles, such as the Tesla, receive automatic software updates (Greengard, 2015).
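The decision at the heart of such an update check is small enough to sketch. The following illustrative Python is our own simplification, not code from any vendor's actual updater, and the function name is hypothetical:

```python
def needs_update(installed: str, latest: str) -> bool:
    """True if the installed dotted version string is older than the latest.

    Comparing component-wise as integers matters: "2.10.0" is newer than
    "2.3.1" even though it sorts earlier as a plain string.
    """
    parse = lambda v: [int(part) for part in v.split(".")]
    return parse(installed) < parse(latest)
```

An updater built on such a check could then fetch and apply the newer version in the background, which is the "silent" model described above.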
In the area of assistive technologies for those with disabilities, products such as Apple’s iPhone regularly receive updates
that may fix accessibility bugs or add new features, though the “silent” model of updating has lagged for these users; some
user intervention may still be needed to download and install the update. But even with the potential inconvenience posed
by the user intervention, the fact that most operating system software and many application updates arrive at no further cost
to the user is an improvement when compared to maintenance and upgrade fees charged by some assistive technology
vendors on the traditional PC platform.
Thus technological change is moving us from one requiring active adoption of new products or capabilities to one where
change is a given, even expected as an underlying baseline feature, as is the potential expectation that there will be constant
improvement in the technologies we use. Will this model uniformly flow through to the world of assistive technologies? And
more importantly as assistive features become more personal and personalized, what will be the impact for individuals with
disabilities? While this model, and the resultant expectations, are a positive development, there is evidence of some products
becoming “orphaned” when a vendor ends support, often without the end user being made aware (Miller, 2013). It will be
important to understand the risk of orphaning and how product futures are communicated to end users.
We can gauge potential changes and their future impact on individuals by recognizing key transitions from one technology to another and by placing each transition on a continuum: at one end, success through significant change that improves quality of life; at the other, what might be considered failure through the cementing of a status quo or, worse, the reduction of a capability. When a technology is successful, competition can arise with the development and availability of
similar products with similar, but not necessarily equal capabilities. As assistive technologies or capabilities become
ubiquitous, consumers will have ready access to off-the-shelf devices that can support their specific requirements. For many, access can then become self-directed rather than mediated through assistive technology specialists. Building reliance upon
consumer technologies to solve access challenges (physical/sensory), while providing a new level of independence, may in
fact result in a user becoming dependent and less likely to receive what might be termed expert advice and guidance on what
works best at the moment, and in future iterations of technology. With an emerging vendor diversity of similar products,
there will come opportunities for failure to meet the needs of the user, either through intentional restriction of capabilities,
restricted access to updated features, introduction of software bugs or security vulnerabilities, or through orphaning of
products (Degusta, 2011).

2. Enabling access

There are several key technical developments, becoming ubiquitous, that are enabling a range of new assistive technology
features within mainstream products, such as smartphones. To understand the complete picture of how individuals with
disabilities are impacted, we will take a brief look at these developments.
First, let's put these developments in the perspective of assistive technologies. By definition (ATIA, 2015):
“Assistive technology (often abbreviated as AT) is any item, piece of equipment, software or product system that is used to
increase, maintain, or improve the functional capabilities of individuals with disabilities.”

Smartphones, for example, provide a variety of methods by which functional capabilities of individuals can be increased,
maintained, or improved. Accessibility settings are now common across a range of products from different vendors (see
Fig. 1). Built-in features now include screen readers, screen magnification, color contrast adjustments, speech recognition,
and switch access. While these features owe their heritage and technical foundations to assistive technology originally
pioneered for personal computers in the 1980s and 1990s, the design of smartphones has led to innovations, such as the use
of touch gestures to control a screen reader. Beyond built-in assistive features, low cost and sometimes free third-party apps
provide functions including camera-based magnification, augmentative communication, and specialized tools for a range of
disabilities. Even televisions include built-in accessibility, moving beyond the already required closed captioning and
secondary audio, with screen reading capabilities and control through spoken commands.
Beyond these explicit assistive features, the devices themselves contain important enabling technologies including
multiple cameras, microphones, accelerometers, compass, GPS, light and proximity sensors, barometers, vibratory (haptic)
feedback, Bluetooth connectivity, WiFi, and Near Field Communications (NFC). Combining any one of these core features
with software opens the door to a range of fascinating new opportunities and ways to assist individuals with disabilities. For



Fig. 1. Three screen captures of the Accessibility settings from an Apple iPhone, Android tablet, and Windows Phone (Ease of Access).

example, many apps are available to measure pulse rate by placing a finger across the device's camera and LED flash to
create a software driven photoplethysmograph. Philips Corporation has created an app1 that measures both heart rate and
respiration using the device camera to detect minute changes in skin coloration and the rise and fall of the user’s shoulders.
While devices may have built-in assistive capabilities, wireless interfaces such as Bluetooth enable connectivity to a range of external devices and sensors that can facilitate or provide additional capabilities, or enable interaction with other products or services. With the rich platform capabilities typical of mobile devices and smartphones, the stage has been
set for a new realm of assistive technology solutions, commercially available in mainstream products and in conjunction
with specialized or off the shelf hardware providing specific assistive solutions. In the following sections we will explore how
these devices and capabilities will impact individuals with disabilities in the future.

3. Sensory and action substitution

As described previously, assistive technologies serve to increase, maintain, or improve function that may be impaired by a
disability. We will begin our exploration of technological futures for assistive technologies with what we describe as Sensory
and Action Substitution. While the concept of sensory substitution (or transformation) is not new (e.g. Bach-y-Rita, Collins,
Saunders, White, & Scadden, 1969), it has the potential to become far more prevalent as a result of new, ubiquitous
technologies, advanced further by increasing computational power in mobile devices and further augmented by crowd or
cloud-based resources. It is not so much substitution as technological transformation, translation, or augmentation. But it
can go one step further if we think about both the sensory process to assimilate environmental information and the physical
actions needed to interact with our environment. This can become not only sensory substitution but also “action
substitution," important for those who have mobility/dexterity limitations.
To better understand the technological underpinnings of the substitution model we will next focus on changes which can
be expected for people with specific disabilities through the use of the enabling technologies. We will begin with an
examination of how sensory substitution can address the challenges experienced by people who are deaf or hard of hearing.
One of the challenges faced by people with a hearing impairment can be an inability to detect events in their environment
cued by sounds. The hearing aid is an existing technology that may enable reception of these cues for people with some
hearing, but now with the growing ubiquity of smart phones, tablets and apps that can perform intelligent signal processing,
new functions can be performed, such as signaling the occurrence of environmental sounds which may not otherwise be
detected. For example, a sound monitoring app on a smartphone with intelligent auditory pattern matching and robust
speech recognition could detect alarm signals or spoken communication, and then signal it to the user with a vibration cue
and an on screen textual or ASL description. A further development, the emerging ubiquity of microphone arrays, such as
found in the Microsoft Kinect and the Amazon Echo, enable localizing the source of the sound spatially and could provide

1 http://www.vitalsignscamera.com/.


directional information to the user (e.g., "there is the sound of breaking glass behind you") (Jain et al., 2015). Researchers have already used products such as the Kinect to explore personal safety applications, such as identifying the sound of someone falling in the home, a major concern for the elderly living alone (Li, Banerjee, Popescu, & Skubic, 2013).
Vision substitution is already well established through text to speech capabilities on mobile devices, providing spoken
presentation of textual information on displays and in apps (i.e., via a screen reader application such as VoiceOver). The presence of cameras on mobile devices has further opened the door to real-time (or near real-time) transformation of
information from the visual environment of the user. Apps are available to identify colors (useful when selecting clothing to
wear), paper currency (to distinguish a $1 bill from an equally sized $20), and printed text (whether on a page of a book or on
a sign). The common GPS capability in a smart phone can transform the user’s location into spoken form and provide
directions. One of the better known sensory substitution apps is the vOICe,2 which uses a mobile phone's camera to create an
auditory soundscape of the field of view, enabling some users to navigate a built environment.
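The core mapping behind such a soundscape can be shown compactly. The sketch below is our own simplification of the general row-to-pitch, column-to-time idea, not the vOICe's actual algorithm, and the parameter defaults are arbitrary; it converts a grayscale image into a list of tone events:

```python
def image_to_soundscape(image, f_low=500.0, f_high=5000.0, sweep_s=1.0):
    """Map a grayscale image (rows of 0..255 values) to tone events.

    Returns (time_s, freq_hz, amplitude) tuples: columns become time in a
    left-to-right sweep, rows become pitch (top = high), and brightness
    becomes loudness.
    """
    n_rows, n_cols = len(image), len(image[0])
    events = []
    for col in range(n_cols):
        t = sweep_s * col / n_cols
        for row in range(n_rows):
            level = image[row][col] / 255.0
            if level > 0:
                frac = 1.0 - row / (n_rows - 1) if n_rows > 1 else 0.5
                freq = f_low + (f_high - f_low) * frac
                events.append((t, freq, level))
    return events
```

A synthesizer would then render each event as a brief sine tone, producing the sweeping soundscape that trained users learn to interpret spatially.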
Thus a key development is the addition of software applications and hardware to devices such as smartphones that can
augment or replace human senses via sensory substitution. Another change that will influence how substitutions may be overlaid on or blended with functional senses (e.g., vision for those who cannot hear, and 3D spatial audio for those who cannot see) is the rapidly developing field of augmented reality (see, for example, Van Krevelen & Poelman, 2010). Consumer
grade augmented (and virtual) reality products are coming on the market and seeing practical use today in entertainment,
education, medicine, and other fields. These include devices such as Google's Glass and Cardboard, Microsoft's Hololens, Oculus
Rift, and apps such as Google Ingress, Snapshop, and Yelp Monocle. The use of augmented reality will enable salient
information and cues, garnered from sensory transformers, to be overlaid upon a user’s visual field (either in virtual
environments, such as the Oculus Rift, or overlaid on the real world in products such as Hololens). For those with visual
impairments, sensory transforms can be placed into the auditory space of the user through 3D audio technology, and
presented via headphones. In either case functional senses are augmented with information that comes from a technological
transformation of sensory information.
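Placing a cue into a listener's auditory space can be approximated very simply. The sketch below assumes Woodworth-style interaural time difference plus constant-power panning rather than full head-related transfer functions, so it is a rough illustration of the principle, not a production 3D audio engine:

```python
import math

def pan_cue(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference and left/right gains for a
    sound cue at the given azimuth (-90 = hard left, +90 = hard right).
    """
    az = math.radians(max(-90.0, min(90.0, azimuth_deg)))
    # Woodworth's approximation for the extra path length around the head.
    itd_s = head_radius_m / speed_of_sound * (az + math.sin(az))
    # Constant-power pan: equal gains at centre, all one ear at +/-90.
    pan = (azimuth_deg + 90.0) / 180.0          # 0 = left, 1 = right
    gain_left = math.cos(pan * math.pi / 2)
    gain_right = math.sin(pan * math.pi / 2)
    return itd_s, gain_left, gain_right
```

Delaying and scaling a headphone signal by these values is enough to give a usable left/right impression; full spatialization additionally filters the sound by direction-dependent ear responses.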
Complementary to sensory substitutions are action substitutions. While perceiving our environment through sensory
channels is an essential capability, acting upon our wishes and needs often requires interacting with and activating devices,
appliances, and systems within our homes, public spaces, and workplaces. The advent of the Internet (or Web) of Things is
enabling non-physical, or virtual, connections to a range of appliances, home automation systems, and devices. This now
enables what we call “action substitution”, in which the device, such as a smartphone, becomes the means of taking physical
actions in one’s environment that would otherwise be difficult or impossible given a physical or mobility impairment.
A prime example of action substitution comes from home automation technologies such as smart thermostats (e.g., Google's Nest), lighting systems, or smart monitoring and alarm systems. This is an evolution of a technology originally developed for individuals with disabilities, known as Environmental Control Units (ECUs). While ECUs would be considered specialized assistive technologies designed for those with physical or mobility limitations (Dickey & Shealey, 1987), the current trend of home automation is a mainstream consumer product category. When used in conjunction with
accessible smartphones, for example, home automation products offer significant capabilities off the shelf. In the instance of the thermostat: a person who uses a wheelchair or has limited use of a hand may not be able to easily operate a traditional thermostat on the wall of the home, but most smart thermostats allow full control from a smartphone app, combined with accessibility features native to the phone. Action substitutions can also occur when a smart system observes
certain behaviors or events and automatically takes action on behalf of the user. With emerging health monitoring systems used by individuals as well as by the health support industry, a connection can emerge between the user and an external health professional, the technology playing the role of a bridge by alerting the physician to events unfavorably impacting health, such as a fall or another condition monitored via these technologies. The rapid call for assistance would lessen the consequences of such health events and result in better outcomes for individuals. Commercial emphasis is currently focused on the aging population, but automated monitoring and notification have the potential to increase quality of life for all: vulnerable individuals who might otherwise be unable to get assistance, as well as the constellation of their families, gain more independence and less disruption and stress.
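The decision logic such an app or bridge performs on the user's behalf can be tiny. A hypothetical sketch, not any real thermostat vendor's API, of choosing the command a smartphone app might send to a connected thermostat:

```python
def thermostat_command(current_c, target_c, deadband_c=0.5):
    """Choose the command an app-controlled thermostat should receive.

    The deadband prevents rapid heat/cool cycling around the setpoint.
    Returns "heat", "cool", or "hold".
    """
    if current_c < target_c - deadband_c:
        return "heat"
    if current_c > target_c + deadband_c:
        return "cool"
    return "hold"
```

The substitution lies in where the command originates: an accessible touchscreen or voice interface, rather than physical manipulation of the wall unit.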
Today's simple vibratory feedback will evolve into sophisticated haptic interfaces that can serve as a substitute for hearing by transforming environmental sounds into tactile cues or into another sense. While people with hearing impairments may currently use a vibratory alarm in their pillows for fire or other urgent notification, the potential exists to use vibrotactile cues to convey sound qualities, directionality, and rhythm, serving as an indicator of a need to respond quickly when the user is awake and away from the pillow alarm. Digital pedometers, such as the fitness and health monitoring devices used to track daily exercise, already record physiological measures such as hours asleep, heart rate, calorie intake, water intake, and much more. These devices provide vibrations on a wrist band as notification of reaching a goal, and the wrist-borne vibrator can be repurposed to generate "silent" alarms to awaken the wearer. Such wearables are only the tip of the iceberg for the changes to be brought by insightful development of useful technologies.
Whether information is generated by the user (such as health data gathered by wearable devices) or received from outside sources, and then passed from person to person for conversation, interaction with medical staff, or exchanges between students and teachers, it is anticipated that continued miniaturization and blending of technology will place the focus not on the

2 https://www.seeingwithsound.com/.


technological tool, but on the task that the user wishes to accomplish. This trend will integrate ever more sophisticated
technologies into devices we use and wear, linking more technological platforms, and evolving an elegant simplicity of use.
Today's effortful techniques will transform, blurring the line between technology working in synchrony with the body and what is actually part of our functioning human body. Health and well-being monitoring can occur in subtle ways: data could be continuously collected and analyzed, cueing the individual with gentle prods from the wearable technology, or with more overt guidance, to urge the user towards maintaining healthful behavior. Technology could take on the job of a personal
assistant especially for those who may be alone.

4. Trends

What are the trends or themes that underlie all of this? Technology has moved to a smaller and smaller scale, to what is now wearable, and will move beyond to what may come to be considered a technology/human hybrid. With our interconnectedness with others increasing, and with support from the Internet of Things, continuous sensing of a person and their environment, conveying salient information when needed, is completely feasible.
With the advent and diffusion of computers and the internet in developing countries, the technology landscape for individuals with visual impairments, for example, has changed significantly. Computers have offered access to print material through use of optical character recognition and via screen readers, enabling multiple possibilities in personal and
professional life. The advent of the internet and the web further expanded those possibilities by providing an unprecedented
avenue to knowledge and a means to connect and contribute to society at large. The third major step in the digital revolution,
the development and proliferation of smartphones, is creating real disruption by virtually matching the senses of vision and
hearing, enabling both sensory and action substitutions.
For example, the Apple iPhone is no longer a mere communication device; it has become a way to provide access to visual information in multiple ways. The TapTapSee3 app allows a visually impaired individual to take a picture of any object and have a remote human assistant describe the object in real time, which enables that individual
to indirectly perceive some visual information about the object. The DigitEyes4 app allows the user to scan barcodes and QR
codes, and to then record associated audio descriptions, enabling later identification of objects they may interact with in
their homes or at work. KNFB Reader5 is another innovative app, which allows its users to take a picture of printed text and then converts the picture to text in real time using optical character recognition. The text may then be read to the user via a device's screen reader. While such apps may yield imperfect results, depending upon the quality of the source material
scanned, such imperfection is likely to be overcome. These and many other apps are turning smartphones into powerful tools
for those with visual impairments.
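The pattern behind DigitEyes-style labeling is, at its core, a simple associative store. A hedged sketch with hypothetical class and method names, not the app's real code; a real app would store a recorded audio clip rather than text:

```python
class AudioLabelStore:
    """Associate scanned barcodes with user-recorded descriptions.

    In a real app the description would be an audio recording played
    back through the device; plain text stands in for it here.
    """

    def __init__(self):
        self._labels = {}

    def record(self, barcode, description):
        """Attach a description to a scanned barcode or QR code."""
        self._labels[barcode] = description

    def identify(self, barcode):
        """Return the stored description, or a fallback for new items."""
        return self._labels.get(barcode, "unknown item")
```

The barcode acts as a durable, machine-readable handle on an otherwise visually indistinguishable object, letting the user re-identify cans, boxes, or files independently.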
Another important trend for individuals with visual impairments is that the cost of assistive technology has decreased
rapidly. Many assistive technologies can be costly, in some cases exceeding the price of the computer they are intended to make usable by someone with a visual impairment. A commercial screen reader for Microsoft Windows may cost up to $1000. Now, with the emergence of iOS and Android powered phones with built-in assistive technologies, end users can perform required word processing, internet browsing, email, and social networking using their smartphones, without any additional cost for a screen reader. These advancements have offered unique benefits to the population of those with visual
impairments in the developing countries. Sahasrabudhe and Palvia (2013) studied educational challenges of visually
impaired students in India. Their investigation revealed that, until five years ago, most students with visual impairments
significantly relied upon use of simple technologies such as audio cassette recorders. They also identified affordability as one
major factor for adoption of the assistive technologies in India. Affordability of the technology will be a key factor in diffusion
of the current and emergent technologies among the population of the visually impaired. Apple iOS powered devices are effective tools for those with visual disabilities, yet they are prohibitively expensive for individuals in developing countries such as India, which houses the world's largest population of the visually impaired. Therefore, Android powered phones, which can be affordable to those in developing countries, will have significant impact and provide market opportunities for improving the assistive potential of the Android app ecosystem.
Multimodal interaction is another trend, encompassing new spoken and gestural interaction possibilities. With the
advent of Apple Siri,6 Amazon Echo,7 and Microsoft Cortana,8 the technology itself can become transparent, in that exchanges become more natural, though not yet fully conversational. Sensors that can assess physical state, motion, and gestures, and systems that understand speech, change the game for everyone's relationship with technology.
Prior to the smartphone revolution, modern technological innovations, particularly those developed since the advent of
the computer, have proliferated, meeting the needs of a targeted, average consumer, while generally offering no intentional
support for people with sensory deficits. Nonetheless even early mobile phones provided the potential to enable assistive

3 http://www.taptapseeapp.com/.
4 http://www.digit-eyes.com/.
5 http://www.knfbreader.com/.
6 http://www.apple.com/ios/siri/.
7 http://www.amazon.com/echo.
8 http://www.microsoft.com/en-us/mobile/experiences/cortana/.


features, whether realized or not. Exclusionary design, that is, a failure to consider accessibility, forces reliance on legacy accommodations, if any exist, forgoing the potential independence provided by newer innovations.
Fortunately, the capability for innovative supports is inherent in the technology of a modern smartphone, and in many cases
just a matter of creating or integrating the required software or web service. Innovative companies developing the next
generations of consumer products need to follow the smartphone model, and consider the benefits to all of their potential
customers. If technologies, currently under development, fail to follow inclusive design approaches and rely upon
proprietary app development models, for example, the needs of people with disabilities will not be met. The benefits of
accessible design and technology that enables adaptation and transformation are clear, as are the negative outcomes of
failing to do so. These negative outcomes include fragmentation that may force those with disabilities to stay with older
technologies while the general population moves to newer, though less accessible, capabilities and features. When communications tools, such as social media, migrate to newer technology platforms, whole groups of individuals, such as the deaf, may be excluded if a new system is based purely on audio, for example. Fragmented communities can further undermine, for example, crisis communications during events such as natural disasters. Hurricane Katrina highlighted such communications failures, and the need to seek out human sensory transformers, for example via schools for the deaf, where people with hearing impairments could find knowledgeable support aware of their specific needs. Thus,
negative outcomes include fragmented communities of those with disabilities, not integrated within dominant
communications systems, and dependent upon human intermediaries to transfer and translate information.

5. Future snapshots

A transformational future is what people with disabilities may be moving towards if technology is able to remove the
barriers that slow or impede communication between individuals, products, and services. Acceptance of technological
patches to repair impairments will likely become commonplace. When the technologies are based on mainstream consumer
products, the technology itself can become transparent in its use as an assistive technology. Specific technologies or product
features initially designed for and used largely for one task by people without disabilities will regularly be leveraged to
enable accessibility and support for those with disabilities. When assistive devices were more specialized or single purpose,
lacking the industrial/aesthetic design common in mass produced consumer products, there was potential stigma associated
with their use. The blending of assistive technologies within mainstream consumer products reduces the stigma that may be
traditionally associated with specialized assistive technologies (Shinohara & Wobbrock, 2011). Technological aids based on
mainstream products, whether required, adopted by choice or as a matter of convenience, will be the new norm. People will
move towards experiencing a seamless interaction with others and their environment in general through the novel use of
augmenting tools. As the path of these developing technologies grows, some will inevitably fail while others will be widely
accepted because they positively impact and improve quality of life. The beneficiaries of such transformations, initially
designed for people with sensory impairments, may be the greater population at large. Much as the original curb cut saw
broad benefit for all who traverse city streets, and closed captions on broadcast television aided language learners and those
in noisy environments, emerging sensory and action transformation will likely find broader use beyond their originally
intended audiences.
Some possible scenarios for three representative groups follow. These will serve as potential snapshots of the future.

5.1. Hearing

In the future, whether devices run discrete apps or, more likely, become platforms of seamless personal services and
functions, they will have the capability to serve as sensory transformers or substitution mechanisms. Individuals with
impaired or no hearing may use the devices to communicate with others using sign language, via a camera and gesture
recognizer; to obtain updates for their digital cochlear implants, upgrading the software to allow for enhanced hearing; or to
transform the spoken word into text narratives. With wearable technology, such as Google Glass, spoken dialog will be
transformed into text captions visible only to the wearer. Students in the classroom who are deaf or hearing impaired would
have barriers removed or significantly lowered by utilizing wearable speech processors that provide real-time captions.
With such transformations, people with hearing impairments might be envisioned as being aware of their environments
through some mimic of the normal hearing process, or by mapping the function of the ear to another sense, such as
vibrotactile coding mechanisms embedded within wearable devices like wrist bands.
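The ear-to-touch mapping mentioned above can be sketched in miniature. The following illustrative Python fragment maps the energy in a few audio frequency bands to vibration intensities for motors in a hypothetical wristband; the band edges, motor count, and dynamic-range constants are assumptions for illustration, not a description of any shipping device.

```python
import math

# One motor per frequency band (band edges in Hz are illustrative).
BANDS_HZ = [(20, 300), (300, 1000), (1000, 4000), (4000, 8000)]

def band_energy_to_intensity(energy, floor=1e-6, dynamic_range_db=40.0):
    """Compress a band's energy into a 0.0-1.0 motor intensity.

    A logarithmic (dB) mapping mirrors how loudness is perceived, so quiet
    sounds still produce a perceptible vibration.
    """
    level_db = 10.0 * math.log10(max(energy, floor) / floor)
    return min(level_db / dynamic_range_db, 1.0)

def encode_frame(band_energies):
    """Turn one frame of per-band energies into per-motor intensities."""
    return [round(band_energy_to_intensity(e), 2) for e in band_energies]

# A loud low band, moderate mids, and near-silent highs:
print(encode_frame([1e-2, 1e-4, 1e-5, 1e-6]))  # -> [1.0, 0.5, 0.25, 0.0]
```

A real system would of course add per-band filtering of live audio and per-user calibration; the point of the sketch is only that the mapping itself is computationally trivial for a wearable.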
Technological improvements are also affecting the cochlear implant. Once highly visible, it may become virtually hidden on
the user. Earlier implants were more prone to failure or required replacement to gain functional improvements; now,
software upgrades are managed through an app. In the future, those fitted with such a device may make fine adjustments
through a wireless connection to muscle signals, with effort no greater than a momentary tightening of a set of facial
muscles.
If the technology is introduced at the onset of hearing loss, or early in life for a child born with reduced or no hearing, the
individual's experience of the world may be largely unaffected by the impaired sensory input as a result of technological
substitution. Potentially, neither the individual nor those interacting with them would be aware of any difference.

Please cite this article in press as: H.T. Sullivan, S. Sahasrabudhe, Envisioning inclusive futures: Technology-based assistive
sensory and action substitution, Futures (2016), http://dx.doi.org/10.1016/j.futures.2016.06.006

5.2. Seeing

People with vision impairments can anticipate a future with a bionic eye equivalent of the cochlear implant or the
conversion of visual stimuli processing into a different modality. Greater independence would ideally result. People with
visual impairments will adopt any move in this direction, brought about through emerging technologies, much in the same
way many have gravitated to the smartphone.
In the future we can expect an ecosystem of apps, each of which enables individuals with visual impairments to accomplish a
particular task, while the entire gamut together mitigates the visual deficit to a significant extent. That ecosystem will
provide a scalable mechanism to approximate, if not completely substitute for, the sense of vision.
Navigation in built environments will be facilitated by enhanced location-based information systems. Bluetooth beacons are
one example; augmented by wireless positioning techniques, GPS, and 3D depth sensors, they will enable greater
independence for those with visual impairments.
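To make the beacon idea concrete, the following minimal Python sketch estimates the distance to a Bluetooth beacon from its received signal strength (RSSI) using the standard log-distance path-loss model. The reference power (RSSI at 1 m) and the path-loss exponent are assumptions that a real wayfinding system would calibrate per beacon and per environment; the beacon names are hypothetical.

```python
# Log-distance path-loss model: d = 10 ** ((tx_power - rssi) / (10 * n))

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    """Rough distance in metres from a beacon's RSSI reading.

    tx_power_dbm is the RSSI measured at 1 m; n is the path-loss exponent
    (about 2 in free space, higher indoors). Both would be calibrated.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

def nearest_beacon(readings):
    """Pick the beacon a wayfinding app might announce next.

    readings: dict mapping beacon name -> latest RSSI in dBm.
    """
    return min(readings, key=lambda name: estimate_distance_m(readings[name]))

readings = {"entrance": -75.0, "elevator": -62.0, "cafe": -80.0}
print(nearest_beacon(readings))               # strongest signal -> shortest estimate
print(round(estimate_distance_m(-59.0), 1))   # RSSI equal to tx_power -> 1.0 m
```

In practice RSSI is noisy, so deployed systems smooth readings over time and fuse them with the other positioning sources mentioned above rather than trusting a single estimate.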

5.3. Cognitive

People with cognitive impairments would be able to understand information that currently is unintelligible to them.
Examples include simplifying language, or transforming it into symbology, for those recovering from a stroke that has
affected the speech-processing centers of the brain. Individuals with learning disabilities could receive individualized
re-coding of information tailored to specific conditions such as dyslexia. Reminders and automated logging of daily activities
could be combined with simplified presentation of guidance for an elderly population with dementia.
Can this future technology filter inputs and adapt them to meet an individual's needs? Much as acoustic filters and amplifiers
remove noise and accentuate salient audio bands, emerging information filters and amplifiers for visual information can
remove "clutter" and "noise" in content and accentuate key information.
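A toy sketch of such an information filter appears below: a Python fragment that substitutes complex words with simpler equivalents while preserving sentence structure and capitalization. The four-entry lexicon is a deliberately tiny illustrative stand-in; a real system would draw on curated simplification resources and reading-level models.

```python
import re

# Illustrative lexicon of complex word -> simpler equivalent.
SIMPLER = {
    "utilize": "use",
    "commence": "start",
    "approximately": "about",
    "terminate": "end",
}

def simplify(text):
    """Replace each known complex word, leaving the rest of the text intact."""
    def swap(match):
        word = match.group(0)
        simple = SIMPLER.get(word.lower(), word)
        # Preserve sentence-initial capitalization.
        return simple.capitalize() if word[0].isupper() else simple
    return re.sub(r"[A-Za-z]+", swap, text)

print(simplify("Utilize the ramp to commence boarding at approximately noon."))
# -> Use the ramp to start boarding at about noon.
```

Even this crude substitution illustrates the "amplifier" idea: the salient content survives unchanged while incidental lexical complexity is stripped away.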

5.4. Mobility

For people with mobility or dexterity impairments, controlling one’s environment will become possible through
standard, wireless interfaces connecting to assistive interfaces, as well as through consumer products supporting
interactions such as speech or gestures. While this had been anticipated for some time (Myers, Nichols, Wobbrock, & Miller,
2004), and standards have been under development in the assistive technology domain (Zimmermann et al., 2004), it has
been the consumer product field that has driven the foundational technology forward.
Gone are the days when specialized physical or electronic interfaces were required for even limited control of appliances,
heating/cooling, lights, and entertainment. Standards-based home networks will enable interaction via a range of devices
and modalities. We have seen the availability of consumer technologies allowing spoken or gestural control using home
automation products. Homes are becoming networked environments, with lighting, entertainment, door locks, security
features, kitchen appliances and heating/cooling controlled from smart phones or new devices with spoken interfaces.
Physical contact with appliances and other objects in the home is no longer required. If you find your home too warm, you
can command products such as Amazon's Echo (see Fig. 2) in natural language: "Alexa, lower the temperature to 70°." If you
are moving in your wheelchair from one room to another at night, you can issue a command such as "Alexa, turn on the living
room lights." Additional sensors and environmental controls, detecting for instance that you have entered a room, will be
important parts of future homes.
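The commands above can be reduced to a simple intent-and-slot structure, which is roughly how voice platforms process them. The following Python sketch parses two command shapes into intents; the grammar and device names are assumptions for illustration, not the actual interaction model of Alexa or any other platform.

```python
import re

def parse_command(utterance):
    """Map a spoken home-automation utterance to an intent with slots."""
    text = utterance.lower().strip().rstrip(".")
    # "Alexa, lower/raise/set the temperature to N"
    m = re.match(r"(?:alexa,?\s*)?(lower|raise|set) the temperature to (\d+)", text)
    if m:
        return {"intent": "set_temperature", "degrees": int(m.group(2))}
    # "Alexa, turn on/off the <room> lights"
    m = re.match(r"(?:alexa,?\s*)?turn (on|off) the (.+?) lights?", text)
    if m:
        return {"intent": "lights", "state": m.group(1), "room": m.group(2)}
    return {"intent": "unknown"}

print(parse_command("Alexa, lower the temperature to 70."))
print(parse_command("Alexa, turn on the living room lights"))
```

Once an utterance is reduced to an intent like this, dispatching it over a home network to a thermostat or light switch is straightforward, which is why voice control and standards-based home networking reinforce each other.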

5.5. The impact on individuals and society

Each of the disability groups described in the prior sections would benefit from the availability of the technologies
described. This future would move us away from technology that draws attention to a disability and make commonplace the
fact that an individual with a disability is accomplishing some task, communicating with others or enjoying a book using the
same consumer product as their peers. Developments of this nature can reduce the impact of stress and anxiety associated
with the challenges specific to each category of sensory or cognitive impairment, and support from others may be more
readily attained when devices share common features. Though there may be differences in the exact method of interaction,
those with and without disabilities will share common products, goals and experiences. The ability of individuals to use
these technologies to reduce or eliminate the functional limitation of a sensory, cognitive, or motor impairment and enable
them to more readily, and independently, participate in work, school, and society would clear away needless challenges and
barriers to participation that occur today. And as earlier accessibility solutions have shown (e.g., the curb cut), these
technological innovations may prove of use to all, providing engaging and novel means of interacting with the world and
with others in it. Looking ahead, we may see a blurring of the differences that an individual perceives when encountering
someone with a sensory, cognitive, and/or motor disability, with technology eliminating any obvious functional limitation
relative to those without disabilities engaged in similar activities.



Fig. 2. A photo of the Amazon Echo, a cylindrical device, black in color, with an illuminated blue ring encircling the top. (For interpretation of the references
to color in this figure legend, the reader is referred to the web version of this article.)

6. Conclusion

Predicting the future in any domain is difficult, and when technology is involved, predictions can often be far off from
eventual outcomes. But the rate of technological innovation, coupled with the rate of consumer adoption of those
innovations (cell phones, for example), has been world changing. The incorporation of sophisticated computational and sensor
technologies in consumer devices, combined with the adaptable nature of software (or apps) that drive these devices,
strongly suggests a future that holds tremendous opportunity for enhancing the lives of those with sensory, cognitive, and
physical disabilities. We feel that key enablers of such a future will be the continuation of cost reduction for both devices and
data services, support by device manufacturers for accessible or universal design in their products, open hardware and
software platforms that enable individual developers to take advantage of connectivity to external devices and services, and
the continuing ingenuity of individuals who can see the potential in these devices to create innovative solutions to meet the
needs of those with disabilities. Direct benefit for those with disabilities will come from the ability of these devices to
function as sensory and action transformers, which will enable, for example, individuals to receive information and
communicate even when a given sensory channel is impaired. These enablers are present today and will continue to grow in
capability and availability in the future. The question remains whether the underlying platforms and devices will sustain the
adaptable and open nature of today’s technology that is fostering such significant innovation and change impacting those
with disabilities.

Acknowledgements

This paper is the outcome of an initial position paper developed for the Envisioning Inclusive Futures Workshop organized by
the Wireless RERC at Georgia Tech. Dr. Sullivan wishes to acknowledge the support provided by the RERC for her
participation in this fascinating workshop and, in particular, thanks Dr. Paul Baker for his encouragement in producing this
paper.

References

Aeppel, T. (2015). It took the telephone 75 years to do what Angry Birds did in 35 days. But what does it mean? The Wall Street Journal, March 13. Retrieved
from the web: http://blogs.wsj.com/economics/2015/03/13/it-took-the-telephone-75-years-to-do-what-angry-birds-did-in-35-days-but-what-does-
that-mean/.
ATIA (2015). What is AT? Assistive Technology Industry Association. Retrieved from: https://www.atia.org/at-resources/what-is-at/.
Bach-y-Rita, P., Collins, C. C., Saunders, F., White, B., & Scadden, L. (1969). Vision substitution by tactile image projection. Nature, 221, 963–964.
Bonnington, C. (2015). In less than two years, a smartphone could be your only computer. Wired. 2.10.15. Retrieved from the web: http://www.wired.com/
2015/02/smartphone-only-computer/.
Bouwman, H., & Van Der Duin, P. (2007). Futures research, communication and the use of information and communication technology in households in
2010: a reassessment. New Media & Society, 9(3), 379–399.
Degusta, M. (2011). Android orphans: visualizing a sad history of support. The Understatement, October 26.
Dickey, R., & Shealey, S. H. (1987). Using technology to control the environment. American Journal of Occupational Therapy, 41(11), 717–721.
Duebendorfer, T., & Frei, S. (2009). Why silent updates boost security. TIK, ETH Zurich, Technical Report 302.
Economist (2015). The truly personal computer. The Economist, February 28. Retrieved from the web: http://www.economist.com/news/briefing/
21645131-smartphone-defining-technology-age-truly-personal-computer.
Greengard, S. (2015). Automotive systems get smarter. Communications of the ACM, 58(10), 18–20.
ITU (2015). ICT facts and figures: the world in 2015. International Telecommunications Union. Retrieved from the web: http://www.itu.int/en/ITU-D/
Statistics/Pages/facts/default.aspx.
Jain, D., Findlater, L., Gilkeson, J., Holland, B., Duraiswami, R., Zotkin, D., et al. (2015). Head-mounted display visualizations to support sound awareness for
the deaf and hard of hearing. Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 241–250)..
Li, Y., Banerjee, T., Popescu, M., & Skubic, M. (2013). Improvement of acoustic fall detection using Kinect depth sensing. Engineering in medicine and biology
society (EMBC), 2013 35th annual international conference of the IEEE (pp. 6736–6739)..
Miller, D. (2013). The Orphans of Android. Wired. Retrieved from the web: http://www.wired.com/insights/2013/01/the-orphans-of-android/.
Myers, B. A., Nichols, J., Wobbrock, J. O., & Miller, R. C. (2004). Taking handheld devices to the next level. Computer, 37(12), 36–43.
Sahasrabudhe, S., & Palvia, P. (2013). Challenges of blind students and IT-based mitigation strategies. AMCIS 2013 proceedings.
Shinohara, K., & Wobbrock, J. O. (2011). In the shadow of misperception: assistive technology use and social interactions. Proceedings of the SIGCHI conference
on human factors in computing systems (pp. 705–714)..
Techopedia. (2015). Evergreen Web Browsers. Retrieved from the web: http://www.techopedia.com/definition/31094/evergreen-browser.
Van Krevelen, D. W. F., & Poelman, R. (2010). A survey of augmented reality technologies, applications and limitations. International Journal of Virtual Reality,
9(2), 1.
Zimmermann, G., Vanderheiden, G., Ma, M., Gandy, M., Trewin, S., Laskowski, S., et al. (2004). Universal remote console standard: toward natural user
interaction in ambient intelligence. CHI’04 extended abstracts on human factors in computing systems (pp. 1608–1609)..
