Schubert - SwitchingWorlds - Englisch (Dragged) 12
65 “Resonating with Martin Heidegger’s (1977) notion that the essence of technology is revealed in its breakdown,
the glitch reveals aspects of the technology, it draws attention to its structure, its opaque quality, the fact that it is
designed and has materiality.”
Cascone, Kim: ‘The Aesthetics of Failure: "Post-Digital" Tendencies in Contemporary Computer Music’,
in: Computer Music Journal 24/4, 2000, pp. 12f.
66 “The post-digital seeks to lift the veil of the technical, to find ways of being expressive using inherent structures,
processes and other affordances.”
ibid.
67 Jenkins, Henry: Confronting the Challenges of Participatory Culture: Media education for the 21st century,
Cambridge, MA: MIT Press, 2009, pp. 9ff.
himself says that no art can be freed from its relationship to its environment. This
is certainly true, but the difference in emphasis can still be noted.
Twenty years after the publication of Cascone's text, it seems reasonable to
rethink the function, implications and connotations of the (digital) error and to
examine it for new potential in light of changed conditions.
The malfunctioning of a system can open our eyes to the mechanisms behind
technology or refine our view of its use in society. The following questions take
centre stage:
Regarding point 1: errors are not only an artistic tool but occur naturally, especially
in complex systems. Justin Hodgson uses the example of two computer system
failures to describe how digital errors can lead us to rediscover otherwise invisible
or unnoticed structures and reflect on their effects:
For example, on July 8, 2015, a technical glitch grounded, for the entire day, all
United Airlines flights in the US. That same day, the New York Stock Exchange and
the Wall Street Journal websites went down as well—also glitch-related. In addition
to demonstrating how digital disasters have the potential to operate with the
magnitude of a natural disaster, these events also revealed the scary reality that
many major corporate glitches and computational mishaps are, as informatics
scholar Zeynep Tufekci explained in “Why the Great Glitch of July 8th Should
Scare You”.68
These are two extreme examples, addressing the fragility of digital systems. This
susceptibility to error goes hand in hand with the weaknesses of capitalist
globalization. The small errors that we encounter in everyday life are, perhaps, even
more revealing. Benjamin Mako Hill uses a faulty ATM to describe how its defect lets
us peek behind the seemingly perfect façade of a banking company:
Anyone who has seen a famous “Blue Screen of Death”—the iconic signal of a
Microsoft Windows crash—on a public screen or terminal knows how errors can
thrust the technical details of previously invisible systems into view. Nobody
knows that their ATM runs Windows until the system crashes. Of course, the
operating system chosen for a sign or bank machine has important implications for
its users. Windows, or an alternative operating system, creates affordances and
imposes limitations. Faced with a crashed ATM, a consumer might ask herself if,
with its history of rampant viruses and security holes, she should really trust an
ATM running Windows.69
68 Hodgson, Justin: Post-Digital Rhetoric and the New Aesthetic,
Columbus: The Ohio State University Press, 2019, p. 6.
69 Hill, Benjamin Mako: ‘Revealing Errors’, in: Mark Nunes (ed.): Error: Glitch, Noise, and Jam in New Media Cultures,
New York: Continuum, 2011, p. 27.
An error can therefore make a closed system – at least partially – comprehensible.
Through the crash, we learn something about the system behind it (in the
example above, the operating system). A look behind the façade of a digital
interface can tell us, firstly, something about the medium itself, as described in
Chapter 2.1.2, and secondly, something about its basic properties, such as its
computational capability. On a more abstract level, it tells us something about
fragility, narration, appearance and control. A closed system can claim something,
represent something. It has a certain authority and power. In the digital world,
these media/interfaces always function as a black box. The underlying program
logic is not physically visible; it can be designed in any way and can therefore
initially only be experienced by the user through its graphical interface (GUI). If a
system is not visible, not understandable, not controllable, then it has a hierarchical
component, which Benjamin Mako Hill summarizes as follows:
As technologies become more complex, they often become more mysterious to their
users. While not invisible, users know little about the way that complex technologies
work both because they become accustomed to them and because the technological
specifics are hidden inside companies, behind web interfaces, within compiled soft-
ware, and in “black boxes”. Errors can help reveal these technologies and expose
their nature and effects. As technology becomes complex, the purpose of technology
is to hide this complexity. As a result, the explicit creation of black boxes becomes
an important function of technological design processes and a source of power.
Once again, errors that break open these boxes can reveal hidden technology
and its power.70
The manifestation of this digital control can have many facets. The malfunctioning
of a digital system can reveal fragilities, power structures, manipulations,
conventions, labels and forms of representation. It is the surface that can be broken.
Recognizing that this surface exists, and that (under certain circumstances)
unreflected trust is placed in it, is the crucial achievement. A black box that had
vanished from view is made visible again, and its structure potentially becomes
recognizable.
70 Hill, Benjamin Mako: ‘Revealing Errors’, in: Mark Nunes (ed.): Error: Glitch, Noise, and Jam in New Media Cultures,
New York: Continuum, 2011, p. 36.