Image and Practice: Visualization in Computational Fluid Dynamics Research
Introduction
The spread of computer simulation throughout the sciences is well
documented (see, for example, Kaufmann III and Smarr 1993). It has been
subject to a great deal of interest among philosophers, historians and
sociologists (for example, de Landa 2011; Galison 1996; Morgan and Morrison
1999). Understanding simulation is of great importance in how we
understand processes at all scales and in all time frames. It is absolutely
central to the study of contemporary problems such as climate change and
(Strathern 2004, xxi). If one were to ‘zoom in’ from a grander consideration of
science to a more detailed analysis of its sites of accomplishment, a new
terrain would emerge, which would in turn invite scrutiny at new levels. This
is something that my informants at AMCG are well aware of, for modelling is
always a matter of choosing a resolution. Whatever the scale of your grid,
there are always below grid-scale phenomena that are impossible to explicitly
resolve. Likewise, my analysis of AMCG, far from simply ‘filling in the gaps’
missing in grander theories of science, draws the eye to new kinds of
discontinuities: the gaps and leaps in play within investigative practices.
Images, as well as the apparatus of their production and distribution, form a
central aspect of the material assemblages in which these practices are carried
out, assemblages in which these practices gain their rhythm and reason, what
we might, following Hans-Jörg Rheinberger, call ‘experimental systems’
(Rheinberger 1996). In what follows, I aim to trace images through such
Published by Maney Publishing (c) IOM Communications Ltd
the course of being erected, in one and the same movement, blind and guide the
experimenter. In the step-by-step construction of a labyrinth, the existing walls
limit and orient the direction of the walls to be added. A labyrinth that deserves
the name is not planned and thus cannot be conquered by following a plan. It
forces us to move around by means and by virtue of checking out, of groping, of
tâtonnement’. (Rheinberger 1997, 74)
It would be a mistake, however, to suppose that marketing images could be
taken to stand for visualization in general. If marketing images constitute one
specific mode of imagery, very different modes of imagery are essential at the
level of improvisation, right inside the processes that were eclipsed by
marketing. There are many images present within research that do not retain
attention within their own capacities as images, but rather pull it through
them, beyond them towards the emerging future of the ongoing project. Here
any representational capacities are implicit; ‘the Image’ does not appear as a
source of problems, because images are thoroughly embedded in technical
milieux of code, equations, statistics, equipment and procedures, drawn
together around a temporally unfolding research project, a background from
which neither images nor any specifically visual dimension are routinely
extricated or isolated.
There are very good reasons to be wary about equating visualization with
the production of visual sensation, of shapes or colours. Martin Heidegger
captures this, though writing of the ears: ‘Much closer to us than all
sensations are the things themselves. We hear the door shut in the house and
never hear acoustical sensations or even mere sounds’ (Heidegger 2001, 25).
What is significant in moments of investigation is what is seen in the image,
moments in which the image itself disappears, in which what is significant is
less ‘what is seen in the image’, than simply ‘what is seen’, what is
encountered. Mediation is not the issue and thus not the message. Much
closer to the scientist than all images is the problem at hand, the simulation,
the data, a bug in the code, or its potential solution. Present representation is
less of an issue than future realization.
Social theory has in recent years become increasingly concerned with process and becoming, and
in a sense my analysis is indeed intended to take us in this direction. Andrew
Pickering, for example, claims that ‘we live in the thick of things, in a
symmetric, decentred process of the becoming of the human and the non-
human. But this is veiled from us by a particular tactic of dualist detachment
and domination that is backed up and intensified. . . by science as our
certified way of knowing’ (Pickering 2008, 8). Yet this vision of science itself
requires a ‘zoomed out’ view that brackets off the improvisations and
processes through which research itself is carried out, for the insight of
science studies has been that if we look in at the detail of practical research
we see that ‘science is itself caught up in the flow of becoming . . .’ (Pickering 2008,
8).
Emphasis on flows and processes is often achieved through the
construction of a contrast with a world of ontic beings and static structures,
for example between on the one hand the inscription of the marketing image
which presents the model as finalized and on the other hand the research
process that is always ongoing. In the context of talking about anthropological
narrative, but in words that could equally apply to the navigation of research,
Strathern points to a potential naivety here. The danger is that these flows
and processes are imagined as smooth, a smoothness which is an artefact of
the contrast drawn for rhetorical purposes, rather than a feature of the
process itself.
‘Ideas and arguments are often regarded as “flowing”. . . The time it might take to
travel, as the reader moves through the text, gives a kind of experiential unity to
the exercise. Yet this unity or sense of flow or movement is at the same time made
up of jumps over gaps, juxtapositions, leaps — unpredictable, irregular. So, con-
tinuous as the process of narration might seem, the closer we inspect monographs,
paragraphs, sentences, the more aware we are of internal discontinuities’ (Strathern
2004, xxiii).
Scientists at AMCG inhabit a world of the interplay of the continuous and
discontinuous, between the continuity of the Navier-Stokes equations for an
ideal fluid, and the discontinuity of the molecular dynamics that the ideal
fluid overwrites for macro-scale phenomena, between these continuous
equations and the discrete mathematics of the digital computer, for which
they must be discretized if they are to be written into code, and between the
continuities and discontinuities of the numerical approximation and the
necessity of modelling continuities and discontinuities within the simulated
system. Each is implicated, rendering it impossible to idealize continuity as
representing some more authentic essence of process. Discontinuity is not the
antithesis of process, but is endemic within it, folded many times over.
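The discretization just described can be illustrated in miniature. The following is a generic finite-difference sketch for the one-dimensional diffusion equation, not Fluidity's actual finite element discretization; all names and parameter values here are illustrative assumptions:

```python
import numpy as np

# The continuous diffusion equation du/dt = nu * d2u/dx2 becomes a
# discrete update rule: forward difference in time, central in space.
def step(u, nu, dx, dt):
    un = u.copy()
    un[1:-1] = u[1:-1] + nu * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return un

nx = 41
dx = 2.0 / (nx - 1)
nu = 0.3
dt = 0.2 * dx**2 / nu          # explicit scheme is only stable for small dt
u = np.ones(nx)                # background value
u[10:21] = 2.0                 # an initial 'hat' of higher concentration
for _ in range(100):
    u = step(u, nu, dx, dt)    # the hat diffuses into the background
```

The sketch makes the gap visible: the smooth derivative is replaced by differences over a grid, and stability now hangs on choices (dx, dt) that have no counterpart in the continuous equations.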
Images in the depths of the research process are significant in their relation
to its movement, manifest at moments of unpredictable leaps, of irregular
jumps, moments where the ongoing path of investigation is textured by
discontinuity, gaps where what comes next does not automatically follow, but
requires some additional force. Images are fodder for habit and
improvisation, the material to be grasped in order to take these steps.
Somewhat like stepping stones, they are points of hardness at odds with the
ongoing movement; without such material artefacts that movement would not
be able to push onwards towards its future. They are points at which what follows
unfolds. As Bourdieu was well aware, even when the question of ‘what comes
‘The point is not to run it for a week and then check at the end of the week and
find it was wrong after day one, so that is why I keep on checking’. (IW)
Looking at a visualization, scientists can often tell that something is going
wrong, because problems are manifest in the image such that it intuitively
‘looks wrong’. Indeed, it is claimed that non-scientists would have the same
response if erratic behaviour in the model output led to a visualization that
simply did not resemble everyday experiences of fluids.
‘If the water moves through the solid object rather than around it, I know there is
a problem, because that is just not physical’. (AB)
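AB's rule of thumb could in principle be automated. A hypothetical sketch of such a check follows; the function name, array layout and tolerance are all assumptions for illustration, not part of any AMCG tool:

```python
import numpy as np

# Hypothetical sanity check: velocities sampled inside a solid obstacle
# should be (near) zero; any flow 'through the solid' is unphysical.
def flow_through_solid(velocity, solid_mask, tol=1e-8):
    """True if any velocity magnitude inside the solid exceeds tol."""
    speed = np.linalg.norm(velocity, axis=-1)
    return bool(np.any(speed[solid_mask] > tol))

velocity = np.zeros((4, 4, 2))         # a tiny 2D velocity field
solid = np.zeros((4, 4), dtype=bool)
solid[1:3, 1:3] = True                 # a square obstacle in the middle
assert not flow_through_solid(velocity, solid)
velocity[2, 2] = [0.5, 0.0]            # inject a leak inside the obstacle
assert flow_through_solid(velocity, solid)
```

Such a check encodes only the grossest failures; the quote's point is that the trained eye catches them without any such formalization.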
In other cases, however, errors are less obvious and the scientist relies on
long experience in working with this kind of simulation, experience which
conditions the space of practice opened by the visualization, which is often
the place for inclinations to linger, to double check some settings, to run an
mesh, SS felt a good indication that the problem was something to do with
the adaptivity.
That it was a local problem, affecting only a sub-set of elements (a
localized section of the boundary can clearly be seen markedly relaxing its
resolution in Figure 2), raised a strong suspicion that the problem resided at a
deep level: in the parallel processing of the simulation. These finite element
simulations are designed to run on multiple processors by dividing up the
elements within the domain (the triangles in the mesh) so that each processor
handles a small sub-set. It seemed likely therefore that something was
causing just one processor to behave strangely. This pointed SS towards further
pathways: checking that it was not a hardware malfunction by running a
repetition of the test, and adding weight to her suspicion by running it in serial.
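The division of labour described here, elements parcelled out among processors, can be sketched in miniature. This is a generic block partition, not Fluidity's actual decomposition; names and strategy are illustrative:

```python
# Generic sketch of domain decomposition: split the list of element ids
# into contiguous blocks, one per processor.
def partition(elements, nprocs):
    base, extra = divmod(len(elements), nprocs)
    chunks, start = [], 0
    for p in range(nprocs):
        size = base + (1 if p < extra else 0)
        chunks.append(elements[start:start + size])
        start += size
    return chunks

parts = partition(list(range(10)), 3)   # ten elements over three processors
# parts == [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

The sketch also shows why a fault confined to one processor appears as a localized artefact: each processor only ever touches its own contiguous sub-set of elements.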
The investigation heading in this direction took the issue further from
SS's own domain of expertise, and having exhausted her own ideas about what
might be causing the processor to go astray, she took the issue to a weekly
meeting, usually attended by several scientists who specialize in
parallel processing. Having the problem externalized in these visualizations
had facilitated the process of investigation that pointed towards a
parallel processing issue; later, the same images could be brought into the
communal forum to catalyse a dialogue with others, enabling SS to
marshal assistance in finding a way through the technical system towards
the source of the problem, a common tactic within a group with a strong
communal dynamic. Images tip the pathway towards new courses of action,
new tests, new forms of scrutiny, new suspicions. But they also bring
individuals together.
‘It is fairly common to go to others to see if they recognise what is going wrong.
Not just in visualization, but if the compiler fails and someone comes to me and
I recognise the error message and know how to fix it. . . same thing with visualiza-
tion. If it goes wrong and the velocities are pointing downwards I have seen a
similar thing: “I saw this. . . I did this. . . it should work”’ (QS)
It is not pure goodwill that SS relies on for help. There are bugs in all
complex software, some of which can be very difficult to analyse and fix. It is
in all the scientists’ best interests to fix bugs when they appear. The bug’s
Becoming a scientist
The research process does many things. It produces knowledge and generates
complex software systems. Indeed, as simulations become more complex, it
becomes harder to communicate research through traditional media of
scientific publication, leading to the kind of open source model used for
Fluidity, which aims to facilitate the direct dissemination of the scientific
apparatus, along with the results and the discourse upon them. Yet further to
and working with and through a great many different images, working
through them in the same manner and mode, according to the same kinds of
code, the same kinds of physics, and the same kinds of problems, sensitivities
are established. Through these, research is regular, yet this regularity opens
up the space for improvisation, following barely grounded suspicions to
investigate further, suspicions which may well later confirm themselves, but
this confirmation only emerges through having already had to act, to follow a
suspicion when it was no more than a vague feeling. Those that come to
nothing constitute departure points for further exploration while those that
are confirmed shed their vagueness, and establish a retroactive teleology that
collapses the discontinuity of the decision into the unity of the continuing
process. The form of the scientific everyday is not stamped upon it by some a
priori definition of the correct scientific method, but grows instead out of the
steady repetitive processes of research.
investigation, what is encountered (in the image) also exerts its own kinds of
influences on how and what may be realized in that near future, while
traversing a field of practice itself defined by a stratigraphy of such tracings,
its cultivation the intrinsic historicity of research sites. The image, I said, is
among these processes a point at which what follows unfolds, emerging out
of a reflexive entanglement of the process upon itself, a knot that grasps
the future sufficiently to bring it about and to realize the project of
which it was born.
Acknowledgements
Thanks to the organisers of the ‘Visualization in the Age of Computerisation’
conference at the Said Business School in Oxford for the opportunity to
present an early version of this research, and to the Arts and Humanities
Research Council for generous financial support for the project. Thanks also
to Celia Lury and Victoria Newton for their valuable comments.
Note
1 There is a big commercial sector in the provision of simulation services, especially in engineering applications. Companies such as Ansys do similar kinds of work to that done at AMCG, albeit on a more commercial scale, and invest substantial attention in the efficacy of their simulations’ marketing materials (see for example the brochures available at http://www.ansys.com/).
Bibliography
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Trans. Richard Nice. Cambridge: Cambridge
University Press.
Brooks, Jr, Frederick P. 1995. No Silver Bullet: Essence and Accident in Software Engineering. In
The Mythical Man-Month: Essays on Software Engineering, 177–203. Anniversary Edition. Boston:
Addison-Wesley.
Daston, Lorraine, and Peter Galison. 2007. Objectivity. New York: Zone Books.
Deleuze, Gilles. 2004. Difference and Repetition. Trans. Paul Patton. London: Continuum.
Galison, Peter. 1996. Computer Simulations and the Trading Zone. In The Disunity of Science: Boundaries,
Contexts, and Power, ed. Peter Galison and David Stump, 118–157. Stanford, CA: Stanford University
Press.
Heidegger, Martin. 2001. The Origin of the Work of Art. In Poetry, Language, Thought, trans. Albert
Hofstadter, 15–86. New York: Perennial Classics.
Ihde, Don. 1999. Expanding Hermeneutics: Visualizing science. Evanston: Northwestern University Press.
Ingold, Tim. 2001. Beyond Art and Technology: The anthropology of skill. In Anthropological Perspectives
on Technology, ed. Michael Schiffer, 17–31. Albuquerque: University of New Mexico Press.
Kaufmann III, William J., and Larry L. Smarr. 1993. Supercomputing and the Transformation of Science.
New York: Scientific American Library.
de Landa, Manuel. 2011. Philosophy and Simulation: The Emergence of Synthetic Reason. London:
Continuum.
Latour, Bruno, and Peter Weibel. 2002. Iconoclash: Beyond the Image Wars in Science, Religion and Art.
Cambridge: The MIT Press.
Lynch, Michael, and Steve Woolgar, eds. 1990. Representation in Scientific Practice. Cambridge: The MIT Press.
Morgan, Mary, and Margaret Morrison. 1999. Models as Mediators: Perspectives on natural and social
sciences. Cambridge: Cambridge University Press.
Pickering, Andrew. 2008. New Ontologies. In The Mangle in Practice: Science, society, and becoming, ed.
Andrew Pickering and Keith Guzik, 1–14. Durham: Duke University Press.
Rheinberger, Hans-Jörg. 1996. Comparing Experimental Systems: Protein Synthesis in Microbes and in
Animal Tissue at Cambridge (Ernest F. Gale) and at the Massachusetts General Hospital (Paul C.
Zamecnik), 1945–1960. Journal of the History of Biology 29(3) (October 1): 387–416.
Rheinberger, Hans-Jörg. 1997. Toward a History of Epistemic Things: Synthesizing proteins in the test tube. Stanford: Stanford University Press.
Rosenberger, Robert. 2009. Quick-Freezing Philosophy: An analysis of imaging technologies in
neurobiology. In New Waves in Philosophy of Technology, ed. Jan-Kyrre Olsen, Evan Selinger, and Søren
Riis, 65–82. New York: Palgrave Macmillan.
Simondon, Gilbert. 2009. The Position of the Problem of Ontogenesis. Trans. Gregor Flanders. Parrhesia
7: 4–16.
Strathern, Marilyn. 1990. Artefacts of History: Events and the interpretation of images. In Culture and
History in the Pacific, ed. Jukka Siikala, 25–44. Helsinki: Finnish Anthropological Society.
Strathern, Marilyn. 2004. Partial Connections. Updated edition. Walnut Creek: AltaMira Press.
Winsberg, Eric. 2010. Science in the Age of Computer Simulation. Chicago: University of Chicago Press.
Wittgenstein, Ludwig. 2009. Philosophische Untersuchungen/Philosophical Investigations. Trans. G. E. M.
Anscombe, P. M. S. Hacker, and Joachim Schulte. Rev. 4th ed. Chichester: Wiley-Blackwell.
Notes on contributor
Matt Spencer is a PhD student and Visiting Tutor at Goldsmiths Centre for
Cultural Studies in London, UK. His research principally concerns social
scientific and philosophical theories of rationality and representation, as well
as software development and the epistemology of computer simulation. Since
the summer of 2010 he has been undertaking ethnographic research with the
Applied Modelling and Computation Group at Imperial College in London.
Correspondence to: Matt Spencer, email: cu701ms@gold.ac.uk;
mspencer421@googlemail.com.