
Our machines are disturbingly lively, and we ourselves frighteningly inert.

(Donna Haraway 1991a: 152)

Haraway, Donna (1989/1991) 'The Biopolitics of Postmodern Bodies', in Haraway 1991b: 203-30.
Haraway, Donna (1991a) 'A Cyborg Manifesto', in Haraway 1991b: 149-81.
Haraway, Donna (1991b) Simians, Cyborgs, and Women: The Reinvention of Nature (London: Free Association Books).

... the Human Body is a Machin of an infinite Number and Variety of different Channels and Pipes, filled with various and different Liquors and Fluids, perpetually running, gliding, or creeping forward, or returning back ... (George Cheyne 1733/1976: 3-4)

Cheyne, George (1715) Philosophical Principles of Religion (1705) 2nd edn (London).
Cheyne, George (1733/1976) The English Malady: or, A Treatise of Nervous Diseases of All Kinds ... (London: Strahan and Leake; repr. New York: Scholars' Facsimiles and Reprints).

"The definition of man's uniqueness has always formed the kernel of his
cosmological and ethical systems. With Copernicus and Galileo, he ceased
to be the species located at the center of the universe, attended by sun and
stars. With Darwin, he ceased to be the species created and specially
endowed by God with soul and reason. With Freud, he ceased to be the
species whose behavior was - potentially - governable by rational mind. As
we begin to produce mechanisms that think and learn, he has ceased to be
the species uniquely capable of complex, intelligent manipulation of his
environment."

"I am confident that man will, as he has in the past, find a new way of
describing his place in the universe - a way that will satisfy his needs for
dignity and for purpose. But it will be a way as different from the present one
as was the Copernican from the Ptolemaic." (Simon, 1965)

Simon, H. A. (1965). The shape of automation for men and management. New York: Harper
& Row

Diffusion of innovations also entails "interaction between life-experience and history" (Erben 1993), and hence theories from sociology may help explain some aspects of diffusion dynamics. For instance, the cultural imperialism hypothesis (Tomlinson 1991) holds that western control of mass media, combined with the human desire to improve one's life, leads consumers in developing countries to imitate those in developed ones.

“As a consumer, I can imagine another way of selling television sets, but I am not sure that it is
practicable. Television could be offered to the public frankly as an experiment. The price of the
receiver would be high and the amount of entertainment at the beginning would be limited, yet such a
conditional invitation is a possibility. The buyer would be warned in advance that after a specific
period the apparatus might be useless; that is, he would gamble; as a pioneer he would take his risk for
the pleasure of being among the first to enjoy the new entertainment. This would make for flexibility
and would give both the engineer and the promoter an additional period for experimentation;
moreover it would remove early television from the dominance of the sponsor, who would naturally
tend to freeze the system at a given level. In a sense there is a choice between giving a limited
guarantee to the public and giving it to the sponsor. Each of these will pay back part of the investment
in television, the public by buying sets and the sponsor by buying time on the air; each will wait upon
the other; and the promoters will have to make delicate decisions." (Gilbert Seldes, 'The "Errors" of Television', The Atlantic, May 1937)

Diffusion of innovation requires an understanding of "people, hardware, software, communication networks and data resources that collects, transforms, and disseminates information" (O'Brien 1996). The diffusion of a technology is negatively related to the disparity between the skills needed to use the technology and the consumer's existing knowledge (Gatignon and Robertson 1985).

O'Brien, J. A. (1996), Introduction to Information Systems, 8th edition, Illinois: Irwin/McGraw Hill.

Gatignon, H. and T. S Robertson (1985) “A Propositional Inventory for New Diffusion Research”,
Journal of Consumer Research, 11, 849-867.


Netflix, Amazon, Sky, Sony and Warner Brothers have joined together
to form a group called ACE, the Alliance for Creativity and
Entertainment, which brings together the biggest names in global
media. ACE’s website states that it “supports cooperation with law
enforcement agencies around the world to advance these measures
and address theft of creative works”.

“The bottom line is that when it comes to piracy, Kodi and its third-party
‘pirate’ addons are so good at what they do, it’s no surprise they’ve
been a smash hit with Internet users. All of the content that anyone
could want – and more – accessible in one package, on almost any
platform? That’s what consumers have been demanding for more than
a decade and a half.

That brings us to the unavoidable conclusion that modified Kodi simply


got too good at delivering content outside controlled channels, and that
success was impossible to moderate or calm. Quite simply, every user
that added to the Kodi phenomenon by installing the software with
‘pirate’ addons has to shoulder some of the blame for the crackdown.”

Who’s To Blame For The Kodi Crackdown?

BY ANDY ON JUNE 11, 2017

https://torrentfreak.com/whos-to-blame-for-the-kodi-crackdown-170611/

In some ways, this is a big win for copyright owners. Shutting down the sources of illegal
streams is a top priority for the UK Federation Against Copyright Theft (FACT), the trade
organisation for the protection of intellectual property.

USERS AS CURRENCY: TECHNOLOGY AND MARKETING TRIALS AS
NATURALISTIC ENVIRONMENTS

DEREK WILLIAM NICOLL

All over the country: in supermarkets, in barber shops, doctors' offices, book
stores, even in the jury room at the municipal court house in San Jose,
California. Well naturally, we have some questions: where do these games
come from? What effect do they have on the players, and what happens
when they appear in our communities?

Video Games: A Public Perspective, by Jack Morton, 1982

“Nature is all that a man brings with himself into the world; nurture is
every influence from without that affects him after his birth. …
Neither of the terms implies any theory; natural gifts may or may not
be hereditary; nurture does not especially consist of food, clothing,
education, or tradition, but it includes all these and similar influences
whether known or unknown.” (Galton, 1890)
Galton, F. (1890) English Men of Science: Their Nature and Nurture, New York: Appleton and Company.

Alexander Pope's prophetic words, “The proper study of Mankind is
Man.”

ACKNOWLEDGEMENTS
This book is dedicated to:

JACKSON POLLOCK (1912-1956) - ARTIST

"Technic is the result of a need - new needs demand new technics - total control - denial of the accident states of order - organic
intensity - energy and motion made visible memories arrested in space - human needs and motives – acceptance."

JOHN BERGER (1926-PRESENT) - INTERPRETER OF ART

"The human imagination. . . has great difficulty in living strictly within the confines of a materialist practice or philosophy. It
dreams, like a dog in its basket, of hares in the open"

More immediately, Acorn plc, DBB BMP Needham, NOP Market Research, members of the
working group on user research, and of course the Cambridge Trialists, all of whom
contributed to, and made possible this book. To them may I add my academic supervisors - Dr.
Hamish Macleod and Prof. James Fleck. I am also in debt to Prof. Alfonso Molina who has
provided valuable insights and inspirations, and to all those who have, and will, study to fuse
technology, craft and the human potential. Last, but most certainly not least, family – Ae, Jane,
William – reference points.

Eminent English economist David Ricardo first raised the machinery question in
1821, that is, the “opinion entertained by the labouring class, that the employment of
machinery is frequently detrimental to their interests”.
Automation anxieties continued to resurface in the 20th century, right along with
accelerating technology advances. In a 1930 essay, English economist John
Maynard Keynes wrote about the onset of “a new disease” which he
named technological unemployment, that is, “unemployment due to our discovery of
means of economising the use of labour outrunning the pace at which we can find
new uses for labour.” But each time those fears arose in the past, technology
innovations ended up creating more jobs than they destroyed, causing the majority of
economists to confidently wave away the machinery question.

There are several central models of social explanation: causal, rational-intentional, and interpretive. It
is sometimes held that causal explanations are inappropriate in social science because they presume a
form of determinism that is not found among social phenomena. Rational choice explanation is
sometimes construed as different in kind from causal explanation, and interpretive analysis is
sometimes viewed as inconsistent with both rational choice and causal accounts. It will be argued in
this part, however, that such views are mistaken. Causal analysis is legitimate in social science, but it
depends upon identifying social mechanisms that work through the actions of individuals. Social
causation therefore relies on facts about human agency, which both rational choice theory and
interpretive social science aim to identify.

https://www.nyu.edu/classes/jackson/causes.of.gender.inequality/readings/varietiessocialexplanationintro.pdf
Rising incomes and consumption, especially in emerging economies. Previous McKinsey research has estimated that over the next decade an additional one billion people will have enough discretionary income to become consumers, mostly from emerging economies. Such global consumption could grow by $23 trillion between 2015 and 2030. This could add over 300 million new jobs around the world.

Aging populations. The study estimates that by 2030 there will be over 300 million more people aged 65 and over than there were in 2014. Caring for aging populations around the world could generate between 80 million and 130 million additional jobs by 2030.

Development and deployment of technology. “Overall spending on technology
could increase by more than 50 percent between 2015 and 2030. About half
would be on information technology services, both in-house IT workers within
companies and external or outsourced tech consulting jobs… By 2030, we
estimate this trend could create 20 to 50 million jobs globally.” While these
numbers are relatively small, they are high-wage occupations including
computer scientists, engineers and IT administrators.
Investments in infrastructure and building. McKinsey estimates that the world
needs to invest an average of $3.3 trillion per year in infrastructure compared
to the current annual spending of $2.5 trillion. These investments could create
from 80 million to 200 million jobs.
Investments in renewable energy, energy efficiency, and climate
adaptation. Such investments could create between 10 and 20 million
additional jobs in a range of new occupations.
Marketization of previously unpaid domestic work. “About 75 percent of the
world’s total unpaid care is undertaken by women and amounts to as much as
$10 trillion of output per year, roughly equivalent to 13 percent of global
GDP.” Most of it is domestic work. Given the rising female labor force
participation, much of this currently unpaid work will require paying for
services, which could create between 50 million and 90 million additional jobs
around the world.

Abstract

1969 was an eventful year. It was the year in which Rollo May, the American existential psychologist, wrote that "depression is the inability to construct a future" (May, 1969, p. 243). But the future was being realised, as in that same year we witnessed men on the moon, the fulfilment of the famous pledge made by U.S. President Kennedy at the start of the decade. But for Jim Johnson, then serving in Vietnam: "It seemed strange - the great amount of human endeavor in technology to put men on the moon and there I was in the middle of a war. Seemed like two opposite ends of human achievement." Other servicemen watched the landing on televisions in the lobby of the Intercontinental Hotel in Bangkok, Thailand.1 While the Saturn V, an unimaginable technical achievement, had more than 3 million parts, the United States was entrenched in other activities, namely a war which it was not winning. A war which illustrated that technological supremacy and the use of computers could not guarantee a win. In his speech Kennedy placed television in a historical timeframe where it sits with penicillin, nuclear power, steam engines, electric power, cars, airplanes, the printing press, even Christianity, viewed as examples of human constructs, examples of ingenuity, courage, ambition or innovation. “The growth of our science and education will be enriched by new

1
http://www.wherewereyou.com/frames/war.html

knowledge of our universe and environment, by new techniques of learning and mapping and
observation, by new tools and computers for industry, medicine, the home as well as the school.” 2

In 1969 the first communications were sent through the ARPANET, and the management authority Peter Drucker introduced the idea of a knowledge society (Drucker, 1969), claiming that knowledge substitutes for labour, materials and capital as the main source of productivity, growth and social inequalities. It was also a year after 2001: A Space Odyssey (1968) had brought to the screen the themes and futures of space exploration, human and technical evolution, existentialism, and artificial intelligence. The film was noted not only for the range of prophetic technologies it featured, and its pathbreaking production and visual allure, but also for its lack of a cohesive narrative or story. In a retrospective, Stanley Kubrick, the director and co-writer of the movie, stated:

“ … each viewer brings his own emotions and perceptions to bear on the subject matter, a
certain degree of ambiguity is valuable, because it allows the audience to "fill in" the visual
experience themselves. In any case, once you're dealing on a nonverbal level, ambiguity is
unavoidable. But it's the ambiguity of all art, of a fine piece of music or a painting -- you
don't need written instructions by the composer or painter accompanying such works to
"explain" them. "Explaining" them contributes nothing but a superficial "cultural" value
which has no value except for critics and teachers who have to earn a living. Reactions to art
are always different because they are always deeply personal … In this sense, the film
becomes anything the viewer sees in it. If the film stirs the emotions and penetrates the
subconscious of the viewer, if it stimulates, however inchoately, his mythological and
religious yearnings and impulses, then it has succeeded.”3

And so it was in the application and use of computers in the Vietnam War:

“If one is to understand the value and potential use of those computer records generated
during the Vietnam War, one must first have some sense of their context - why and how they
were generated - as well as some understanding of the heated controversies concerning their
validity and use.” 4.

In 1969 the Moratorium to End the War in Vietnam was a massive demonstration and teach-in across the United States against American involvement in the Vietnam War. It alarmed the authorities as it came on the back of other protests and demonstrations in Paris and even in the Soviet-held countries. These were anti-establishment and a reaction to poverty, injustice, and consumerism. Some commentators bring to the fore that this was the first generation to grow up with television in their homes. Television had a profound effect on this generation in two ways. First, it gave them a common perspective from which to view the world. Television coverage of the Vietnam War had a huge impact on public opinion in America. It was the first time people saw footage of combat in their homes, and not only combat but more importantly the casualties resulting from it. Seeing American dead and wounded was a real shock; in prior wars such images were rarely released to the public.

Without the ability to imagine possibilities and project ourselves forward we would have no idea what path we need to choose, which path to commit to. We cannot strategise. The present, when we have time to dwell on it given today's competition for attention, seems in comparison boring, or at the very least, a platform to be improved upon. Technology offers us hope; an individual's sense of purpose and meaning in life can affect the shape and use of technology (Pacey, 2001). And as David Nye pointed out, "all technological predictions and forecasts are in essence little narratives about the future. They are not full-scale narratives of utopia, but they are usually presented as stories about a better world to come." They comprise what Anthony Giddens echoed in his notion of the risk society: "a society increasingly preoccupied with the future (and also with safety), which generates the notion of risk." 5

2
https://er.jsc.nasa.gov/seh/ricetalk.htm
3
http://www.visual-memory.co.uk/amk/doc/0069.html
4
Harrison, D.F. “Computers, Electronic Data, and the Vietnam War,” Archivaria, no. 26 (Summer 1988): 20.
5
Giddens, Anthony and Christopher Pierson (1998) Making Sense of Modernity: Conversations with
Anthony Giddens

2001 was prescient in almost all its detailed predictions of twenty-first-century technology. For
instance, in August 2011, the Samsung electronics group began a defence against a claim of patent
infringement by Apple. Who invented the tablet computer? Apple claimed unique status for its iPad;
Samsung presented a frame from 2001.

Everyday life, in its pursuit of a 'better life', is a managed, practical, as well as a 'lived' experience, punctuated by the fantastic and 'other-worldly' prospects offered by television and other media, music, reading a book, etc. This is true if we consider ourselves, say, thinking that walking the dog, or, from now on, buying organic vegetables and drinking fair trade coffee, will help others elsewhere, or that 'having a smartphone will connect me to the world'. Each has win-win symmetries. The pursuit of a better life for all is also the goal of those who organise and provide for us, such as those who enact laws, censorship, or craft devices and applications, algorithms, wallpaper and menu choices, or choose who to reward and punish. For the large part, all these multifarious actions are strategic and future focussed and they constitute 'being connected'. A version of the 'Personal Computer' was offered in the New York Times of November 3, 1962, in an article entitled "Pocket Computer May Replace Shopping List. Inventor Says Device Could Tell Grocery in Advance What Customer Needs." John W. Mauchly, who had been involved in the development of ENIAC, had developed a portable computer and was now claiming he was working on a pocket variety. According to the article, pocket-size computers might eliminate the housewife's weekly shopping list. Electronic communication would tell the store in advance what she needed. She would simply pick up the bundles. Mauchly also predicted the day when a headwaiter could accurately forecast the cocktail a person wanted merely by matching the drinker's characteristics against preferences recorded in his own pocket computer.

What we think about is what we must do in order to achieve goals and aims, big and small, major and minor, short and long term. And so do they, those who own, control, monitor and make sense of the data, and attempt to interpret it in order to create new products, services, policies and other kinds of interventions; thus the reflexive quality of the modern risk society of Beck and Giddens, the idea that as a society examines itself, it in turn changes itself in the process. In some respects this is a macro-version of the task-artifact cycle proposed by John Carroll (Carroll et al. 1991), the idea that tasks and artifacts coevolve in use and development. The task-artifact cycle is an iterative process of continuous, mutually dependent development between task and artifact, a process that will never reach an optimum state. Of course there remain minor excursions, such as when we sit and reflect on present conditions and what led up to them, self-reflection and thought experiments which involve reciprocation and empathy, and arguably these spaces become challenged as media, messages and information from those who seek to manage and organise and 'keep on track' aim to plug the gaps in conscious attention under the relentless rubric of customisation, personalisation, streamlining, on-demand and just-in-time. Clearly, institutions and bureaucracies and certain designs are aimed at slowing down or eradicating certain affordances and possibilities. Policies, laws, regulations and technologies which are not 'backward compatible', do not possess interoperability, or use proprietary standards are aimed not only at protecting interests and ownership but are also open to accusations of stifling innovation and development. The shift to open source and the sharing economy, towards student-managed learning, participatory practices and so forth, is a move to shift ownership and responsibility onto the individual and get them involved.

Downtime and gaps, when all this paternal-like provision is turned off, become reduced, less and less, as people move from screen to screen leaving their valuable data trails, or as 'ambient intelligence' recedes any notion of 'interface' ever more into the background. It comes to the foreground only when there are transgressions to the plan or algorithm. People come to accept their 'measured lives', and the mode and rate of acceptance then, in turn, feeds the system, whether this is further intervention, further actions, innovation, or the switching off of the hearing aid. The danger in the endgame of this is that it is not even digital: there is only one button, not two, and that is 'I agree' or 'like'.

“The individual source of the statistics may easily be the weakest link. Harold Cox tells a
story of his life as a young man in India. He quoted some statistics to a Judge, an
Englishman, and a very good fellow. His friend said, Cox, when you are a bit older, you will
not quote Indian statistics with that assurance. The Government are very keen on amassing
statistics - they collect them, add them, raise them to the nth power, take the cube root and
prepare wonderful diagrams. But what you must never forget is that every one of those

figures comes in the first instance from the chowty dar [chowkidar] (village watchman), who
just puts down what he damn pleases.” (Stamp, 1929; pp.258-259)6

But regardless of our lived experience, the pronounced focus upon the future is what also drives business and design, itself ever more grossly focussed on 'everyday life': its grandiose and grotesque aggregated nature in the minds of politicians needing votes or marketers showing 'impact', what everybody does and truly needs every day, the basics, clean water, food, shelter, etc., and its minutiae and tiny slack, how individuals align with or digress from these standards or norms. It all remains future focussed as it understands its vulnerability in terms of how people react and respond to what it offers, again at scale and statistically significant, and making allusions to how people's tastes and habits change, how morals and ethics dictate or slip, how passing ephemeral beliefs and predilections and aesthetic tastes jostle and adapt. It still means that we often fail to deliver practical and sustainable results as we do not know the mechanisms of change. Technologies and other artefacts, what they do now, what they do for us, how they are used, how they are imagined, how they are depicted, shown, revealed, remain part of the "circuit of technology": uses, artifacts, and social contexts which shape and condition what can be done to improve them, and lay the empirical foundation for changes and adaptations to the look, feel, features, and functionalities – the outcomes which then feed back into the process: "uses, where the instrumental use of the artifact is brought to the fore; technological, where the form of media technology is focused upon; and social, where the structuring of outcomes by social determinants is central." (Dahlberg, 2004) 7 Big data, the A.I. powered by it, and the internet are often portrayed somewhat paradoxically as both integral to the development of contemporary society and yet somehow out of our reach, making us passive spectators while reinforcing the idea of democracy, the notion of voice. And at the same time participatory notions of user and community involvement still resound as mantras which seem to lend credibility to decisions which have already been made by investors or funders. If something 'doesn't fit' the data, what is done is as it always was; it was called 'fudging', getting reality and its interpretation to fit the numbers. The numerical manipulation, the material manipulation, the changing of interface elements, icons or colours, the code, was always easier to manage and predict than a bunch of lay people having their imaginations stirred and promoted, probed by a focus group moderator: 'which of these three designs do you like best?' 'I'm OK with the phone I have, honestly.' 'OK, imagine you had to choose, which of these three designs would you like best?'

It is not that we succeed but rather that we fail, moreover, in predictable ways. The first is by overreliance upon projections and assumptions. The assumption is that tomorrow's problems will be the same as today's; that problems are the ones that we, as developers, designers, planners and business people, face or can comprehend. They are not dynamic; they are not linked to biorhythms, or off-the-cuff gut feelings. Problems in using an interface, its usability, are a lack of understanding on the part of that user – other users are likely to be better informed. There was a lack of representation in this sample or population, let's do another one. "Thomas Edison, for example, reportedly listed 'listening to music' fairly far down on his list of uses to which the phonograph he invented would be put. He thought recording the voices of dying relatives so that offspring could remember what they sounded like was a more likely outcome." (Schnaars and Carvalho, 2004, p. 3) 8

The second is by assuming that, once invented, new technology will invariably be bought and used; and the third by envisioning technological change as consistent, uniformly linear and gradual. We can easily map its trajectories and interrogate this curve to look for windows and opportunities and threats. But challenging this rather linear manner of thinking has been the reality that there has been nothing inevitable about the evolutionary path of personal computers since 1977, and later the internet and digital networked multimedia computing in the 1990s. "The Internet was not created by a sheer act of will … . technical change has a momentum which is often independent of those who appear to control it. They [those who appear to stand behind the technology] are as often forced to cope with the

6
Stamp, J. (1929), Some Economic Factors in Modern Life, King
7
Lincoln Dahlberg; Internet Research Tracings: towards Non-Reductionist Methodology, Journal of
Computer-Mediated Communication, Volume 9, Issue 3, 1 April 2004, JCMC932,
https://doi.org/10.1111/j.1083-6101.2004.tb00289.x
8
Schnaars, Steven P. and Sergio Carvalho, "Predicting the market evolution of computers: was the revolution really unforeseen?", Technology in Society 26 (2004) 1–16.

many unanticipated consequences of technical change as they are able to plan that change.” (Street,
1997, p. 34)

“Not only does the artifact itself, as suggested by the task-artifact cycle, change the task it
was designed to address, but it also has to adapt to other changes in the context of use.
Human organisations are continually evolving in the face of changing circumstances and
wherever humans are involved, there is a constant negotiation and re-negotiation of meaning,
work processes, identities, etc. This poses a problem to information systems as computers
and software can only respond to pre-defined situations or problems in a finite problem space
(Harris and Henderson 1999). Information systems therefore have an inherent disadvantage
when trying to keep up with the constant change of the unpredictable humans. Pre-defined,
rule-based behaviour is not sufficient, and therefore it is only natural that people continually
find their artifacts insufficient for their evolving work practices.”9

That year 48,000 personal computers were sold worldwide – computers aimed not necessarily at glib notions of 'productivity' but moreover at 'curiosity' and the start of a new media form in the shape of games. That same year, however, Ken Olsen, the founder of Digital Equipment Corporation (DEC), one of the largest players in IT at that time, apparently told a convention of the World Future Society: "There is no reason for any individual to have a computer in his home." This statement diffused quickly when Time magazine picked up on it. What was missing, however, was context. No doubt the hobbyists and professional developers of PCs at the time aimed to take up the gauntlet of such a statement, but lifting this sentence from Olsen's talk was misleading. It did not convey that he himself used a "computer in his home"; his family was already playing games on it.

Dourish and Bell (2011) in their study on ubiquitous computing define technological myths as
powerful “organising visions” on how a new technology will fit in the world – that is the creation of
contexts. According to David Nye, the best way to market a new innovation is to “present an
innovation as not just desirable but inevitable.” (Nye, 2006, p.35.) 10 If the investors, senior
management, and consumers believe in the inevitability of the product, it can become a self-fulfilling
prophecy. There is reason to doubt that current claims will prove any more accurate, and charges of
"hype" are not hard to find (Oettinger, 1969; Roszak, 1986).

During the early years of the digital revolution, primarily in the 1950s and early 1960s, a large
segment of public opinion came to see the emergent computers as “intelligent brains, smarter than
people, unlimited, fast, mysterious, and frightening” (Martin 1993: 122). Martin’s contention, based
on a body of poll-based sociological evidence and content analysis of newspapers, is that mainstream
media journalists shaped the public imagination of early computers through misleading metaphors and
technical exaggerations. By contrast, according to Martin, computer scientists attempted to counteract
this narrative and its exaggerations about the new devices (129). As computers moved into the
workplace and into the daily lives of workers in the early 1970s, Martin notes that the myth of the
‘awesome computing machine’ lost part of its credibility, but still affected a large segment of public
beliefs. Nye sees that firms “create not only products but also compelling narratives about how these
new devices will fit into everyday life.” (Nye, op cit, p36) Inventors and investors whose livelihoods
depend on these products often “propose dramatic future scenarios in which their particular device
will become indispensable for the average person.”(p.211) In other words, articles about the future
also had a purpose of creating demand and share many qualities with advertising.

Most of us are cognisant of the distortion imposed by ‘Whig interpretations’ of the history of
innovations – the tendency to view development with the clear unambiguous vision of hindsight. This
provides a misleading impression of a linear progression from one great idea to the next, one great
man to the next, and obscures the paths of development that could have been but were never taken.
Science and technology studies (STS) as a body of research has soberly highlighted the often
convoluted and interwoven innovation paths taken before any kind of stabilised design product comes
to the market.

9
https://www.interaction-design.org/literature/book/the-glossary-of-human-computer-interaction/task-artifact-cycle
10
Nye, David E. (2006), Technology Matters: Questions to Live With. Cambridge: The MIT Press

And so it is with digital services, networks and technology. Beck (1992: 50)11 defined modernization as "surges of technological rationalization and changes in work and organization, but beyond that includes much more: the change in societal characteristics and normal biographies, changes in lifestyle and forms of love, change in the structures of power and influence, in the forms of political repression and participation, in views of reality and in the norms of knowledge. In social science's understanding of modernity, the plough, the steam locomotive and the microchip are visible indicators of a much deeper process, which comprises and reshapes the entire social structure." A general purpose technology (GPT) has been defined as "a technology that initially has much scope for improvement and eventually comes to be widely used, to have many uses, and to have many Hicksian and technological complementarities" (Lipsey et al., 1998, p. 43) 12. Steam, electricity, and information and communications technologies (ICT) are generally accepted as being among the most important examples. Microprocessors and computer code may indeed be aspects of the general purpose device, but what is input and what is output, and how it is input and output, may vary considerably, just as steam or electricity may power factory machinery or domestic cooking. Another aspect of GPTs is their potential to affect the entire economic system and lead to far-reaching changes in such social factors as working hours and constraints on family life.

DEC at its peak in the late 1980s was the number two computer company in the United States, with sales revenues of $14 billion. Digital faltered in the 1990s, however; in 1992 Olsen was replaced as CEO, and in 1998 the company was sold to Compaq (which in turn was bought up by Hewlett-Packard in 2002). Part of the reason for Digital's downfall is often blamed on Olsen's failure to anticipate or understand the burgeoning personal computer market, a failure supposedly exemplified by his having disparaged the PC as something no individual needed to have in his home. Olsen goes down in posterity with the likes of William Preece, who is reputed to have said before a Select Committee hearing in 1879, with regard to the future of the telephone in Britain: "But there are conditions in America which necessitate the use of instruments of this kind more there than here. Here we have a superabundance of messengers, errand boys, and things of that kind. In America they are wanted." 13 Metcalfe's law has often been illustrated using the example of fax machines: a single fax machine is useless, but the value of every fax machine increases with the total number of fax machines in the network, because the total number of people with whom each user may send and receive documents increases. Likewise, in social networks, the greater the number of users with the service, the more valuable the service becomes to the community. Yet Robert Metcalfe himself famously predicted: "Almost all of the many predictions now being made about 1996 hinge on the Internet's continuing exponential growth. But I predict the Internet will soon go spectacularly supernova and in 1996 catastrophically collapse."
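As a rough illustration of this network-effect arithmetic (a minimal sketch of the fax machine example, not drawn from any source in this study), the number of possible sender-receiver pairs among n machines grows as n(n-1)/2, so the value attributed to the network grows roughly quadratically while each additional machine costs roughly the same:

# Minimal sketch of Metcalfe-style network value: with n compatible fax
# machines, the number of distinct sender-receiver pairs is n*(n-1)/2,
# so perceived network value grows roughly quadratically in n.

def pairwise_connections(n: int) -> int:
    """Number of distinct pairs of machines that can exchange documents."""
    return n * (n - 1) // 2

if __name__ == "__main__":
    for n in (1, 2, 10, 100, 1000):
        print(f"{n:>5} machines -> {pairwise_connections(n):>8} possible connections")

One machine yields no connections, ten machines yield 45, and a thousand yield nearly half a million, which is the intuition behind the claim that each new user makes the service more valuable to everyone already on it.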

Thus launched the Internet and its contents in the early to mid-1990s. Firms across industry sectors and wider organisational and regulatory bodies struggled to make sense of, and profit from, what was unfolding, and how they could apply it profitably to their sphere of operation.

The development of IT into the realm of domestic services and entertainment has all the knots, dirt and irregularities of the organically grown carrot. Perhaps it was this that drove the epochal rhetoric of commentators and publicity departments, some of it familiar from previous incursions information technology had made into homes and lives, resurrected under the rallying cry of 'convergence', the buzz word of the mid-1990s. Analysts predicted the computer and the television set would soon merge into one box which would lend new kinds of useful functionality. Enabling this would not only incur changes to domestic routine, but perhaps, moreover, radical changes in organisation and strategy for a plethora of industry sectors, not least advertising, ordering and the delivery of goods and services. However, for most users a dial-up service meant bandwidth which was incapable of conveying video in any satisfactory manner, and it was difficult to imagine a family gathered round the computer at night as it took centre stage in the living room and they fought for control of the mouse.

11
Ulrich Beck (1992). Risk Society, Towards a New Modernity. London: Sage Publications. p. 260.
12
Lipsey, R. G., Bekar, C. and Carlaw, K. (1998a), "What Requires Explanation ?". In (E. Helpman,
ed.), General Purpose Technologies and Economic Growth. Cambridge, Mass.: MIT Press, pp. 15-54.
13
https://geekin9f.wordpress.com/2017/02/15/william-preece-on-messenger-boys-vs-telephones-worst-technological-prediction-of-all-time/

The goal of a digital video on demand (VOD) service was conveying video with a quality comparable to that of magnetic tape video players, but delivered by client hardware distributed over the local cable TV infrastructure. The added value was to be potentially unlimited variety, more than was available in the local video hire shop. The technology and system were being realised by a division of the Cambridge, UK based Acorn Computer plc. Online Media (Om)'s major responsibility was developing a television set top box, basically an Acorn Risc PC motherboard but with video, sound, networking and peripherals such as a remote control, which made it suited to domestic use.

Om was also central in the building of two distinctive but essential consortia, one for the technical networking and the other for content and service delivery. The original intention was to offer a more professional, more polished, less 'contentious' range of networked digital services and video options including home shopping, banking, and online games. These services would be finite and 'less bewildering', safe and 'family-friendly' when compared to the 'wild west' feel of the internet. In order to prove the technology and concept, a technology and marketing trial was also organised by Om; later the trial would also monitor and establish the system's commercial propensity.

Making television digital and interactive was not in itself a novel or unique endeavour; there had already been at least two decades of innovation which manifested in a distinctive range of devices, services and add-ons which allowed manipulation of onscreen data. Most obviously, these included early games consoles, video cassette players and recorders, early PCs, viewdata and videotext. The history of VHS is also intertwined with the boom of video rental shops and new modes of viewing: the recording of films and TV shows straight off the telly. Before this, home ownership of movies just didn't exist.

There had also been concerted efforts from the days of early PCs to stress, through marketing, the 'educational' and 'entertainment' propensities of home computing. Advances in digital multimedia technology and authoring saw many companies across the world trialling new products based upon emerging digital technology and networks. Many solutions proliferated. These included video-on-demand, less sophisticated 'TV internet boxes' and other kinds of solution which used teletext and videotext. The solutions were high-tech and low-tech, some interfacing the television with the open system of the internet whilst others aimed to create 'walled gardens' of selected and curated services and content.

Issues, complications and opportunities arising from the approach and implementation of the Cambridge trial unfolded over an eventful and, at times, turbulent three year period. Monitoring this project over its entire duration provided rare insight not only into the innovation and development process and the growth of a rapidly emerging and evolving digital industry sector, but also showcased the wider issues of how designers come under a wide range of shaping influences beyond the material selection and application of components. The study attempts to highlight why certain paths or directions were taken at the expense of others, often as a compromise or as an opportunist reaction to external offers. Issues arising from these developments affected the technology and the work needed to improve and innovate. The context and the organisation of the trial, not least the way in which it unfolded, continually challenged my own research approaches. It was constantly a moving target, highly dynamic, highly charged, with Om growing in numbers as marketing teams assembled and new expertise was brought on board. Indeed new forms of hybrid expertise were sought out, such as 'graphic designers with HCI knowledge'. The dynamic nature of the growth of the organisation, the twists and turns regarding its strategic orientation, as well as the shifting nature of interfaces and technology, drove the present study from a relatively straightforward and focused attempt to explore the usability – or ease of use – of the new interactive television system and its physical and graphical interfaces and services, to something much larger. What also became apparent was the very nature of the phenomenon we call 'television' itself. Its sheer familiarity, immediacy and centrality, the way it affords us 'presence', absorbs our attention, and provides for our worldview, also suggested a much broader look at what was actually happening, both in design, in the management of design, and in the experience of design. The study was then pushed and nudged to extend to a much broader and more thoughtful exploration of how ease of use as a practice [design] and experience [use] situates within the exigencies and constraints of system building and prototyping. Also how it scaled in terms of social analysis: for instance, how did this set of activities influence, and become influenced by, other events

and happenings within the wider organisation? And finally, I began to ask questions regarding how this is affected by, and yet also affects, the external organisation, the wider partnerships needed to make it happen, including industry and market trends, cultural issues, and the technology and service partners. As such it was an attempt to understand the deeper meaning of the technology, its usability, the sociology of design, and the experiential and organisational aspects of media production and reception through its designers and users. Designers are of course users too, and users can be designers; this was most apparent in the first stage of the trial, where those who took the technology home were the people actually developing it.

My framework of use applied to every development stage of the technology and to everyone who would come into contact with it. The framework came to be described as the contexts of use or contextual usability (CU), an overarching framework aimed at framing the relationship of the phenomenology of use and how habitual usage patterns and tendencies take shape in the various actors who were involved. Various senior managers and designers were interviewed within the company designing and producing the i-Tv technology and interface, as were members of the content firms who participated in the technology and marketing trial. Eleven participant households were also interviewed in an attempt to explore the experience of the technology polyvocally, capturing perspectives of design and production, and consumption and use. The study concludes with an overview suggesting the rich interconnection and interdependent nature of trials, technology, user-consumer research, knowledge production, design, media, designers and organisation. I draw upon Alfonso Molina's notion of Sociotechnical Constituencies to illustrate how the wider and more disparate social, cultural and organisational elements of trials can both rely upon, and at the same time impinge upon, the implementation and interpretation of user and consumer research. It thus maps operational perceptions of the user and the use process that guide, or fail to guide, technology and service development.

Rediffusion developed a "world first" when in 1973 it demonstrated a TV signal
sent over 1 mile by means of a Fibre Optic Cable running in parallel with the
main vision trunk route at Hastings in Kent. The demonstration was a success
and comparison could be made between a signal sent over copper or fibre
optics.

CAMBRIDGE, England - February 10, 1997

Curtis Mathes uniView Products First to Incorporate Acorn's State-of-the-Art Solutions

The Acorn Computer Group plc (London Stock Exchange: GB0000061167) today disclosed details of its cutting edge TV-centric technology, which has
been licensed by Dallas-based Curtis Mathes Holding Corporation
(NASDAQ:CRTM) for integration into the company's forthcoming line of
uniView (tm) interactive television products. Acorn's technology has been
chosen by market leaders like Curtis Mathes for its low-cost and high-
functionality - desirable attributes for a consumer device. Consumers will
soon see the benefits of such technology partnerships through quick and
selective access to information on the Internet through television, and by
being able to choose which services may be accessed through a particular
device.

Using Acorn flashDisplay (tm) technology, the Curtis Mathes uniView products will be able to predict what information a user is likely to need
and periodically save it into FLASH ROM, providing instantaneous access
to that information. For example, a week's worth of television programming
guides or appointments from a personal organiser would be instantly
available when the user requests it.

"Anyone who has ever used the Internet knows how frustrating and time
consuming it can be to wait for information to download," said Tom
O'Mara, Director of Sales for Curtis Mathes. "By integrating Acorn
flashDisplay (tm) technology into our uniView products we will be able to
save users the hassle of entering URLs, waiting for the modem to get
through, and downloading the files. They will have the information they
want, when they want it." With the ever increasing size of data and long waits on the Internet, mechanisms for accessing regularly used information will become increasingly important. This same Acorn flashDisplay (tm) technology can be incorporated into next-generation interactive hardware or PCs to download such things as software updates, browser plug-ins, or even video clips. Practical applications of this technology are widespread. For example, a company may want to pay for their advertisement to be downloaded so that it is seen by the user every time they restart the service.
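The behaviour described in the press release amounts to a predict-and-prefetch cache: content the user is expected to request is fetched in the background and stored locally (in flash) so the interface can serve it instantly. The sketch below is a hypothetical illustration of that general idea only; the class, function names and popularity-based scoring are my assumptions, not Acorn's actual flashDisplay implementation.

# Hypothetical sketch of a predict-and-prefetch cache, in the spirit of the
# flashDisplay description above. Names and the scoring rule are illustrative.
import time
from collections import Counter

class PrefetchCache:
    def __init__(self, fetch, capacity=16):
        self.fetch = fetch            # function: url -> bytes (e.g. a slow modem download)
        self.capacity = capacity      # how many items fit in local (flash) storage
        self.hits = Counter()         # how often each item has been requested
        self.store = {}               # locally cached copies

    def prefetch_likely(self):
        """Copy the most-requested items into local storage ahead of demand."""
        for url, _ in self.hits.most_common(self.capacity):
            if url not in self.store:
                self.store[url] = self.fetch(url)

    def get(self, url):
        """Serve instantly from local storage if possible, otherwise go online."""
        self.hits[url] += 1
        if url in self.store:
            return self.store[url]    # instantaneous access
        return self.fetch(url)        # slow path: wait for the download

def slow_fetch(url):
    time.sleep(0.1)                   # stand-in for modem latency
    return f"contents of {url}".encode()

cache = PrefetchCache(slow_fetch)
cache.get("tv-guide/week")            # first request takes the slow path
cache.prefetch_likely()               # refresh the local 'flash' copy of popular items
print(cache.get("tv-guide/week"))     # now served from the local store

The design choice, as in the press release, is to trade a small amount of local storage and periodic background updates for the elimination of download waits on the items a user asks for most often.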

Utilising Acorn technology, Curtis Mathes uniView products will offer
another feature particularly appealing to parents with young children -
parental block. Using this service, parents will be able to prevent their
children from accessing inappropriate television programming as well as
use the interactive database to help find specific programs based on criteria
such as program genre, specific actor, character or rating. All of this is done
with support from Curtis Mathes' Internet Service Provider, Xpressway
(tm).

"For a UK-based project called Acorn InterTalk 2 (tm), Acorn introduced the world's first Internet product capable of securing information," said
Peter Bondar, Divisional Director of Acorn. "This was developed so that
teachers were able to determine what information their students could
access. The project was extremely well-received, and it made sense for us
to evolve this technology for use in such devices as the uniView television
products."

Curtis Mathes and other leading consumer and entertainment-based companies have chosen to incorporate complete television and Internet solutions based on a range of suitable Acorn technologies:

Advanced RISC Machines (ARM) microprocessor: Originally developed by Acorn, the ARM architecture is low-cost, requires low power consumption, and is small in size, meaning that high functionality products can be brought to market at the prices consumers are used to paying for consumer electronics, rather than at PC prices.

Acorn RISC OS:
Acorn RISC OS is designed to run from ROM, making it ideally suited to
devices that don't require a hard-disk, such as Internet appliances and set-
top boxes. By eliminating the need for local storage and minimising the
amount of DRAM needed, products are kept low cost.

Acorn TV Centric Technology: Acorn's technologies allow high quality text and images to be displayed on television based interlaced displays. The
technology is highly suitable for use with NTSC/PAL configured television
sets through the use of a unique hardware and software solution which
provides anti-aliased fonts for high definition text, anti-twitter software for
stable text and images, and software programmable video resolution, color
depth and scan rates.

About Acorn
The Acorn Computer Group is currently one of the only companies in the
world who can supply everything a customer needs - from hardware to
custom applications. Because Acorn's technology is scaleable, it can be

used across a range of devices, and can be used as "building blocks" for
customised solutions. The company has experience in everything from chip
design to production processes making it the ideal partner for anyone who
wants to get to the market quickly.

Acorn Computer Group plc, head-quartered in Cambridge, UK, is one of the world's leading developers and suppliers of innovative technology
designs and consultancy. Acorn is at the core of some of the world's best
digital technology products.

About Curtis Mathes:


Curtis Mathes uniView set-top products plug into any standard TV and
provide consumers with a host of useful features, including Internet access,
TV program listing, VCR programming, parental controls, telephony,
email, fax and caller ID, among others. Initial set-top versions will retail for
less than $400; uniView devices will also be integrated into a line of Curtis
Mathes premier television sets.

Dallas-based Curtis Mathes is an electronics engineering, design and marketing company. The company also specialises in advanced televisions,
large electronic display systems, as well as other new technologies.

Acorn is a trademark of Acorn Computer Group plc. All other brand names
mentioned are trademarks of their respective holders, and are hereby
acknowledged.

Freeman dates the first factory to 1721: a silk mill, built in Derby, England. In the decade after it opened, Daniel Defoe paid it a visit, declaring it “a Curiosity of a very extraordinary Nature.”

Cambridge Corners the Future in Networking
Volume 05, No.10, Nov 95

Tony Krzyzewski, Managing Director of Kaon Technologies, has recently returned from a trip to Britain where he examined the Cambridge Cable project, an undertaking he describes as the most staggering advance in networking yet. He talks to TUANZ TOPICS about the project and his involvement with it.

What is your connection with Online Media and the Cambridge Cable Project?


I'm consulting to Online Media here in New Zealand. I provide them with advice on
how to implement systems in New Zealand and as a direct result of that we have just
been appointed distributors for ATM Limited, who are the people providing the
backbone cabling for the Cambridge Cable project.

The catalyst for the trip to the United Kingdom to look at the Cambridge project was
some speaking engagements in Asia: October's interactive video conference in
Singapore and December's LAN/WAN conference in Hong Kong.

What is Cambridge Cable?

It's probably the most staggering development in networking I've ever seen.

They are taking networking into the home and delivering video on demand services
(such as watching the news when you come in and want to watch it), home shopping,
education on line, software on line, very soon the World Wide Web on line but doing
all that over a very large ATM based network to the home. There is a massive ICL
video server sitting in the test centre in Cambridge, a set of switches and from there it
distributes out to the home currently at 2Mb/s but then going to 25Mb/s shortly.

WE DON'T HAVE THOSE VISIONARIES IN THE

TELEVISION BROADCAST ENVIRONMENT YET, BUT WE'LL GET THERE.

The project is being run by various organisations. Online Media is part of the Olivetti Group and is very closely associated with Acorn Computers. ATM Limited is supplying the hardware, ICL's in there, Anglia Television provided the broadcast facility and Cambridge Cable is the cable provider. After that initial kernel, and trying it out on about 10 stations run over six months, they've taken it out to 100 stations in the homes and 150 sitting in test labs. A number of other organisations have joined in - Westminster Bank, the BBC and the local education authority. It's being used as a 'nursery' and companies are putting in tens of thousands of pounds to learn how this sort of technology is going to impact on the home.

Why Cambridge?

Cambridge is a breeding ground for networking developments. While I was over there I visited the ATM development labs at Cambridge University. It is a technology breeding centre just like Stanford and Palo Alto in the United States - I treat them as equivalent environments.

You describe it as the most interesting development in ATM networking and yet it
happened in Britain, when may we expect this kind of evolution from the United
States where all the talk comes from?

It doesn't surprise me at all. This is a very common factor and I've seen this in technology development for years; the Americans tend to talk about developments and marketing the developments whereas UK organisations go ahead and say 'right, we've got an idea, let's build the hardware to do it and see if it really works.' It's just that peculiarly English way of taking the gamble and doing it. There's not another trial like it in the world. The Americans are talking about interactive services and are still stumbling along with six or seven station pilots. This is full scale ATM to the home development.

The Cambridge Cable Project is a co-operative venture?

Very much so; it's a learning experience. Everyone realises this technology is going to have an impact. How soon it's going to impact nobody really knows. Companies have to assess what effect it will have on business. Consider an advertising agency - how is a consumer's ability to switch and choose the programmes they watch, at what time they want to watch them, going to affect how advertising works?

This is trialling human interaction and how people will relate to the technology as much as anything. You don't want to produce a computer-like menu to work in the lounge; people will reject it straight away. So the feedback is very interesting.

How will the pilot be turned into a commercial operation?

I think it will be very easy and that's what the pilot is all about; finding out how
much it will cost to run such a system, closely monitoring what people watch and
how they use the system. I think it will be very easy to extrapolate the true
commercial viability of such a project out to a nationwide venture.

So why is this project so exciting for you?

I've been doing nothing but data networking now for eight years and to actually see
someone taking faster networking technology than the average New Zealand
commercial organisation and applying that to something as simple as an ordinary
television type environment...and the ability to drive that out.

Some of the things they were doing at the ATM research labs were just mind-boggling. They had to drag me kicking and screaming out of the place. The total simplification of this sort of technology is just staggering.

ATM has had its critics. Some people say it's not the technology we want but the
technology we deserve to get. What's your opinion?

ATM is going to be the most important development in data networking we've seen
to date because it is the convergence of video, voice, data, monitoring and telephony on
a single network. They have been separate environments and nobody ever designed
anything that was going to inter-relate. Now we've got all these people working on
the same technology (ATM) for the first time. That has to be important.

It's very interesting studying the development of networking. I was one of those responsible for introducing Ethernet to New Zealand and, seeing what happened there, we've become very complacent about data networking. Right now our networks are fairly simple and we're not really using them to the greatest advantage, and people say things can't get any better. It's like PCs - we think we've reached a pinnacle of technology. Well we haven't - we're still scratching round in the dirt.

ATM will bring such dramatic changes to the way we data network. Quite often people think that ATM is just a high speed data network and imagine 155Mb/s big bandwidth scenarios. We're introducing 25Mb/s ATM at $14,000 hub costs this month. But ATM will also be introduced at very low speeds such as 2Mb/s, which is home distribution speed, and we'll start seeing ATM being used as home deployment networks. It's an enabling technology.
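To put those line rates in perspective: ATM carries traffic in fixed 53-byte cells, of which 48 bytes are payload, so a little over 90 per cent of the raw line rate is available before any higher-layer overhead. The short Python sketch below is purely illustrative arithmetic, not anything taken from the trial itself.

    # Back-of-the-envelope ATM payload arithmetic (illustrative only).
    CELL_BYTES = 53       # every ATM cell is 53 bytes on the wire
    PAYLOAD_BYTES = 48    # of which 48 bytes carry user data
    efficiency = PAYLOAD_BYTES / CELL_BYTES   # roughly 0.906

    for line_rate_mbps in (2, 25, 155):
        usable = line_rate_mbps * efficiency
        print(f"{line_rate_mbps:>3} Mb/s line rate -> about {usable:.1f} Mb/s of cell payload")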


The biggest restriction in networking today is bandwidth out of the building. We still think of 'local area network' and 'wide area network' - big bandwidth inside our buildings and very low bandwidth outside our buildings. Eventually what will happen is that there will be no differentiation - you will be on the network.

And if the Internet goes to ATM, all of these very primitive services such as the World Wide Web will become on-demand video services at cost-effective prices. What we pay today for a 2Mb/s link is eventually what we'll pay for a 155Mb/s link.

How will what is happening with the Cambridge Cable Project affect New Zealand?

New Zealand is in a very good situation to develop this sort of thing ourselves because we've never had the big cable television infrastructure in place for years as in the United States and Britain. Unfortunately most of the cable companies seem to be thinking along the same lines as the conventional American system - how many hundred channels can be offered rather than installing digital interactive systems. We don't have those visionaries in the television broadcast environment yet, but we'll get there.

What else interested you in Cambridge?

The Olivetti ATM research laboratory at Cambridge University. They developed the
underlying technology that is running the Cambridge Cable project and they're
already on the next generation of networking. They've got a massive ATM network
deployed with 200 plus stations and have been running it for a year and a half. They
have little tracker badges and when they walk into a room their video telephony
follows them, sound follows them, the computer network follows them and brings
their workstations to them. When they go to a new office all they do is hit the badge and 'teleport': their whole work environment follows them.

There are video cameras located all around the room which are directly linked to the
data network - there is no intervening computer; there are microphones all around the
room again connected into the network and there are video screens which are simply
wall screens - and then there is the computer. The user in the office just says 'I want this camera to give my output to the screen three offices away' or 'I want my sound to come from the office next door' and away you go in a video conference.
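The interview gives no detail of the control software behind this, but the behaviour described can be caricatured as a simple switching request: a user names a camera or microphone as the source and a screen or speaker elsewhere as the sink, and the network connects them. The Python sketch below is a hypothetical illustration only; the device names and the connect() stand-in are invented, not part of the Cambridge system.

    # Hypothetical sketch of 'point this camera at that screen' switching.
    # Device names and connect() are invented for illustration only.

    DEVICES = {
        "cam-office-1": "camera", "mic-office-1": "microphone",
        "screen-office-4": "screen", "speaker-office-2": "speaker",
    }

    def connect(source, sink):
        # Stand-in for asking the network to set up a stream between endpoints.
        print(f"stream established: {source} -> {sink}")

    def request(source, sink):
        """Honour a user request such as 'give my camera output to the screen
        three offices away', after a minimal check of the device names."""
        if source not in DEVICES or sink not in DEVICES:
            raise ValueError("unknown device")
        connect(source, sink)

    request("cam-office-1", "screen-office-4")   # my picture, on a distant screen
    request("mic-office-1", "speaker-office-2")  # my sound, heard in another office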

The head of the development lab has an ATM network in his home with an ATM link back to his
office and an ATM doorbell. You hit the doorbell, it triggers an ATM signal to put the video camera
on the visitor and he can see who's at the front door from his office. It was jaw-dropping stuff.

Author: Anna Wallis


Feature Writer
TUANZ TOPICS
The Acorn Computer Group, the last British
company to have developed its own desktop
computer and operating system, is no more. The
company has been bought by US investment bank
Morgan Stanley and has been broken up to release
its £300m shareholding in hugely successful spin-
off, chip developer ARM.
Loss-making Acorn could not cash in the shares
without incurring a massive tax bill. Morgan
Stanley is using its purchase as a valuable tax loss,
swapping Acorn investors' shares for ARM shares.
Set-top-box maker Pace Micro Technology
acquired Acorn's thin client computing and
traditional personal computer business, including
around 30 staff, for just £200,000. Acorn closed its
desktop computer division last Autumn, apparently
for good.
But its small yet vociferously loyal and enthusiastic
community of users, dealers and developers
eventually persuaded it to maintain a lifeline for the
Acorn computer platform, so its operating system
and graphical user interface, RISC OS, has been
licensed to a non-profit third-party company called
RISC OS Ltd, set up by dealers and developers.
Pace now controls this licence.
At least four companies are currently designing
next-generation hardware for RISC OS
applications.
Acorn produced the BBC Micro, on which many
first-generation UK users learned computing. And,
back in 1983, it designed the ARM RISC processor,
used by Acorn since 1987. In 1990, the ARM design
team was spun out of Acorn to form ARM Ltd and
the latter has since dwarfed its parent as Acorn
fought a losing battle against Microsoft-based PCs
in its primary market - schools.
Last year ARM was floated on the stock market and
at current prices is worth more than £1bn.

Acorn dies but legacy lives on
By Etelka Clark, computeractive.co.uk, 01/07/1999

Fifteen years after its demise, Cambridge UK technology pioneer Acorn Computers remains the most influential business in the innovation cluster’s history, fresh analysis has revealed.

Laid to rest in 1998, Acorn is still generating billions in revenues through its
spin-outs, leading the world with legacy technology innovation and making
millionaires of its major ‘heirs.’

Information told to Business Weekly by technology entrepreneurs Hermann
Hauser and Stan Boland shows that Acorn spinout Element14 is dominating
one market segment perhaps like no other player on the planet.

“While ARM’s market cap and prodigious performance in outmuscling Intel make it the obvious standout company in Cambridge, an argument could be
made for Element14 as the main Acorn spinout. People forget how influential
Element 14 was – its technology powers 90 per cent of broadband to home
content in the entire world.”

Boland, former CEO at Element14 and now in a similar role with Cambridge
wireless startup Neul, confirmed: “Yes Element14 was significant, its
FirePath is the core of BRCM's DSL technology and does power therefore
approximately 90 per cent of home broadband.

“FirePath is a well-tuned wide DSP which can also process scalar instructions. Sophie Wilson originally conceived of it as a Long ARM - 128 bits of execution width but with DSP Instruction Set suitable for multimedia.

“She christened it ALARM but we renamed it FirePath. It got repurposed and made minimal by tight engagement between her, John Redford and Simon Knowles. It also got better at DSL-type processing through ISA modification during its design.”

Boland, who was brought in fairly late in the piece to maximise core value
from Acorn’s splintering empire, adds another jewel to the crown – Acorn lit
the touchpaper to the current set top box explosion.

He said: “We also sold Acorn's set top box business to Pace. I’m not sure
they did that much with it, but it was way ahead of its time and IP set top
boxes are probably one of the hottest categories today. Add ARM’s superb
performance to the Acorn case file and you get a fairer measure of how
influential Acorn has been in the continuing development of the cluster.”

ARM has gone from $1.54 billion market cap in mid-January 2009 to a high
of $24.16bn on October 21, 2013. It now has 2,800 employees, is influential
in every continent and in a vast range of devices and outships Intel with
regularity.

But there is even more ammunition in the Acorn armoury. Dr Hauser reveals
that Acorn supplied technology that helped kickstart the career of a genius –
Stephen Hawking.

Acorn supplied Professor Hawking with his first voice synthesiser, which was
built into a BBC Micro. “That was in 1985-86 – the computer was incredibly
flexible but no doubt Steve took the voice synthesiser element and adapted
it.

“And, of course, the BBC Micro is credited with inspiring the arrival and
sensational global success of Cambridge’s new microcomputer company,
Raspberry Pi. On top of all of this, research by the university’s Centre for
Entrepreneurial Learning at Judge Business School shows that more than
100 successful companies can trace their origin back to Acorn.

“Acorn, Cambridge Consultants and the University are the three main drivers of the
Cambridge Phenomenon. And I would say that in terms of businesses there is no doubt that
Acorn and Cambridge Consultants are the most influential in Cambridge’s history.”

15 November 2013, 14:31, by Tony Quested, Business Weekly

Acorn legacy still earning billions

Acorn, Cambridge Consultants and Business Weekly are all in a list of the
125 most influential Cambridge businesses of all time published by the
Cambridge News to mark its own 125th anniversary. The full list can be
accessed on the Cambridge News website.

Visit to Acorn Online Media, Cambridge

Wednesday 22 May 1996

Present:

Lord Craig of Radley
Lord Flowers
Lord Phillips of Ellesmere (Chairman)

The Sub-Committee visited Acorn Online Media, St Matthew's Primary School and Netherhall School in Cambridge.

ACORN ONLINE MEDIA

Introduction

1. Acorn Online Media (AOM) is a subsidiary of the Acorn Computer Group, which is 45 per cent owned by Olivetti. AOM was established in
1994 to develop products and services for the networked multimedia
market. Presentations were given by Simon Wyatt (Director and General
Manager of AOM) and Nigel Harper (a senior consultant at AOM).

2. Mr Wyatt said that the "killer application" for the information society, in
so far as there was one, was interactivity. AOM's core business was in the
production of intelligent set top boxes for interactive TV applications,
along with associated systems software, network services and consultancy.
A key aim of AOM's activities was to keep costs to the consumer as low as
possible by developing computing and interactive TV systems which used
the televisions people already had in the home. To achieve this, an
important development was graphics software that produced good quality
images using the particular qualities of a TV screen.

3. The AOM set top box could be regarded as an inexpensive computer with a minimum of Random Access Memory (RAM), an operating system
that used Read Only Memory, and a high performance low power
processing chip which did not require a cooling fan. RAM chips were still
an expensive component of the set top box and AOM were attempting to
achieve the maximum range of functions while restricting the RAM
requirements to 4MB. Mr Wyatt said that "programmers have become lazy"
and AOM had had to convince them to write less RAM-intensive versions of word processing and spreadsheet software for use with the set top box and
the Network Computer.

4. Most of the functions of the set top box could be operated using a simple
hand held device similar to a TV remote control, although a keyboard,
mouse, printer and other typical computer accessories could also be added
to increase functionality and extend the potential service usage.

5. AOM had been working with companies such as Digital and various
standards organisations to ensure that the set top boxes would meet current
industry software standards. Content providers had agreed on about three or
four standards for digital movie play back etc. and the AOM set top box
was designed to recognise all of them. The set top boxes also had the ability
to download "applets" (small software applications that preceded the main content and allowed the computer to decode and display what followed) and hold them in RAM while the content was being accessed.
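The report does not say how the box managed these applets within its 4MB of RAM, but the pattern described is a fetch-on-demand applet held only for as long as its content is on screen. A minimal, hypothetical Python sketch of that pattern (fetch_applet() is a made-up stand-in for pulling code over the cable network):

    # Hypothetical sketch: keep a decoding applet in RAM only while its
    # content is being viewed, within a fixed memory budget.

    RAM_BUDGET = 4 * 1024 * 1024      # the 4MB figure mentioned above
    resident = {}                     # applets currently held in RAM
    used = 0

    def fetch_applet(name):
        # Made-up stand-in: would download the applet over the network.
        return bytes(64 * 1024)       # pretend every applet is 64KB of code

    def start_content(name):
        global used
        if name not in resident:
            code = fetch_applet(name)
            if used + len(code) > RAM_BUDGET:
                raise MemoryError("applet would not fit in the RAM budget")
            resident[name] = code
            used += len(code)
        print(f"playing '{name}' with its applet resident ({used} bytes of RAM in use)")

    def finish_content(name):
        global used
        used -= len(resident.pop(name))   # free the RAM once viewing ends

    start_content("movie-preview")
    finish_content("movie-preview")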

6. Mr Wyatt predicted that at some point in the future when digital television broadcasting, digital televisions, and digital programmes had
become established, then the AOM set top box might become an integral
part of the TV set.

Interactive TV and demonstration

7. The Cambridge Interactive TV Trial (iTV) began in September 1994
with 10 "tame" users and a couple of content providers, using the already
well established Cambridge Cable network. The trial was later expanded to
100 real customers and a number of schools. The Cambridge Cable network
was upgraded in the trial areas to extend fibre optic cable as far as the
streetside boxes. Coaxial cable was then used in the local loop to individual
homes. The upgraded network infrastructure with Asynchronous Transfer
Mode (ATM) protocols used throughout now provided full two-way broadband capabilities.

8. The trial was set up as a partnership with each contributing company providing hardware, skills, software, or content as appropriate within their
fields of expertise. After establishing the network infrastructure, the main
activity was the development of services. A "service nursery" was set up as
a safe learning environment for the companies involved to learn from each
other's mistakes and to develop common themes for their content. Some of
the companies taking part were the Post Office, Anglia Television, IPC
Magazines, Tesco, National Westminster Bank, and the BBC. The
Independent Television Commission and National Opinion Polls had also
been active contributors. Mr Harper indicated that AOM's interest in
providing training and help for the content producers was not just altruistic.
It did not take long for the hype over the new technology to disappear and
for people to start demanding content. Interactive TV and consumer
interaction with services was effectively at the mercy of the content
providers. Consumer interaction and the success of AOM was dependent
upon content being up to date, interesting and worth watching.

9. The iTV demonstration featured: online booking of cinema tickets (including choosing one's seat and watching a film preview); personally
focused news, where the user identified areas of interest and these items
were regularly trawled from the various press agencies; indexing of
broadcast material so that the user could jump to specific sections of a
programme; and fast access to Ceefax. Apparently the fast Ceefax service
was one of the most popular: AOM regularly downloaded and stored all of
the broadcast pages of Ceefax on to an online cache which users could then
access rather than having to wait for each Ceefax page to be broadcast.
Banking, shopping, regular television channels, educational material and
other cable services were also available.
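Nothing in the report describes AOM's implementation of this fast Ceefax service, but the gain from such a cache is easy to see: broadcast teletext cycles its pages, so a viewer normally waits for the wanted page to come round again, whereas a server-side copy can be returned at once. A hedged Python sketch of the idea (the page numbers and text are invented):

    # Illustrative 'fast Ceefax' cache: keep the latest copy of every broadcast
    # page so a request never has to wait for the carousel to come round again.

    cache = {}

    def on_page_broadcast(number, text):
        cache[number] = text            # refresh the stored copy as pages go past

    def request_page(number):
        # Served immediately from the cache if we have it; otherwise the viewer
        # would fall back to waiting for the broadcast carousel.
        return cache.get(number)

    # Simulate one sweep of the broadcast carousel (contents invented).
    for number, text in {100: "Headlines", 302: "Football results"}.items():
        on_page_broadcast(number, text)

    print(request_page(302))            # -> 'Football results', with no waiting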

Network Computers

10. The network computer concept was an alternative to the traditional PC and, like the AOM set top box, was designed with simplicity, ease of use and low cost in mind. The main computing power and data storage
requirements were held on a local server and this was accessed over a
network using a simplified Network Computer as the interface device. The
Oracle Corporation was one of the main proponents of network computing
and Acorn Network Computing had recently signed an agreement with
Oracle to develop and provide a reference design for a network computer
which Oracle would then license to manufacturers.

11. Because software and stored information were not tied to a home
computer, they were available anywhere that a network connection could
be made. This was done using a smart card placed in the Network
Computer (or street kiosk etc.) to identify the user and provide access to
their files. The local servers for this required about the same computing
power as modern Internet servers and the computing power may be
optimised by software that learned the habits of its users (e.g. by downloading and caching the football results on a Saturday afternoon).
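The football-results example suggests simple habit learning: count what a user asks for in each time slot and prefetch the most likely item just before that slot recurs. The sketch below is a hypothetical illustration of that idea, not a description of the Acorn or Oracle software.

    # Hypothetical habit-based prefetching: learn what a user requests in each
    # weekly time slot and pre-cache the most popular item ahead of that slot.
    from collections import Counter, defaultdict

    history = defaultdict(Counter)      # (weekday, hour) -> Counter of requested items

    def record_request(weekday, hour, item):
        history[(weekday, hour)][item] += 1

    def item_to_prefetch(weekday, hour):
        """Return the item most often requested in this slot, or None."""
        seen = history[(weekday, hour)]
        return seen.most_common(1)[0][0] if seen else None

    # Simulated history: football results most Saturday afternoons.
    for _ in range(5):
        record_request("Sat", 17, "football-results")
    record_request("Sat", 17, "tv-listings")

    print(item_to_prefetch("Sat", 17))  # -> 'football-results': cache it in advance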

ST MATTHEW'S PRIMARY SCHOOL

12. Carole Macintosh (the Deputy Head of St Matthew's School) and a group of her children demonstrated how they were using computers and the
iTV educational material in the school. Some of the educational software
provided for the trial was now out of date although they were able to
download current material and software from the Internet for use on
traditional PCs. Ms Macintosh said that the iTV and other computing
materials were being used mainly as an additional resource to books etc.
rather than as a complete teaching medium.

13. The iTV system had advantages over the other computers in that a
whole class or group of children could sit around the screen and interact
with the programmes. The children themselves seemed completely at ease
with the iTV technology and were happy to explore and interact with the
features on offer through the system using a remote control hand set.

14. In addition to the iTV machine the school had a range of secondhand
PCs (donated by a parent) and software (e.g. drawing packages) which the
children were encouraged to use during their free time. It was apparent that
a considerable amount of time and effort, by both teachers and enthusiastic
parents, had been put into obtaining and setting up the computers, screening
software and making the computers child-survivable. One example of parental activity was screening games downloaded from the Internet to weed out those with violent scenes. However, the range of suitable material available (real content and not just "froth"), via the iTV system in
particular, could still be much better.

15. The school had conducted a survey of the proportion of pupils with
access to computers at home. This was 70 per cent, although the
"computers" ranged from games consoles to fully functional home PCs.
Children without home access were given extra coaching in keyboard skills
at school.

NETHERHALL SCHOOL

Introduction

16. The Headmaster, Dr Hunter, provided an introduction to the school and its long history of being at the forefront of using computers and testing
educational software. This was followed by demonstrations of the
technology by Alastair Wells (the Head of Information Technology), Mr
Driscoll (from Design and Technology Online) and a number of the A-Level
students.

17. The School had 1495 pupils and 85 staff, split over two sites, which
were linked by their computer use and physically by a fibre optic
connection. The school was a superhighway evaluation centre, and had
piloted the Design and Technology National Curriculum material for the
Department of Education. A major new school building, which would
include ATM technology, was under construction.

Use in the school

18. The school had an internal computer network and five labs linked
together with a broadband connection. However, the amount of actual
computer use was said to be quite small because of the physical and
personnel limits involved: the cost of the computer equipment was still a
major drawback, but the main need for additional funding was to provide
an extra member of staff to develop the use of IT still further. The school
was, however, trying to encourage the use of computing technology across
all aspects of school life, and to all ages, by promoting the purchase of
individual Acorn pocket book computers (based on the Psion pocket book).
These were being made available to students through a nought per cent
financing scheme. Loan computers would be made available to students
who could not afford to buy them.

19. The school network used ATM protocols and they had a number of
large storage devices for networked computer use, including a 480 Gigabyte hard drive for video storage. In terms of practical use the iTV
system allowed users to gather data for school projects and collate video
material which, for example, could be used either in the classroom by the
teacher as a learning resource or by the student at home for preparing
homework assignments. It was said that teachers were naturally experts at multimedia authoring (they do it already when preparing teaching material), and so using the iTV system was an obvious extension of their skills.

20. The school was producing its own multimedia material for distribution
on the iTV network and much of the authoring was being done by the
students. One of the successful experiments so far had been to video a
geology field trip and then place it on the network so that students could
review the trip, compare views with their field sketches and catch up on any
aspects that they might have missed.

Demonstration

21. One of the benefits of a network system that linked to other schools was
that items of expensive equipment could effectively be shared
electronically between sites. A student demonstrated how it was possible to
send the designs for an injection mould over the network to a
computer-controlled milling machine, and then watch the process in operation via a video feed sent back over the network link. In the demonstration the machines were in adjacent rooms, but distance would be irrelevant if suitable network connections existed.

22. The students themselves were very confident about using the network
technology and, in particular, seemed to enjoy the development work
associated with multimedia authoring. Most of the students already had
access to computers at home although few had home access to the Internet.

23. It was said that one teacher had overheard the following in a local pub:
"I'm going to buy my son the Internet for Christmas". Clearly this
demonstrated that the public awareness of the Internet was high, but that
this was not yet matched by a clear understanding of what it was.

24. The school was clearly a centre of IT excellence, with equipment to rival that in some university departments and an exceptionally enthusiastic
and dedicated Head of IT. The school acknowledged that it was fortunate in
being in Cambridge and had benefitted by having significant links to the
local high technology industries. The case for centres of excellence to act as
test beds for other, less fortunate, schools was strongly argued.

Contents

PART ONE - CONCEPTUAL AND THEMATIC ISSUES................................10


CHAPTER 1 – INTERACTION AND TECHNOLOGY.................................................................11
BOOK AND CHAPTER INTRODUCTION.................................................................................................13
THE STRUCTURE OF THIS BOOK..........................................................................................................16
THE VALUE AND SCOPE OF THIS STUDY.............................................................................................19
TELEVISION AND THE NATURE OF THE 'DOMESTIC' AND THE 'EVERYDAY'.........................................20
CONTEXTUALISING, EMBEDDING AND NATURALISING.......................................................................25
TELEVISION.........................................................................................................................................26
INTERACTION......................................................................................................................................30
CULTURES AND SPACES OF PRODUCTION AND CULTURES AND SPACES OF USE.................................34
INNOVATION.......................................................................................................................................37
KNOWLEDGE OF ANTECEDENTS..........................................................................................................41

PRE-HISTORY OF INTERACTIVE TELEVISION.......................................................................................42
THE TECHNOLOGY OF INTERACTIVE TELEVISION...............................................................................45
DIGITAL INTERACTION........................................................................................................................48
THE TELEVISION AUDIENCE - FROM 'VIEWERS' TO 'CONSUMER-USERS'..............................................52
CHAPTER DISCUSSION.........................................................................................................................55
CONCLUSION.......................................................................................................................................61
CHAPTER 2 – THE SOCIAL AND THE TECHNICAL................................................................68
INTRODUCTION...................................................................................................................................69
GENERAL SYSTEMS THEORY AND HOLISM..........................................................................................75
CYBERNETICS, COMPLEXITY THEORY AND CHAOS............................................................................77
THE SOCIAL AND THE TECHNICAL......................................................................................................79
SOCIAL SYSTEMS................................................................................................................................80
RESEARCHING COMPLEXITY...............................................................................................................85
TECHNIQUES OF UNDERSTANDING AND MAPPING SOCIOTECHNICAL SYSTEMS..................................87
APPROACHES, FRAMEWORKS, TOOLS.................................................................................................89
ACTOR-NETWORK THEORY.................................................................................................................93
SOCIOTECHNICAL CONSTITUENCIES....................................................................................................98
THE CONSTITUENCIES OF THE CAMBRIDGE TRIAL...........................................................................106
CHAPTER DISCUSSION.......................................................................................................................108
CONCLUSION.....................................................................................................................................117
CHAPTER 3 – THE CONTEXTS OF USE....................................................................................121
INTRODUCTION.................................................................................................................................122
WHY FOCUS ON USERS?....................................................................................................................123
DESIGN PRESUMPTION......................................................................................................................126
INTERPRETATION AND HERMENEUTICS.............................................................................................132
PRE-UNDERSTANDING.......................................................................................................................135
TEXT.................................................................................................................................................143
USES AND GRATIFICATIONS..............................................................................................................146
CONTEXT..........................................................................................................................................152
USABILITY........................................................................................................................................156
FAILURE TO CAPTURE USERS............................................................................................................159
DESIGN OF EXPERIENCE, INNOVATION IN USE AND CONSUMPTION..................................................163
DRAWING THE COMPONENTS OF THE USE PROCESS TOGETHER........................................................166
USE...................................................................................................................................................169
USABILITY........................................................................................................................................172
USAGE..............................................................................................................................................172
USEFULNESS.....................................................................................................................................174
CHAPTER DISCUSSION.......................................................................................................................174
CONCLUSION.....................................................................................................................................178
CHAPTER 4 – ON METHOD..........................................................................................................181
INTRODUCTION.................................................................................................................................182
TO COLLAPSE 'PUSH' WITH 'PULL'.....................................................................................................182
ANTICIPATING USE............................................................................................................................184
ETHNOGRAPHIC TURN.......................................................................................................................187
INTERPRETATIVE PARADIGM............................................................................................................188
ETHNOGRAPHY.................................................................................................................................191
ETHNOGRAPHY IN HCI RESEARCH...................................................................................................196
RESEARCH WITHIN THE CAMBRIDGE TRIAL.....................................................................................199
SYSTEM-LOGGING.............................................................................................................................200
CONSENSUAL APPROACHES TO RESEARCH.......................................................................................206
CHAPTER DISCUSSION.......................................................................................................................207
CONCLUSION.....................................................................................................................................217

PART TWO - CASE STUDY AND CONCLUSION............................................219

CHAPTER 5 – ACORN AND THEIR TECHNOLOGY...............................................................220
INTRODUCTION.................................................................................................................................221
THE CAMBRIDGE PHENOMENON: THE ENVIRONMENT OF ACORN...................................................223
ACORN..............................................................................................................................................224
THE RISC PC AND THE RISC OS....................................................................................................225
THE RISC PC - HAD IT A MARKET?.................................................................................................227
THE ORGANISATION OF ACORN........................................................................................................229
ACORN ONLINE MEDIA -THE GENESIS OF THE CAMBRIDGE TRIAL.................................................231
1994..................................................................................................................................................233
RISC PC AND THE HARDWARE OF THE STB....................................................................................235
EVOLUTION OF STBS: CONFIGURATIONS OF TECHNOLOGY, EXPERTISE AND LUCK?.......................235
THE DEVELOPMENT OF THE INTERFACE...........................................................................................238
1995..................................................................................................................................................243
THE DEVELOPMENT OF DEMOS.........................................................................................................249
CONCLUSION.....................................................................................................................................250
CHAPTER 6 – THE CAMBRIDGE TRIAL AND SERVICE NURSERY..................................258
INTRODUCTION.................................................................................................................................259
THE CAMBRIDGE TRIAL - THE TECHNOLOGY...................................................................................259
CITVIC AND THE TECHNOLOGY OF THE TRIAL................................................................................261
CAMBRIDGE I-TV TRIAL SERVICES..................................................................................................263
THE DEVELOPMENT AND THINKING BEHIND THE SERVICE NURSERY..............................................265
BUSINESS CASE FOR CONTENT AND SERVICES................................................................................267
IMPRESSIONS OF DEMO.....................................................................................................................269
INTERFACE........................................................................................................................................270
THE CONTENT AND SERVICES...........................................................................................................272
THE SERVICE NURSERY....................................................................................................................274
USER RESEARCH...............................................................................................................................278
THE SERVICES OFFERED ON THE TRIAL............................................................................................279
MAKING THE TRIAL AUTONOMOUS.................................................................................................281
CHAPTER DISCUSSION.......................................................................................................................284
CONCLUSION.....................................................................................................................................288
CHAPTER 7 – ACCESS...................................................................................................................291
INTRODUCTION.................................................................................................................................292
MINIMISING THE CONSTRAINTS OF PRECONCEPTIONS......................................................................293
FIRST CONTACT................................................................................................................................296
SECOND CONTACT............................................................................................................................299
OUTCOME OF PILOTS........................................................................................................................299
BREAKDOWN OF INITIAL ASSUMPTIONS...........................................................................................303
THE ANTICIPATED TRIAL AND THE ACTUAL TRIAL..........................................................................304
2ND VISIT TO OM...............................................................................................................................306
3RD VISIT TO OM..............................................................................................................................307
REQUEST FOR A STAND-ALONE UNIT................................................................................................308
PHASE.1 USER RESEARCH MEETING................................................................................................309
PILOT TEST WITH SET TOP BOX: FEB. - MARCH 1995.....................................................................310
4TH VISIT TO OM: ERIC DONALDSON................................................................................................311
BMP DDB........................................................................................................................................311
FIRST SERVICE NURSERY MANAGEMENT GROUP MEETING...............................................................312
FIRST USER/MARKETING RESEARCH WORKING GROUP MEETING......................................................313
CHANGING ORIENTATION OF THE USER RESEARCH..........................................................................314
SCIENCE MUSEUM............................................................................................................................315
THE AUTONOMY OF THE CAMBRIDGE TRIAL...................................................................................317
SECOND USER/MARKETING RESEARCH WORKING GROUP MEETING..................................................318
NOP QUALITATIVE...........................................................................................................................321
GROUP COMMUNICATIONS................................................................................................................322
2ND USER RESEARCH MEETING..........................................................................................................326

THE SAMPLE.....................................................................................................................................332
THE USER RESEARCH........................................................................................................................335
CONCLUSION.....................................................................................................................................337
CHAPTER 8 – CONCLUSION........................................................................................................343
INTRODUCTION.................................................................................................................................327
ENLARGING USABILITY AS A RESEARCH PROJECT............................................................................330
SITUATING USABILITY AS COMPANY PRACTICE................................................................................331
RECONTEXTUALISING.......................................................................................................................334
IMPACT OF DISCOURSE ON TECHNOLOGY DEVELOPMENT AND PERCEPTIONS..................................337
USER INNOVATION............................................................................................................................339
MECHANISTIC MODELS OF PEOPLE...................................................................................................348
PEOPLE'S LIVES, LIFESTYLES, LANGUAGE AND SYSTEM-LOGGING...................................................350
SUMMARY.........................................................................................................................................356
DRAWING THE SALIENT POINTS TOGETHER......................................................................................361
CULTURES OF USE.............................................................................................................................363
CULTURES OF PRODUCTION..............................................................................................................366
RAISING THE PROFILE OF THE INDIVIDUAL IN SOCIOTECHNICAL CONSTITUENCIES..........................366
THE TRIALISTS..................................................................................................................................378
APPROACH TO USERS........................................................................................................................381
CHAPTER DISCUSSION.......................................................................................................................386
LESSONS FOR FIRMS..........................................................................................................................395
FURTHER WORK, ISSUES, AND IDEAS................................................................................................397
REFERENCES.....................................................................................................................................400
APPENDIX 1- THE CAMBRIDGE TRIAL PARTICIPANTS........................................................................431
APPENDIX 2. TWO TRIALS.................................................................................................................451
APPENDIX 3. SYS-LOGGING..................................................................................................454

Figures
FIG. 1.1 SET OF DUALITIES INDICATING THE EMOTIONAL/PSYCHOLOGICAL TENSIONS AND BOUNDARIES
BETWEEN HOME AND THE 'REST OF THE WORLD' (AFTER DOVEY, 1978).......................................21
FIG. 1.4 A BASIC SCHEMATIC MAP OF THE CAMBRIDGE TRIAL (FROM OM LITERATURE)......................47
FIG. 1.5 THE POSSIBLE VARIATIONS OF MEDIA CASTING........................................................................50
FIG. 2.1 DU GAY ET AL'S "CIRCUIT OF CULTURE" (AFTER DU GAY ET AL., 1997).................................91
FIG. 2.2 SOCIOTECHNICAL CONSTITUENCY WITH TECHNOLOGY AT THE CENTRE................................102
FIG 2.3 MESHING OF CONSTITUENCIES AS THEY DEVELOP, CREATING 'RIPPLES' OF CAUSE AND EFFECTS,
INSPIRATION AND SHAPING...........................................................................................................103
FIG 2.4 HOW AN EPISODE OF INTERACTION WITH THE SYSTEM HAS 'KNOCK-ON' EFFECTS IN 'FRONT OF'
(IN TERMS OF AN INDIVIDUAL'S SOCIAL LIFE) AS WELL AS 'BEHIND' (IN TERMS OF CREATING
WORK, CHANGING INVENTORIES) THE SCREEN.............................................................................110
FIG. 1.2 LEONARDO DA VINCI'S 'PROPORTIONAL STUDY OF MAN IN THE MANNER OF VITRUVIUS' IN
KEMP (1981: P.115)......................................................................................................................127
FIG. 1.3 ROBERT FLUDD'S (1617) 'MACROCOSM AND MICROCOSM' IN UTRIUSQUE COSMI HISTORIA
(REPRINTED IN GODWIN, 1979: P.60)...........................................................................................128
FIG.3.1 THE ELEMENTS OF THE USE PROCESS (ARROWS SUGGEST INTERDEPENDENCY OF USE
ELEMENTS, ARROWS CIRCUMNAVIGATING SUGGEST PROCEDURE OF USE, FROM FIRST USE TO
ESTABLISHMENT OF USE. USABILITY OF THE TECHNOLOGY RESISTS USE)....................................168
FIG. 5.1 THE ORGANISATION OF ACORN CIRCA. 1995 (COURTESY OF THE ACORN COMPUTER GROUP
PLC)...............................................................................................................................................230
FIG. 5.2 THE SOCIOTECHNICAL CONSTITUENCY AT THE GENESIS OF A PROJECT..................................251
FIG. 6.1 THE MEMBER FIRMS OF CITVIC (AS OF 1995) - CAMBRIDGE INTERACTIVE TELEVISION
INFRASTRUCTURE CONSORTIUM (INNER CIRCLE) & THE SERVICE NURSERY (OUTER CIRCLE)
(PICTURE COURTESY OF ACORN OM)...........................................................................................265
FIG. 6.2 THE SERVICE NURSERY: THE PSPS ARE ON THE OUTER RING, WITH THE MANAGEMENT BOARD
CENTRAL SURROUNDED BY THE WORKING GROUPS.....................................................................283
FIG. 7.1 CHART OUTLINING HOW THE ROBUSTNESS OF TECHNOLOGY, THE 'SOPHISTICATION' OF
CONTENT, THE HETEROGENEITY OF TRIALISTS, WAS ANTICIPATED TO EVOLVE OVER SUCCESSIVE
TRIAL PHASES...............................................................................................................................305
FIG 8.1 RESEARCH AND ITS INTERPRETATION AS PROVIDER AND FILTER OF KNOWLEDGE OF THE USER-
CONSUMER....................................................................................................................................337
FIG. 8.2 A CIRCULAR MODEL OF USER KNOWLEDGE PROPAGATION, COMMUNICATION AND
IMPLEMENTATION.........................................................................................................................343
FIG. 8.3 SUMMARY OF INFLUENCES ON PRODUCT DESIGN AND DEVELOPMENT ("T" REPRESENTS A
CENTRAL FOCUS UPON TECHNOLOGY) (AFTER MOLINA, 1987)...................................................358
FIG. 8.4 THE INFLUENCES ON THE DOMESTICATION PROCESS ("H" PLACES THE CENTRAL FOCUS ON
THE HOME, OR RATHER A GIVEN INDIVIDUAL'S PERCEPTION OF THE HOME)................................365
FIG. 8.5 THE SOCIOTECHNICAL CONSTITUENCY OF DOMESTICATING TECHNOLOGY............................369
FIG. 8.6 A HUMAN CENTRED SOCIOTECHNICAL CONSTITUENCY ("I" IS FOR THE INDIVIDUAL, THEIR
PERCEPTIONS, APPREHENSIONS ETC.)...........................................................................................370

Tables
TABLE 1.2 COMBINATIONS OF INFORMATION TRAFFIC PATTERNS (AFTER BORDEWIJK AND KAAM
1986)..............................................................................................................................................48
TABLE 1.1 ANTHROPOMETRIC DATA ADAPTED FROM MIRA (MOTOR INDUSTRY RESEARCH
ASSOCIATION) REPORTS AND THEY ARE REPRESENTATIVE OF BRITISH CAR DRIVERS................130
TABLE. 3.1 CAFFS FROM THE USER-CONSUMER'S AND THE FIRM'S PERCEPTIONS..............................142
TABLE 3.2 THE USE PROCESS...............................................................................................................168
TABLE 4.1 SYS-LOG AND ITS DATA......................................................................................................201
TABLE 4.2 COMPONENTS AND ATTRIBUTES OF CU..............................................................................214
TABLE 8.1 DATA, INFORMATION AND KNOWLEDGE REGARDING THE USE PROCESS............................341
TABLE 8.2 CONSUMER INFORMATION USEFUL ACROSS THE DIFFERENT PHASES OF THE PRODUCT
DEVELOPMENT PROCESS, AND THE RESEARCH METHODS THAT MAY HELP TO ACQUIRE IT.........359
TABLE 8.3 SHOWING GENERAL RESEARCH QUESTIONS ARISING FROM THE INTERACTION OF USE,
USABILITY, USAGE AND USEFULNESS...........................................................................................361
TABLE 8.4 THE ELEMENTS LINKING PRODUCTION, DESIGN, AND PRODUCT WITH CONSUMERS, USERS,
AND THEIR LIFESTYLES.................................................................................................................374

PART ONE
CONCEPTUAL AND THEMATIC ISSUES

Diegesis (/ˌdaɪəˈdʒiːsɪs/; from the Greek διήγησις from διηγεῖσθαι, "to narrate") is a
style of fiction storytelling that presents an interior view of a world in which:

1. details about the world itself and the experiences of its characters are revealed
explicitly through narrative
2. the story is told or recounted, as opposed to shown or enacted.

Diegesis and mimesis (Greek μίμησις, "imitation") have been contrasted since Plato's and Aristotle's
times. Mimesis shows rather than tells, by means of action that is enacted. Diegesis is
the telling of the story by a narrator. The narrator may speak as a particular character or
may be the invisible narrator or even the all-knowing narrator who speaks from "outside"
in the form of commenting on the action or the characters.

Autor et al. defined the tasks involved in each of
approximately 450 occupations in the Dictionary of
Occupational Titles (DOT). The tasks considered were
classified according to one of five types: non-routine
cognitive/analytic, non-routine cognitive/interactive, routine
cognitive, routine motor (manual) and non-routine motor.
Each occupation was given a score for each of the task
measures. The resulting scores were consistent with
expectations. Thus, aggregating the occupations to the 1-
digit level, the highest-scoring task amongst managers was
non-routine cognitive/interactive, and amongst professionals it was
non-routine cognitive/analytic. For technicians/associate
professionals, administrative workers and machine
operators/assemblers, the dominant tasks were routine
cognitive and routine motor skills. Non-routine motor skills
were most strongly observed amongst protective service
occupations, and ‘Handlers, equipment cleaners, helpers and
laborers’. Autor, D., Katz, L. and Kearney, M. (2006). The
polarization of the U.S. labor market. American Economic
Review, 96(2), 189-194.
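As a purely illustrative restatement of that procedure (the scores below are invented, not Autor et al.'s DOT measures), each occupation carries a score per task type, scores are averaged within a broad occupational group, and the group's dominant task is the one with the highest mean:

    # Invented numbers illustrating the aggregation step described above.
    from collections import defaultdict

    occupations = [
        ("Managers",      {"nr-analytic": 2.1, "nr-interactive": 3.4, "routine-cognitive": 1.0,
                           "routine-motor": 0.4, "nr-motor": 0.3}),
        ("Professionals", {"nr-analytic": 3.2, "nr-interactive": 2.0, "routine-cognitive": 1.8,
                           "routine-motor": 0.2, "nr-motor": 0.1}),
        ("Operators",     {"nr-analytic": 0.4, "nr-interactive": 0.3, "routine-cognitive": 2.2,
                           "routine-motor": 3.1, "nr-motor": 0.9}),
    ]

    totals = defaultdict(lambda: defaultdict(list))
    for group, scores in occupations:
        for task, score in scores.items():
            totals[group][task].append(score)

    for group, task_scores in totals.items():
        means = {task: sum(v) / len(v) for task, v in task_scores.items()}
        dominant = max(means, key=means.get)
        print(f"{group}: dominant task type = {dominant}")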

Preface

In 5th century B.C. Greece, Antiphon the Sophist, in a fragment preserved from his chief
work On Truth, held that: "Time is not a reality (hypostasis), but a concept (noêma) or a
measure (metron)." Aristotle, in Book IV of his Physica defined time as 'number of
movement in respect of the before and after'. Time appears to have a direction – the past lies
behind, fixed and immutable, while the future lies ahead and is not necessarily fixed.
Aristotle believes that some things are not in time: the things that are in time are all and only those things that last for a finite length of time. This raises a puzzle. Everlasting things are (on this view) not in time; time does not cause them to grow older or decay. In order to measure time, it is necessary to find an appropriate unit, and Aristotle thinks that the appropriate unit is a certain change: the revolution of the outermost sphere of the heavens. Aristotle claims that time both measures change and is measured by change.14

14 Ursula Coope (2005), Time for Aristotle: Physics IV.10-14.

Augustine ends up calling time a "distention" of the mind (Confessions 11.26) by which we
simultaneously grasp the past in memory, the present by attention, and the future by
expectation. Kant thought of time as a fundamental part of an abstract conceptual framework,
together with space and number, within which we sequence events, quantify their duration,
and compare the motions of objects. Time was designated by Kant as the purest possible schema of a pure concept or category. Henri Bergson believed that time was neither a real homogeneous
medium nor a mental construct, but possesses what he referred to as Duration. Duration, in
Bergson's view, was creativity and memory as an essential component of reality. According
to Martin Heidegger we do not exist inside time, we are time. Hence, the relationship to the
past is a present awareness of having been, which allows the past to exist in the present. The
relationship to the future is the state of anticipating a potential possibility, task, or
engagement. It is related to the human propensity for caring and being concerned, which
causes "being ahead of oneself" when thinking of a pending occurrence. Therefore, this
concern for a potential occurrence also allows the future to exist in the present. The present
becomes an experience, which is qualitative instead of quantitative. Heidegger seems to think
this is the way that a linear relationship with time, or temporal existence, is broken or
transcended. We are not stuck in sequential time. We are able to remember the past and
project into the future – we have a kind of random access to our representation of temporal
existence; we can, in our thoughts, step out of (ecstasis) sequential time.

Activists do not ask ‘why act?’, but rather ‘when?’, ‘where?’ and ‘how?’. These dedicated
and courageous individuals are important partners in the quest for a better, fairer and safer
world. Secretary-General Kofi Annan

"The human operator served as information transmitter and processing device interposed
between his machine's displays and . . . controls" (Lachman et al., 1979, p. 7).

“It is the recipient who communicates. Unless there is someone who hears, there is no
communication. There is only noise. One can perceive only what one is capable of
perceiving. One can communicate only in the recipients’ language or in their terms. And the
terms have to be experience-based. We perceive, as a rule, what we expect to perceive. We
see largely what we expect to see, and we hear largely what we expect to hear. The
unexpected is usually not received at all. Communication always makes demands. It always demands that the recipient become somebody, do something, believe something. It always appeals to motivation. If it goes against her aspirations, her values, her motivations, it is likely not to be received at all or, at best, to be resisted.” (Drucker, 1974, p. 483)15

15 Drucker, P. F. (1974). Management: Tasks, responsibilities, practices. New York: Harper & Row.

“All old-established national industries have been destroyed or are daily being destroyed.
They are dislodged by new industries, whose introduction becomes a life and death question
for all civilized nations, by industries that no longer work up indigenous raw material but raw
material drawn from the remotest zones; industries whose products are consumed, not only at
home, but in every quarter of the globe. In place of the old wants, satisfied by the production
of the country, we find new wants, requiring for their satisfaction the products of distant
lands and climes. In place of the old local and national seclusion and self-sufficiency, we
have intercourse in every direction, universal interdependence of nations.”16

16 Marx and Engels, The Communist Manifesto, as quoted in Bell, D. (1999) The Coming of Post-Industrial Society: A Venture in Social Forecasting. New York: Basic Books, p. xxviii.

"A home is not simply a building; it is the shelter around the intimacy of a life.
Coming in from the outside world and its rasp of force and usage, you relax and allow
yourself to be who you are.
The inner walls of a home are threaded with the textures of one's soul, a subtle weave of
presences...
Where love has lived, a house still holds the warmth.
Even the poorest home feels like a nest if love and tenderness dwell there."
~ John O'Donohue

One of my favorite memories of my childhood was my father helping me set up an aquarium.


In retrospect, I understand that he was teaching me to think about a community of organisms
and their interactions, interdependence, and the issue of keeping them in balance so that it
would be a healthy community. That was just at the beginning of our looking at the natural
world in terms of ecology and balance. Rather than itemizing what was there, I was learning
to look at the relationships and not just separate things.

Homemaking is creating an environment in which learning is possible.

~ Mary Catherine Bateson


Weber's "Calvinism and Capitalism" hypothesis - a discussion of the linkages between Puritanism and the rise of the scientific attitude.

A human brain with its large pre-frontal cortex is regarded as the only brain in the
animal kingdom with the ability to make an abstract sense of reality in all its rich
complexity.17 The opposite could also be said, ‘a human brain with its large pre-
15
Drucker, P. F. (1974). Management: Tasks, responsibilities, practices. New York: Harper & Row.
16
Marx and Engels, The Communist Manifesto as quoted in Bell, D. (1999) The coming of post-
industrial society: a venture in social forecasting New York: Basic Books p. xxviii
17
We begin, perhaps appropriately for a science and technology studies (STS) proponent, with a contentious issue. Bottlenose dolphins have bigger brains than humans (1600 grams versus 1300 grams), but their brain-to-body-weight ratio – encephalisation – is lower, though still larger than that of the great apes. They also possess gangly neurons called Von Economo neurons, which in humans and apes have been linked to emotions, social cognition, and even theory of mind – the ability to sense what others are thinking (they are nonetheless kept captive in zoos and aquariums, and subjected to massive dolphin culling). We don't know, even in humans, what the relationship is between brain structure and function, let alone intelligence. There is also contention regarding the definition of complexity. Currently, there is no precise definition of complex systems. A recent article by a philosopher and a mathematician tried to answer the question “what is a complex system?” [29]. After reviewing and analyzing several definitions previously established by scientists working in complexity science, they listed several properties of complex systems: nonlinearity; feedback; spontaneous order; robustness and lack of central control; emergence; hierarchical organization; and numerosity. These properties are neither necessary nor sufficient for a system to qualify as a complex system, but they are key features that differentiate complex

The opposite could also be said: 'a human brain with its large pre-frontal cortex is regarded as the only brain in the animal kingdom with the ability to make reality overly complex and pin labels on everything.' This allows for the emergence, recognition, criticism and making of things; amongst infinite examples are complex markings and symbols, machines, pictographs on cave walls or on blueprints and YouTube animations, whether your home is tidy, and sewerage and eco-systems. It also allows for advanced planning and decision-making, strategy and tactics, humour, an appreciation of mortality, adaptation to unsuitable environments (e.g. deserts and frozen lands), morality, enhanced connections between neurones, and non-personal comprehension. It is not all good: we are the only creature that shows vulnerability to neuropsychiatric disease, probably because we make reality complex. We are 'wired' to seek out and recognise patterns, and we are wired to interpret what they mean now and in the future. We also recognise patterns where none exist, and some people also see ghosts.

The original single-celled organisms, the bacteria and archaea that were our common ancestors, may not have had brains, but they did have sophisticated ways of moving about whilst simultaneously sensing and responding to their environment, moving to where it was most beneficial and 'comfortable' for them, say in terms of temperature conditions, humidity, or where nutrition was easily found [why do I in my mind's eye think of those little construction-kit robots in a workshop, or bacteria in the corner of the shower cubicle, when I describe this?]. The mechanisms of sense remain intact right through to the evolution of mammals. Traits are favoured which are beneficial – which increase the organism's probability of surviving and reproducing – whilst being heritable so as to be passed on to the organism's offspring (Schaffner and Sabeti, 2008).18 Evolutionary processes not only shaped brains; importantly, they also shaped bodies which, reflexively, fed back into the evolution of the brain and the faculties of sense-making and bodily control, as well as other noticeable features such as protective coloration, thumbs, the ability to utilize a new food source, or a change in size or shape that might be useful in a particular environment. There is the multimodality of social action, evident in the coordinated use of gesture, speech,

systems from other, simpler systems. In Complexity (1992), Mitchell Waldrop defined it as the domain between linearly determined order and indeterminate chaos.
18
Schaffner, S. & Sabeti, P. (2008) Evolutionary adaptation in the human lineage. Nature
Education 1(1):14

body orientation, and gaze. Television nature programmes, or just a walk in the park while attending to what is there, showcase the almost inconceivable diversity of living things, each with their own evolutionary histories which have made them just as they are.

The story of our development, individually and as a species, is one that comes through interaction with dynamic, shifting and changing environments, including human social and cultural circumstances (for instance, Dunbar and Shultz, 2007; Henrich, 2016).19

Our bodies and minds derive ethologically from an incomprehensible number of interactions. Some of these are between biological responses and abilities to adapt over the short (development and other faculties such as acclimatisation) and longer terms (genes and evolutionary processes). Others refer to our abilities to learn,
individually over the course of our entire lives, and as a culture intergenerationally
over the ages. Evolution has meant interacting with both natural and human
fabricated and manufactured environments. These were shaped and fashioned not
only by naturally occurring phenomena, but by emergence (unintended), human
intention (design) and the deliberate biological adaptations of other living things.
With the dawn of agriculture some 10,000 years ago, humans embarked on a new programme: rather than adapting to our environment, we began adapting our environment to meet our needs, slashing and burning forests to create room for agriculture. Once we could grow things more efficiently and more conveniently, closer to where we lived, we began to dissociate from chance and to live more predictably; we had more leisure time, and we developed larger societies which begat a freer exchange of information. People today live in a world marked by material structures and physical, chemical and biological phenomena, such as roads, cities, architecture, and by symbols and representations, road signs, various media, pollution, the clothes people choose to wear, the food they choose to eat, and their behaviours and actions from throwing out waste, to making plastics, and drilling for oil, and
19
Dunbar, R.I.M. & Shultz, S. (2007) Understanding primate brain evolution. Phil. Trans. R. Soc. B 362: 649-658.

mining rare earths.20 They also live in McLuhan’s ‘global village’, potentially having at their immediate disposal the ability to contact almost everyone else on earth.

We have dispensed with the idea that new-borns of all mammalian species enter the world tabula rasa; rather, we now hold that they come already equipped with essential senses, reflexes and learning mechanisms which suit and equip them for further development. These constrain, promote and modulate successful and unsuccessful subsequent interactions. They also do not come into the world alone; in the early stages they may need a little help from parents and others, and so a baby pays special attention to faces, and is quick to identify the face of its mother. Its mouth reflexively closes round the teat to feed, and so forth. Success and failure at doing things become marked for the baby; emotions signal through faces and sounds, and feelings of want, need, joy, attentiveness and neglect become performed and expressed. Babies prefer
the sound of voices to non-speech sounds, and they can distinguish one basic speech
sound from another. Language, repeated utterances, start to be picked out,
recognised, indexed, as they become understood within the context of other
particular sorts of activity - what the philosopher Ludwig Wittgenstein termed a
language game. Language about building makes sense in the context of the activity
of building. Baby talk and playing with hanging mobiles and other toys start to affect preferences and indexes, and as babies engage in such interactions they exhibit a patterning instinct of the body and brain that leads to the learning of language and other symbolic collateral – ‘a’ is linked to the noise ‘ahhh’, which is linked to the printed letter, the word ‘apple’ and so forth, in additive, synthesising fashion. What is interesting is that, at a more macro, social level, we can consider cultures as emergent sets of patterns that are formed from the interactions between people (Henrich, 2016). These patterns can be observed and made sense of by laymen and social scientists alike, but they cannot truly be ‘reverse engineered’ even by the most attentive and intuitive

20
http://science.time.com/2013/12/20/rare-earths-are-too-rare/ From our smartphones to our latest weaponry, the technology that underpins modern life would be impossible without rare earth metals. The importance of rare earths has only grown as emerging markets, and the renewable energy industry, increase their demand for technologies made with them. Now a new study from researchers at Yale has found that many of the materials used in high-tech products, including rare earth metals, have no satisfactory substitutes; the techniques used to extract and refine them are labor-intensive, environmentally hazardous and increasingly costly.

anthropologist or system admin logging, watching and listening. They are limited to
representation; data needs its story to be meaningful.

These patterns derive in a similar manner as did evolutionary changes or the development of the individual – only over different time spans. There is a need to change the interactions between people if you want to change the patterns. This means interventions. A culture is emergent and is the result of millions of interactions, behaviours, artifacts and stories that people build up over time (Henrich, 2011). Cultural evolution was likely a dominant force driving our species’ genetic evolution over the last few hundred thousand years. Through its autocatalytic processes (Chudek & Henrich, 2010), ever accumulating cultural elements may have driven our brain expansion, cognitive specializations (Herrmann, Call, Hernandez-Lloreda, Hare, & Tomasello, 2007), and social psychology (Henrich & Henrich, 2007).21 It is unpredictable and results in surprise, what Henrich terms intergenerational ‘creativity without intention’. Culture includes stories, but it is not a story. This is important because simply changing the story of the organization will not change the culture. Instead there must be the creation of ways for people to interact differently and see what comes of it (Facebook as opposed to email?). Cultural evolution is not predictable and cannot be led to a pre-determined character. You can aspire all you want to a particular future culture, but it is impossible to script or predict that evolution; it is ‘simply’ too complex.

Nevertheless, the prefrontal cortex of the human brain, and/or the environment,
invites us to join a relentless drive to find meaningful patterns in a cornucopia of
sense data, as well as construct categories and explanations and feelings for all that
we apprehend. In all that external sense data that we encounter, we construct, posit,
observe, feel, touch, and hear. That which we focus upon, that which we discover, that which we choose to disregard, starts to coalesce and becomes organised, and yet such structures still remain flexible, held within this complexity in our brains and
21
Henrich, Joseph (2016) The Secrets of Our Success: How culture is driving human evolution, domesticating our species, and making us smarter. Princeton, NJ: Princeton University Press.


minds. We learn about that which seems more stable and permanent in the
environment, we learn about that which seems to occur in cycles or repeat,
appearance and absence. We form habits, patterned behaviours around familiar
objects that perform familiar tasks. Only that which we do not recognise, fail to
notice, are ignorant of, understand or are able to interpret is somehow missed. “It
takes a while for us to learn to solve problems, and then it takes even longer for us to
realize what we don’t know that we would need to know to solve a particular
problem.”22 This is the practical object of science and engineering, to address that
which we do not know.23 – either how something ‘fits’ [other pre-existing substances
or machinery or codes, systems, processes etc.] or having a go to define just what
something ‘is’ [whatever lends it characteristics, features, affordances, etc.]. This is
the realm of invention and innovation.

It all becomes dependent upon, or part of, an overall gestalt which is the evolving,
conscious mind, that amorphous term we call experience. Unlike the once dominant views in a reductionist science, we cannot experience or make sense of things presented to us isolated, frozen, atomised, and on their own.24 Dependent variables are always subject to the interaction effects between independent variables in the real world.25 Quantitative accounts of complex and contingent causal processes in the social world often proceed such that:

“… measurements are taken on an extensive range of variables as they co-vary in the real world, as opposed to in experiments where simple abstractions from the world serve as the basis for the production of ‘laws’…
22
https://www.edge.org/conversation/mary_catherine_bateson-how-to-be-a-systems-thinker
23
Stanford (2001, 2006) argues that the most serious challenge to scientific realism is posed by what he calls the problem of ‘unconceived alternatives’. He sees the historical record of scientific inquiry itself as offering abundant evidence of the repeated failure of scientists and scientific communities to even conceive of fundamentally distinct alternatives to extant theories that were nonetheless both scientifically serious and reasonably well-confirmed by the evidence available at the time. He sees the historical record as giving “us every reason to believe that there are probably such unconceived alternatives to even the most successful scientific theories of our own day.”
http://faculty.sites.uci.edu/pkylestanford/files/2016/10/UnconceivedAlternativesandConservatism.pdf
24
“…the claims, methods and results of science are not, or should not be influenced by particular
perspectives, value commitments, community bias or personal interests, to name a few relevant
factors. Objectivity is often considered as an ideal for scientific inquiry, as a good reason for valuing
scientific knowledge, and as the basis of the authority of science in
society.” https://plato.stanford.edu/entries/scientific-objectivity/
25
Interaction is what happens in applications of the general linear model when the effects of multiple
variables are not additive.

the general practice has been … to write interaction terms into the general
linear model and ignore the complexity that they indicate. This always made
me worry. Linearity and order seemed to be being forced on a world which
isn’t really like that, but I didn’t have a vocabulary for doing more than
worry.” (Byrne, 1998; p.3)

Although with increased computer power and increased amounts of data drawn from multiple sensors and sources, the predictive value and veracity of outcomes, including the analysis of interaction effects, becomes more feasible and accurate. We can measure, weigh and count things in the study of substance or structure, but can we measure or weigh patterns? We cannot; to study patterns we must map a configuration of relationships. That is, structures involve quantities whereas patterns involve qualities. The idea of measuring things to chart
progress towards a goal is commonplace in large organisations. Governments tot up
trade figures, hospital waiting times and exam results; companies measure their
turnover, profits and inventory. But the use of metrics by individuals is rather less
widespread, with the notable exceptions of people who are trying to lose weight or
improve their fitness. Most people do not routinely record their moods, sleeping
patterns or activity levels, track how much alcohol or caffeine they drink or chart
how often they walk the dog. Material things and intangible ideas or symbols are
only understandable by the context in which they are situated, or in which they are
used.26 They are only relevant communicated as stories. As such our position is much
more like the ethnomethodologists who hold a particular view of context and the
examination of it. Central to this view is indexicality or the notion that the
intelligibility of language is tied to the circumstances for its use (Maynard &
Clayman, 1991, p. 392).27 Or Wittgenstein realism as positing the independent
existence of objects, states of affairs, and facts. In the Philosophical investigations:
“For a large class of cases of the employment of the word ‘meaning’ - though not for
all - this word can be explained in this way: the meaning of a word is its use in the
language” (PI 43).

26
Demonstrating the importance of context and how meaning is not inherent in particular words or
symbols but is connected to the context in which they are uttered.
27
Maynard, D. W., & Clayman, S. E. (1991). The diversity of ethnomethodology. Annual Review of
Sociology, 17, 385-418.

BEHOLD FRIEND – THE TOWER!!!!!!! THE IMAGE OF THE TOWER CARD IS POWERFUL, DEPICTING A SOLID, STRONG TOWER BEING STRUCK BY LIGHTNING AND FIRE, AND MY OPPONENT CRAWLING OUT FROM THE SMALL WINDOWS AT ITS TOP. THE TOWER IS COMMONLY INTERPRETED AS MEANING DANGER, CRISIS, DESTRUCTION, AND LIBERATION. IT IS ASSOCIATED WITH SUDDEN UNFORESEEN CARNAGE ONCE THOSE MARVELOUS REINFORCEMENTS KICK IN.

Not only have we evolved such faculties as language [as a system] at the species
level, individual human new-borns come equipped with learning mechanisms that
allow them to change rapidly over their childhood development. As this continues
throughout entire lives, these mechanisms and abilities to take on and apply new
knowledge help them interact with increasing efficacy with their world, or cope with the world, even if that world is unlike the one their distant ancestors faced, or even if they are compromised by the loss of ability, limb or senses. Being alive, unless one is unfortunate enough to suffer from locked-in syndrome, does not entail remaining static, and certainly entails aging and loss of faculty. Those changes, these
experiences, whatever they are, are built on the unique neural structure that already
exists, each structure having developed over a lifetime of unique experiences.
Steven Rose pointed out in The Future of the Brain (2005) that a snapshot of the
brain’s current state might also be meaningless unless we knew the entire life history
of that brain’s owner – perhaps even about the social context in which he or she was
raised. But it is also about the present. This is why, as Sir Frederic Bartlett demonstrated in his treatise Remembering (1932), no two people will repeat a story they have heard in the same way and why, over time, their recitations of the story will diverge more and more. No ‘copy’ of the story is ever made; rather, each individual, upon hearing the story, changes to some extent – enough so that when asked about the story later (in some cases, days, months or even years after Bartlett first read them the story) they can re-experience hearing the story to some extent, although not very well.

But an opposite is also true, in that myths, those personal, shared and cultural artefacts, are stories which coalesce and draw together varied outcomes arising from

interpretative elasticity. They are the resilient and enduring stories which act as an informational glue and to which most people in a shared cultural setting can relate.
The symbolic interactions between parents, siblings and a baby are themselves a
result of culturally held ‘ways of seeing and doing’. 28 All great mythological
creations are an attempt to describe, at the level of common psychological sense, the
entire dramaturgy of our inner and social lives (Goffman, 1955). And interestingly,
they typically form the basis of children’s stories, the ones that would be read to
children at bedtime (Egoff, 1988). 29
Fantasy stories trace their roots back to far older
tales: the myths and legends of various cultures, which grew from oral storytelling in
the days when myths were the only explanation for the mysterious workings of the
real world beyond the grind and toil of everyday life. But the meanings evoked by some stories and myths show remarkable degrees of similarity across religious stories and myths from around the world. In Leeming's Mythology: Voyage of the Hero, following Joseph Campbell, the author suggests that many myths follow an eight-stage pattern that reflects a heroic way of dealing with universal issues in the human lifespan, from the miracle of conception and birth through resurrection and ascension.
“On one hand, the themes and characters of myth have enthralled audiences
for hundreds or even thousands of years, and they are likely to retain their
appeal for many generations to come. On the other hand lurks the problem of
creativity: how can a writer come up with new variations on stories that
already exist in hundreds of different versions? In the present day, when
readers place great emphasis upon originality, fantasy stories distinguish
28
Keesing and Strathern in Cultural Anthropology: A Contemporary Perspective (1998) assert two
very different ways in which the term culture is used - The socio-cultural system or the pattern of
residence and resource exploitation that can be observed directly, documented and measured in a
fairly straightforward manner. The tools and other artefacts that we use to create communities, the
virtual environment we create and the way we create, distribute and utilise assets within the
community. These are teaching cultures that are aware of the knowledge that needs to be transferred
to the next generation and which create training programmes. They are characterised by their
certainty or explicit knowability - Millington does pick up on this with a reference to Schein's third
layer of culture: tacit assumptions, or the deeply entrenched patterns of thought and perception that
drive behaviour. The group (here, a company, department or team) has learned these assumptions in
the course of solving historical problems. Culture as an “…ideational system. Cultures in this sense
comprise systems of shared ideas, systems of concepts and rules and meanings that underlie and are
expressed in the ways that humans live. Culture, so defined, refers to what humans learn, not what
they do and make”. This is also the way in which humans provide “standards for deciding what is, …
for deciding what can be,…. for deciding how one feels about it, … for deciding what to do about it,
and … for deciding how to go about doing it. Such cultures are tacit in nature: networked, tribal and fluid. They are learning cultures because they deal with ambiguity and uncertainty originating in the environment, or self-generated for innovative purposes.
29
Egoff, Sheila A. Worlds Within: Children's Fantasy from the Middle Ages to Today.
Chicago: American Library Association, 1988.

themselves by the degree to which the author employs or abandons the
conventions of mythology.”30

Narratives (stories) represent a particular way of constructing knowledge that comes naturally to us. One writer has even gone so far as saying "Remembering is narrative; narrative is memory." The Hellenic leap
into abstraction is still worth revisiting and still functions, probably through its
legacy to enlightenment thinking, and even perhaps its mythology, as either ‘machine
code’ or ‘operating system’ for the intellectual basis of what people and businesses
value today. Myth itself, according to Joseph Campbell, represents the human search
for what is true, significant, and meaningful. He says what we are seeking is “…an
experience of being alive…so that our life experiences…will have resonances within
our own innermost being and reality, so that we actually feel the rapture of being
alive”.31 Jung’s therapeutic approach was working with a patient’s dreams and
fantasies. To be of help, then, it is imperative to have knowledge of the details of the
patient’s life as well as knowledge of “…symbols, and therefore of mythology and
the history of religions”. Hillman further insists that he does not view the pantheon of
gods as a 'master matrix' against which we should measure today and thereby decry
modern loss of richness. Like Freud, the psychologist Carl Jung also took myths
seriously. Jung believed that myths and dreams were expressions of the collective
unconscious, in that they express core ideas that are part of the human species as a
whole. In other words, myths express wisdom that has been encoded in all humans,
perhaps by means of evolution or through some spiritual process. For Jungians, this
common origin in the collective unconscious explains why myths from societies at
the opposite ends of the earth can be strikingly similar. In its search for common
patterns across mythologies, Jungian psychology is like a form of Structuralism.32

Much of what we do value today still comes through our engaging with the socio-
30
Berry, D. (2005) The Treatment of Mythology in Children's Fantasy The Looking Glass : New
Perspectives on Children's Literature Vol 9, No 3
https://www.lib.latrobe.edu.au/ojs/index.php/tlg/article/view/29/34
31
Campbell, J. (1988). The Power of myth. New York: Doubleday p.5
32
Eric Csapo, Theories of Mythology, Wiley, 2005
Bruce Lincoln, Theorizing Myth: Narrative, Ideology, and Scholarship, University of Chicago Press,
1999
Andrew Von Hendy, The Modern Construction of Myth, Indiana University Press, 2002
Segal, Robert A. "Introduction." Jung on Mythology. Princeton: Princeton University Press, 1998, p. 4

026463 47
cultural, material, financial, media and natural worlds amongst others, but also comes
via our connected devices, including government services, education, health and
personal health management, and of course social engagement with other people via
‘social media’ and buying and selling online. Whether we call these structures or worlds does not matter; the point is that we ‘navigate’ these worlds using different
roles ascribed personally, or placed upon us socially – dad, team member, employee,
user, consumer, citizen and so forth. These devices and roles aim to afford us
freedoms, and lend us constraints, which we, ourselves, can identify with depending
upon our roles. Our mobile devices for instance free us from reliance upon precedent
media devices such as radio and television as sources of news, general information
and entertainment. They aim to intermediate and organise all other engagements,
material, social, financial, etc. and free us from our workshops, desks, our homes,
our shops, government offices. Being classed as personal devices, smartphones extend their use and function into, amongst other apps, camera, calendar, alarm clock, torch, map and so on, as well as innovative and unique applications including augmented reality. The latter lets us roam freely and, as we move in the world, to envisage it now tagged or embellished in meaningful ways. There are echoes here of
fundamental evolutionary principles in that ambulatory movement is enabled, and
also the media ecology of everyday life is expanded. Television viewing or search
engine use is no longer confined to a geographic location – i.e. living room or desk –
it now is anywhere, anytime.

Since the enlightenment we may or may not be 'wired' to believe in God or a higher power anymore, but today we are asked to believe in algorithms. These sit alongside Higgs bosons and gene science, but we are social animals who have an
evolutionary need to feel connected to the world and to others. The material, physical
world, its places, its contents, devices, objects and people, now tagged, all blend, link
and mash, they become encyclopaedic or informationally transformed in some other
way in order to locate, inform or entertain. From ancient times we have projected
human thoughts and feelings onto other animals and objects, and even natural forces,
and this tendency is a fundamental building block of philosophy, myth and religion.
Are today’s cityscapes, geopolitical situations and technical, business, media and

natural environments echoes of Socrates’ postulate of an ‘ideal world’, or is our
reality still merely a pale ghost-like shadow of it on Plato’s cave wall?

Whilst the reading of the classics may have waned in mainstream education, the relevance of Greek thought in both the development and core notion of a peculiarly Western Civilisation needs no subtext (see for instance the recent intellectual histories depicted by Hazard, 1935/2013; Israel, 2013; Gottlieb, 2016).

We can argue that this idea, and its acceptance or challenge through time, contributes to an explanation of an incessant hunger to sharpen, hone, improve, streamline, innovate, make more agile, and any of the other verbs which suggest ‘progress’. Plato criticized Thrasymachus’s philosophy as “pleonexia” – a condition of insatiable appetites, whereby the satisfaction of one’s own interests assumes the status of a highest good. The burden of pleonexia, for Plato, is that it can never be satisfied. Like the torture of Tantalus, the subject of another myth, and like the purgatory of zealot Christians, it goes on forever teasing, hinting and seducing to no avail nor end. There is no amount of money that ever constitutes “enough.” There is no amount of public praise, Facebook likes or YouTube subscribers that could ever satisfy. Pleonexia is a Greek word often translated as greed, but better understood as a pathological condition: the insatiable desire to acquire, accumulate or obtain more of anything, whether money, property, power or what have you. It is the imbalanced state of never being satisfied.

The drive to make more profit, or to make things fiscally or functionally more effective or efficient, may be elevated to the status of an ‘ideal world’ – precisely what would this state of affairs or reality look like to a policy-maker, designer, producer or marketer if it were indeed ‘ideal’? What is an ‘ideal world’ to a citizen, user, or consumer? How would we gauge or judge the veracity of idealness? Xenophanes argued that if animals could paint, they would depict gods in animal shape; do designers, producers or governmental committees depict their (self-selecting, self-satisfying) ideal worlds, or the ideal world of any one of their consumer-user-citizens?

It was Aristotle who sought out purpose (though not divine purpose) throughout the living world. As the most successful empirical investigator of his day, he developed formal logic from scratch and served as the original biologist, probably the most influential investigator until Darwin. And there is also Plato, who in his 'Timaeus' puts forward his account of cosmology, for many centuries the bible of natural science. Referring to the Master-Craftsman and his works, Plato wrote that 'the world is the fairest of creations, and he is the best of causes'. The 'Timaeus' insisted that the order of the universe and the meticulous planning that seemed to lie behind it pointed to a higher and rational purpose of some sort, so the 'Timaeus' later suited theological ideas much better than the blind, mechanical universe of Democritus, Epicurus, and Lucretius, and, much later, of Richard Dawkins and
evolutionary biology. Almost every explanation offered in the dialogue returns to the
notion of purpose, and by extension use.

In the design of our modern devices – the physical machinery, the graphical interfaces and the ‘apps’ and programs that run on them, which further condition and make allowances – sit the peculiarly western notions of ‘individual freedom’ and ‘democratic principles’, and the concomitant virtues of ‘individual rights’, the ‘individual voice’, ‘reason’ and ‘science’. These ambiguous, all-embracing portmanteau words not only persist, but resound heroically through the ages, long after the fall of Ancient Greece: “… as long as a hundred of us remain alive, never
will we on any conditions be subjected to the lordship of the English. It is in truth not
for glory, nor riches, nor honours that we are fighting, but for freedom alone, which
no honest man gives up but with life itself.”33

This declaration attends to both a reified notion of ‘freedom’, which is placed on a par with ‘life’ and, notably, not with ‘riches’. Furthermore this document refers to independence not only of the nation or state, but of the individual regardless of rank (although it appears gender specific: ‘honest man’ rather than ‘honest person’).34 We
33
https://www.nrscotland.gov.uk/files//research/declaration-of-arbroath/declaration-of-arbroath-transcription-and-translation.pdf retrieved 16/04/2018
34
I use the word ‘reification’ judiciously. Hanna Fenichel Pitkin cites twenty alleged meanings, senses, or aspects of reification, which makes it hard to define, and its canonical definition, “a

can see such words echoed in later declarations of independence: “We hold these
truths to be self-evident, that all men are created equal, that they are endowed by
their Creator with certain unalienable Rights, that among these are Life, Liberty and
the pursuit of Happiness.”35

These statements, crystallised in print, broadcast across time and space to all and sundry. They allude to equality between people, and also between life and liberty or freedom as virtues worth upholding. The philosopher Isaiah Berlin used them more or less interchangeably in his essay 'Two Concepts of Liberty', and so did the historian Eric Foner in his 'Story of American Freedom', which traces the evolution of the concept from Colonial times. The dictionary definitions are Liberty: “The state
of being free within society from oppressive restrictions imposed by authority on
one's way of life, behaviour, or political views.” Freedom: “The power or right to act,
speak, or think as one wants.”36 The American declaration also adds ‘the pursuit of
happiness’ to the roster.

Somehow scepticism is the one Greek idea that seems to lose out against the others.
Perhaps this is echoed in the sudden appearance, popularity and confusion arising
from the advent of ‘fake news’ as an attempt to shape and distort people’s perception
of things. This combines with the apparent apathy of people in asserting their
privacy, or the passive reply that ‘I have nothing to hide’.

In the best-known pages from his celebrated “Republic,” Plato’s character Socrates asks
his interlocutors to imagine prisoners chained to a cave wall. They have lived there
their entire lives, seeing nothing but the shadows on the wall. Naturally, he observes,
“what the prisoners would take for true reality is nothing other than the shadows.”
relation between people takes on the character of a thing”, raises more questions than it answers.
Some commentators, including Honneth, assume that the core meaning of reification concerns the
treatment of people (including oneself) as things, while others assume that it refers to the mediation of
our relationships with other people by things or to the thing-like character of our social institutions.
“Rethinking reification”, Theory and Society 16:2 (1987): 293, n. 88
35
So states the Scottish Declaration of Independence of 1320, itself held up as a prototype of the American Declaration of Independence some 456 years later in Philadelphia. See MacDonald-Lewis, Linda (2009) The Warriors and Wordsmiths of Freedom: The Birth and Growth of Democracy. Luath Press.
36
https://en.oxforddictionaries.com/definition/freedom

Fighting and protecting either freedom or liberty or the other rights is the story of so
many lauded heroic figures of history, whether they exist[ed] in fact, myth or fiction.
To name a few: Jesus of Nazareth, Giordano Bruno, Patrick Henry, Martin Luther King Jr (who used 'freedom' 19 times in his 'I Have a Dream' speech), Joan of Arc, Aung San Suu Kyi, Che Guevara, Harry Potter, and more recently, Aaron Swartz, Julian Assange, Chelsea Manning and Edward Snowden, and of course countless service men and women. We know the trope and it is familiar: the sacrificing of personal concerns and any hope of personal comforts and security as a contribution towards a greater good for all, or at least for a significant population or community. Sometimes it is the ultimate sacrifice of one's life for the worldly cause. Sometimes it is on behalf of all humanity, sometimes it is for God, sometimes a nation, monarch or country, sometimes it is for minority groups, sometimes it is just for one's family, colleague or friend; it can even be for some reified notion of freedom itself. However, the agency and communications afforded by these sacrifices are often a direct result, in the first instance, of our existing freedoms [or lack of them] (a priori), and they furthermore provide the stage and critical points of reference through which to test and enforce them (a posteriori). And so it comes to pass that some people's
freedom fighter is or becomes [in historical account] another’s terrorist and vice
versa. The roles are ambiguous and they may change dependent on dominant
ideology and social mores of the time, and similarly, ideas of democracy and
freedom have become reified as they have evolved and diffused as idea and ideology,
permeating into actually quite diverse cultures with contrasting and even conflicting
worldviews (Lent, 2017).37 The benefits of democracy are reciprocal – the avoidance of citizens living in constant fear of a ruler's arbitrariness, and of the self-obsessed ruler living in fear of his subjects. To pursue great wealth is to abandon the legitimate
moral concerns of others and to focus on one’s own selfish desires, since when one is
consumed with pleonexia, there is room for little else.
37
Lent, Jeremy (2017) The Patterning Instinct: A Cultural History of Humanity's Search for Meaning. Lent persuasively argues that the Chinese world view is a better model for humanity in the 21st century than outmoded ways of thinking that view "nature as a machine," or "nature as something to be controlled."

From the Battle of Salamis (480 B.C.) – where the ancient Greeks faced up to the Persians – human experience would appear milestoned by events where democracy and freedom are often hewn, amongst ever present political imbroglios,38 from manifest oppression, subversion, conflict and revolt, and coups d'état, elevating the overall theme of them to perhaps the greatest and most prevalent of Jean-François Lyotard's grand narratives (Lyotard, 1979).39

But there are many others, so many other grand narratives. The story of human
evolution as grand narrative is one which remains committed to a Cartesian model of
cognition and consciousness, in which the entire process of ontological thinking is
abstracted, as I have suggested, reified, elevated if you want, from real-world
context. We make sense of, and discuss, history from the big bang forwards, as well as other remote places, events, impossible places and fantastic places, using a variety of media such as books and television. They lull us into a god's eye perspective on the history and character of distant and exotic things. Think about a gap year student who pays to teach English at an orphanage in a developing country. Beyond any genuine feelings of 'compassion', think of the gap in experience and prospects, and in knowledge of where they are, between the teacher and pupil in these situations.

Man as a Tool-using Animal

Thomas Carlyle said: “Man is a Tool-using Animal. . . . Nowhere do you find him
without Tools; without Tools he is nothing, with Tools he is all.” According to the

38
In the second half of the 6th century BC, Athens had fallen under the tyranny of Peisistratos and his
heirs. In 510 BC, at the instigation of the Athenian aristocrat Cleisthenes, the Spartan king Cleomenes
I helped the Athenians overthrow the tyranny. Afterwards, Sparta and Athens promptly turned on each
other, at which point Cleomenes I installed Isagoras as a pro-Spartan archon. Eager to prevent Athens
from becoming a Spartan puppet, Cleisthenes responded by proposing to his fellow citizens that
Athens undergo a revolution: that all citizens share in political power, regardless of status: that Athens
becomes a "democracy". So enthusiastically did the Athenians take to this idea that, having
overthrown Isagoras and implemented Cleisthenes's reforms, they were easily able to repel a Spartan-
led three-pronged invasion aimed at restoring Isagoras. The advent of the democracy cured many of
the ills of Athens and led to a 'golden age' for the Athenians.
39
Grand narrative or “master narrative” is a term introduced by Jean-François Lyotard in his classic
1979 work The Postmodern Condition: A Report on Knowledge, in which Lyotard summed up a
range of views which were being developed at the time, as a critique of the institutional and
ideological forms of knowledge.

archaeological record, human and societal evolution has been indexed or defined materially via the development of technology. In even the simplest foraging societies, people depend on a vast array of tools, detailed bodies of local knowledge, and complex social arrangements, and often do not understand why these tools, beliefs, and behaviors are adaptive (Boyd, Richerson, & Henrich, 2011; Pinker,
2010). Learning socially, we are able not only to inherit skills and technologies but
also to innovate upon them, leading to a 'ratcheting' process of cumulative cultural
evolution that allows the emergence of cultural traits and technologies that could
never be generated through individual learning alone (Curry, et al, 2018)40

This is the perspective of economists such as Joseph Schumpeter when he alluded to the idea that innovation is the driving force not only of capitalism but also of economic progress in general, and that entrepreneurs are the agents of innovation. Technologies of production, and all that goes with them – jobs, supply chains, methods of transacting and so forth – come under constant revision and, some would suggest, ever faster revision. However, in pre-history innovation cycles would appear much slower and much more gradual – natura non facit saltum, 'nature doesn't make leaps'. If a prize were given for the permanence of human innovations, early hominid stone tools would be hailed as the most popular tool ever made.41 Also known as lithics, stone axes and the like offer evidence about how early humans made things, how they lived, interacted with, and shaped their surroundings and contexts, and how this relationship or interaction evolved over time. For 2.4 million years lithics were the dominant technologies for the vast majority of humans – clearly a much longer persistence than Polaroid cameras, radium heated bed warmers, 8-track cassettes and Model-T Fords.

Developments in the shape of more sophisticated lithics, with sturdy cutting and sawing edges, are often considered key occurrences in human evolution over vast millennia, as they are mapped to brain size and abilities. The affordances of these tools set the stage for better nutrition and more advanced social behaviours, such as

40
Curry, O.S., et al. (2018) Happy to help? A systematic review and meta-analysis of the effects of performing acts of kindness on the well-being of the actor. Journal of Experimental Social Psychology.
41
Just as

the division of labour and group hunting (organisation). This is a process which also
manifests in the appearance of ancient art as a practice, and the biology of the human
brain and cognition as a processor of ever more complex data and inputs and
communications. Brain scans have shown an overlap in the areas responsible for
toolmaking and language processing, thus leading to a theory that toolmaking
promoted larger brains and that what one was doing was spoken about or referred to,
perhaps in early acts of teaching.42 This overlap seems greater when people are
making complex tools which require improved dexterity, practice and possibly help
(i.e. coordinating activities with another or others). Parietal cave art includes cups or
hollowing in cave walls called cupules, denoted as essentially non-utilitarian,
possessing mainly symbolic function. This is taken as an indicator of a further
significant development in culture and people as it is here that the origins of
spiritualism and its connection to arts is claimed to begin. The devolvement of art
into religious ritual also suggests how as a practice it is the fashioning of visual or
tactile stimulus to evoke an emotional state, and/or to promote recall of past events or
past emotional states. In other words it links material worlds to the cognitive and
affective states. Bednarik and his colleagues (2003a, 2003b) 43 dated cupules in Israel
as being fashioned anywhere between 290,000-700,000 BCE. Other archaeologists
have also alluded to the first mobile art or "mobiliary art" - commonly used to denote
any small-scale prehistoric art that is moveable (mobile) such as The Venus of
Berekhat Ram to have been fashioned 230-700,000 BCE. 44
The vast spread of possible fabrication times of these phenomena hints that such findings can be contested and are, to some degree, pending further verification and open to interpretation. Counterfactual narratives can be built, and they may dilute pure play 'origins' style narratives, but the basic skills underpinning interaction with material culture are still present in our primate relatives in both the old and new worlds (McGrew 1992;
42
Our ancestors had to grow bigger brains to make axes
https://www.newscientist.com/article/dn19677-our-ancestors-had-to-grow-bigger-brains-to-make-
axes/ retrieved 4/4/2018. Stout et al. (2008) studied the brain activity of subjects who had become
expert in Early Stone Age tool-making. The tools were of the Oldowan and Acheulian types,
representing a period of some 2 million years during which time the brain of our hominin ancestors
expanded and tools became more advanced. The brain activation detected by positron emission
tomography during tool-making included both visuomotor and language circuits, suggesting that tool-
making and language share a basis in the human capacity for complex goal-directed manual activity.
43
Bednarik RG. A figurine from the African Acheulian. Curr Anthropol. 2003a;44:405–413.
Bednarik RG. The earliest evidence of palaeoart. Rock Art Res. 2003b;20:89–135.
44
Scarre, C (ed.) (2005). The Human Past, (London: Thames and Hudson).

de Amoura & Lee 2004; Davidson & McGrew 2005). Perhaps it is not so far-fetched to imagine and consider the plasma screen TV on the wall as modern day cave art, and a smartphone as a portable god statue.

After a long period of time in which creative objects and art evolved, and literacy spread through scribes and religious practitioners, we have the luxury of being able to leap forward in time to consider the origins and impact of print and the printing
press.

Print and the enlightenment

This is also considered to be epochal in the development of our modern media ecology. Elizabeth Eisenstein (2002) views these techniques and technologies as a catalyst to a social and cultural revolution which begins in fifteenth-century Europe.45 She has a principally determinist perspective when she announces that the advent of print and its technologies was: 'a radical shift away from all that had gone before, and that took place independent of contemporary conceptions of it. Cut loose from its past, print culture thus acted far more as an influence on history than as an outcome of it. It should be hailed as the engine of modernisation on a continental scale.' (p.124)46

Mechanical reproduction of patterns using woodcut and pottery stamps had already
enjoyed a long history of use (for instance in China), but advances in the ability to

45
E. Eisenstein, ‘American Historical Review debate: how revolutionary was the print revolution?',
AHR 107 (2002) containing:
E.L. Eisenstein, 'An unacknowledged revolution revisited', AHR 107 (2002). pp. 87-105.
A. Johns, How to Acknowledge a Revolution, AHR 107 (2002) pp. 106-125.
46
Interestingly, and in contrast to Eisenstein's thesis, is another by Johns (2002), who asserts a more 'co-shaping' perspective which emphasises localised individual users, user groups and uses rather than the technology and its products: 'I have tried to present here something of a draft manifesto for a different way of understanding the importance of print – one based on the premise that print is conditioned by history as well as conditioning it. Its major proposal is that we explain the development and consequences of print in terms of how communities involved with the book as producers, distributors, regulators and readers actually put the press and its products to use … The best way to do that, I have argued, is to adopt a local perspective, at least in the first instance... we need to appreciate the greater implications of such local researches. In other words, adopting these recommendations should become the best way to appreciate the achievement of printing as just that: an achievement. It is, I think, how best to acknowledge what needs acknowledging.' (pp.124-125). This is a theme which will feature strongly in this thesis and is at the core of the present work.

disseminate new ideas by making standardized letters, numbers and diagrams repeatedly vastly improved access to diverse ideas. Coupled with the improved usability of printed materials, this made a difference to thinking in Europe (Benjamin, 1968).47 Marshall McLuhan most famously in The Gutenberg Galaxy:
The Making of Typographic Man (1962) drew our attention to the prospect that the
shift from a predominantly oral, provincial culture to print culture affected the nature
of human consciousness profoundly.

Print for McLuhan represented, quite literally, an abstraction of thought which gave
precedence to linearity, sequentiality and homogeneity. According to McLuhan, the
advent of print technology contributed to and made possible most of the salient
trends in the Modern period in the Western world and these would include
individualism and democracy. Other trends included the reformation, Protestantism,
capitalism and nationalism. Of course this, to paraphrase McLuhan, was not
dependent only on the medium, but also on the message.

“There had to be new ways of explaining the relations between science and
religion, there had to be a whole new rhetoric, whole new lines of thought,
whole new philosophical techniques which would facilitate and make
convincing this reconciliation of philosophical reason and scientific reason
with revelation and religious authority.”48

Again, undergirding the profound ontological and epistemological shift are wide-ranging 'effects' on almost every aspect of life, at micro and macro social levels. Previous to the advent of printed matter and literacy, reading, and therefore enlightenment, was restricted to the aristocracy and the city-based and religious elites. Common people lived in a world of oral culture and very immediate and functional communications and relations. In England until around 1500, French was spoken by the elites, and also Latin; English was the language of the common people.49

47
Walter Benjamin (1968). Hannah Arendt, ed. “The Work of Art in the Age of Mechanical
Reproduction”, Illuminations. London: Fontana
48
https://fivebooks.com/best-books/enlightenment-jonathan-israel/
49
Collette, C., Kowaleski, M., Mooney, L., Putter, A., & Trotter, D. (2009). Language and Culture in Medieval Britain: The French of England, c.1100-c.1500 (Wogan-Browne, J., Ed.). Boydell and Brewer.

On learned subjects the Dutch publishing industry was the freest and the biggest in terms of quantity of publishing, especially for French and Latin (Israel). Lynn Hunt's idea is very popular: that you have to look at slow cultural processes – people were reading novels, they developed more empathy for other people, and in this way universal rights evolved.

For information outside of one’s daily life and goings-on, people collectively heard a
story, or saw a performance, perhaps in a style of social experience one has when
going to a pub to watch a rugby or football match. Eisenstein’s print culture enforces
the impersonal, cosmopolitan and objective. An interesting point to note is the
diffusion and geographic spread of print: ‘the establishment of printing shops
across fifteenth-century Europe is the basis for one story. The industrialisation of
printing processes is the basis of another story. To conflate the two throws time out
of joint even while altering the “familiar printing revolution” beyond all recognition’
(Eisenstein, 2002, p.104).

This mode of thinking is very much evident not only in rationalist philosophy or
realistic fiction, but also in the rise of scientific materialism till today. Printing led to
the standardization of various European languages as works began to be published in
these languages (this includes English). Eventually this standardization of vernacular
languages contributed toward promoting literatures which were used to create
national mythologies (what it ‘means to be’ British, French etc.). Whereas maps were
in circulation since ancient times, cartography as a science was spawned by the
availability of print. And cartography was not only important in demarcating national
boundaries, but also mapping the territories that were coming to be colonized in the
new world. And like belief in scientific and social progress, this new world was expanding through discovery and colonisation and, by extension, reporting. This
allowed for an unprecedented level of cooperation among readers, the philosophes
who could now build on each other's ideas over long periods of time.

The resumption of productive philosophical activities we conventionally associate with Descartes (1596-1650) and modern European philosophy, which begins in the 17th century when René
Descartes tried to cast doubt on everything, thus precipitating a civil war between
rationalists who thought that knowledge is based on reason, and empiricists who said
that it depends on experience.

Thus was born the European “men of letters,” the “Encyclopédistes” and their habitat
– the ‘salon’ - an elite network of scholars and intellectuals who had time and facility
to maintain eclectic interests in the arts, sciences, and, above all, literature. The
writings of Voltaire remind us that liberty in its purest form—both positive
and negative—can be thought of as the realization of man’s inherent dignity as a
human being. The concept of human dignity has two important implications, both of
which were recognized by Cicero as far back as the first century B.C. but seem to
have been forgotten today. The first is that we all share the same degree of dignity:
No one has any less potential than any other, and no one’s humanity is any less
pronounced than anyone else’s. The second is that our humanity imposes upon us the
same basic needs. By virtue of our nature, we all require food, shelter, clothing,
security, and a range of other basic goods necessary for sufficiency and survival.
Though deceptively simple, these implications have profound meaning when we
consider how individual liberty is to be translated into a social and political
construct. They viewed themselves as having a primarily social function: to reflect,
critique and share in the name of progress. Socially they consisted of the “politically
functionless urban aristocracy with eminent writers, artists, and scientists (who
frequently were of bourgeois origin).” (Habermas, 1991, p.31)50 As printing presses
and printed material proliferated, as the costs of books, pamphlets and such
decreased so were the seeds sown for more literate populations and the wider
diffusion of ideas, opinions and news of events. But such ideas were not simply
received verbatim without question. Immanuel Kant in the preface to the Critique of
Pure Reason (1781) made the declaration that: “Our age is, in especial degree, the
age of criticism, and to criticism everything must submit. Religion through its
sanctity and law-giving through its majesty may seek to exempt themselves from it.
But they then awaken just suspicion, and cannot claim the sincere respect which

50
Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category
of bourgeois society (Studies in contemporary German social thought [New ed.]). Cambridge, MA:
MIT Press.

reason accords only to that which has been able to sustain the test of free and open
discussion” (1983: A, xii).51

Israel identifies the period between 1650 and 1680 as the first stage of this crisis (which in every other
respect Hazard describes for us very effectively) and the period between 1680 and 1720 as
the second phase of the Enlightenment (Israel). If the ultimate goal of this critical
movement of the Enlightenment was to apply reason to the betterment of society, then it
was this reason that would have to stand the “test of free and open discussion”.
Instead of accepting traditional religious worldviews at face value
or at the command of church leaders, one would now question. Rather than
adopting the values of established authority one would ask ‘why?’ In order to ask
such questions one had to believe in oneself and one’s faculties of reason, and so
Enlightenment thinkers elevated the role of the mind and emphasized the power of
reason, leading to the abolition of customarily accepted moral and religious
absolutes and norms.

Once considered the sole source and locus for human life, experience, and
expectations, religious thought and belief were increasingly pushed out of the realm
of nature and history. There was a demystification of the world and its goings-on. By
the sixteenth and seventeenth centuries the world and its contents, including people,
individually and en masse, would become the object of scientific inquiry through
a process of desacralisation. “As the physical world ceased to be a theatre in which
the drama of creation was constantly re-directed by divine intervention, human
expressions of religious faith came increasingly to be seen as outcomes of natural
processes rather than the work of God or of Satan and his legions.”52

The American and French Revolutions of the late 18th century entailed considerable
bloodshed and the upheaval of existing social orders; their protagonists challenged
the “divine right of kings” and sought to replace it with constitutional
democracies. And so for some time now the leading Western nations have acted upon
51
Immanuel Kant (trans. Humphrey Palmer, ed.) (1983) Kant's Critique of Pure Reason: An Introductory
Text. Cardiff: University College Cardiff Press.
52
Peter Harrison, “Religion” and the Religions in the English Enlightenment (Cambridge: Cambridge
University Press, 1990), p. 5.

the totems of democracy and freedom under the view that they are the panacea for [and
not the cause of] all forms of political conflict and civil unrest. Our daily news
remains consumed with contemporary renditions of how democratic principles are
being championed in the name of oppressed citizens everywhere. If you look
carefully you will note that this is typically in relation to other, non-western, less
industrialised populations and marginalised groups. These others, always and
somehow, paternalistically, ‘deserve better’.

This gloss, where typically applied, is at the same time for all intents and
purposes based upon a deep disregard of, or at least a poorly formed sense of, local
historical and cultural conditions and values. This is why interventions often lead to
unforeseen or unanticipated local and global conflict, disaster and chaos, and more
often than not the taking of life, liberty and happiness. The geopolitical and military
deadlocks in the Middle East are just one example of the severe and long-term implications of
misguided efforts to engage in ‘nation building’. It seems that the ultimate goal of an
informed foreign policy must be to encourage, at whatever cost to life and limb, the
emergence of democracy in countries which are viewed as not yet benefiting from its
advantages, even if this takes the intervention of military force and the insidious
promotion of insurrection to secure it. And yet, in some strange irony, elites in those same
nations propagating these actions at least discourage, if not actively
and intentionally suppress, protest, dissent and ‘whistleblowing’ within their
own ranks, all under the all-embracing banner of ‘national security’.

The same ethical ambiguity can be said to attend scientific and technical progress, so
relentless in its power to replace religion as a belief system. Science is one human
activity that to all appearances shows constant and steady progress, so letting
science set the rules seems the way to ensure constant progress in all things, including
economies. Progress in the economic sense cannot be hindered. Immanuel Kant
defined the Enlightenment itself as the “progress of mankind toward improvement”
through the “freedom to make public use of one’s reason on every point.” The Ethics
of Belief by the 19th-century British mathematician and philosopher William
Kingdon Clifford states something similar: "It is wrong always, everywhere, and for
anyone, to believe anything upon insufficient evidence." Meanwhile Joel Mokyr
speaks of a “market for ideas”, a system in which people “try to persuade an
audience of the correctness of their beliefs”. The idea which prevails as most
reasonable is often taken to be that which is more sound, more resilient, and more
persevering: a trust that if we add substance x to substance y we get the beneficial
substance z. This often fails to account for by-product or artefact w, unforeseen
outcome a, side-effect b, derivative s, offshoot p and so on.

Like any market, technology or democracy can “fail” - and, for most of history, it
did. People in power stopped upstarts from challenging received wisdom.
Manipulating nature was considered akin to defying God’s will; we enact laws
supposed to prevent laws being broken, guard against infringements of weeds into garden borders,
build walls to keep people in, or out, or both, and so on. David Edgerton has a different take
on the realm of innovation and its centrality in human affairs than economists such as
Schumpeter. He notes that a “history of technology-in-use” would yield “a radically
different picture of technology, and indeed of invention and innovation.” Edgerton
calls the tendency to overrate the impact of dramatic new technologies “futurism.”
Few things, it turns out, are as passé as past futures.

Scientific and Enlightenment thought during the period of the American and French
Revolutions is cited as giving rise to the whole social project
of modernity, with its underlying premise of ever continuing social, economic and
technical betterment. Given that living conditions, lifespans and health have improved
globally, even with massive population increases, then arguably it has been successful
as a project (Vaupel, 2010; Horiuchi et al., 2013)53

From Francis Bacon in the seventeenth century, through others such as Voltaire in the

53
Availability of food resources, improved living conditions and advances in basic and medical
sciences have greatly extended the life span globally (Vaupel, 2010). J.W. Vaupel, Biodemography of
human ageing, Nature, 464 (2010), pp. 536-542. S. Horiuchi, N. Ouellette, S.L.K. Cheung, J.M. Robine,
Modal age at death: lifespan indicator in the era of longevity extension, Vienna Yearb. Popul. Res., 11
(2013), pp. 37-69.

eighteenth century, down to the present, a focus on empiricism and rationality has been central to
modernist thought. John Locke too is “often misunderstood”: he has a
reputation for preferring “experience” to “innate ideas”, but his main point was that
the mind is not a passive receptacle but an independent agent in the construction of
knowledge; and in spite of being celebrated as the architect of modern liberties, he
was far from being a liberal, “even by the standards of his own times”. (Gottlieb)

Also captured was a further prevailing idea which endures: that scientific inquiry,
divorced from the mindless piling up of facts, should be directed towards making a difference to
people’s lives. While modern scientific practice may have revised many of the truths
adopted by Bacon and his contemporaries, for the large part we still utilize Bacon's
method of 1620 of proving knowledge to be true via doubt and experimentation. “The
true and legitimate goal of the sciences is to endow human life with new discoveries
and resources,” he wrote.

To have knowledge for Plato is to use it. In the nature of things there is
no gap between knowing and doing… To the extent that someone embodies
knowing he will act, for knowledge itself is not complete without its exercise.
Knowledge is being and power. To the extent knowledge is not present there
is no power from that domain... To have knowledge for Plato means that a
person embodies knowledge so that he acts and is the knowledge. (Warren,
1985, p.13)54

For the pragmatic schools, truth lies in usefulness; thus the useful is true, that which is
efficient for action. According to Bacon, truth and usefulness were equivalent, but
received wisdom has been to think in terms of a divide between 'high science', or
pure ‘science’, which aims at truth, and 'low science', which aims at usefulness,
at applications, technology and results. For William James truth lies in
the predictability and the usefulness for action provided by greater control.

At a similar time to these political revolutions the industrial revolution was taking
place, with a move from small-scale agricultural production to large, factory-based
manufacturing. This shift brought profound social changes. Globalisation was also

54
Warren, Edward, "Plato's Refutation of Thrasymachus: The Craft Argument" (1985). The Society
for Ancient Greek Philosophy Newsletter. 124. https://orb.binghamton.edu/sagp/124

growing, with western nations expanding their reach into the world, partly under the
auspices of civilising and modernising, achieved and realised partly through the
agency of their technology and the worldview afforded by it (Adas, 2015) 55

In the year 1000 the average person in Western Europe was slightly poorer than their
counterparts in China or India. By 1900, things were very different. Western Europe
was five times richer. Prior to the industrial revolution most people lived and worked
in rural areas in small communities. Rosalind Mitchison stresses that the old Scottish
system, where alms were dispensed by the church, gave extremely inadequate relief
to the poor, usually just some oatmeal, expecting them to beg in order to survive. She
even cites an instance when a parish stripped a five-year-old orphan of poor relief,
telling her to take to the roads to beg or work. 56
Muller (1961, p. xiii) in his Freedom
in the Ancient World defines freedom as: "The condition of being able to choose and to carry out
purposes." This means that the individual is neither hampered by external constraints
nor coerced to do other than he or she wills. It assumes an ability, coupled with a
positive desire, to make a conscious choice between known alternatives. With many
people being employed in similar professions (often in factories) new groups and
social classes began to emerge and move to the city. The scientific
revolution/Enlightenment led to the industrial revolution - the huge expansion of
economic growth after 1800 (Mokyr, 2016).57 The new technologies which were
developed in the industrial revolution (such as the printing press and the steam
engine) also changed peoples’ lives in many ways and helped to produce a sense that
society was changing. Previously society had been assumed to be largely static or to
be in decline from a perfect past. Mobility, social and geographical, was limited for most
people. As books, pamphlets, journals, and papers proliferated during the
eighteenth century, public debate, in turn, crafted public opinion, which began to
stand as a counter-authority to kings, religious leaders, and states. They also brought
word of a better life and future.

55
Michael Adas (2015) Machines as the Measure of Men: Science, Technology, and Ideologies of
Western Dominance Cornell University Press
56
Rosalind Mitchison. The Old Poor Law in Scotland: The Experience of Poverty, 1574–1845.
Edinburgh: Edinburgh University Press. 2000
57
A Culture of Growth: The Origins of the Modern Economy. By Joel Mokyr. Princeton University
Press

People started to be more optimistic about the potential for a better society through
technological advancement and the changes they could witness around them. Goods
and services which were previously the preserve of the upper classes began to be more
widely available and affordable. “Mass production is a way of manufacturing things
en masse (and for the masses) that takes the initiative for choosing products out of
the hands of the consumer and puts it into the hands of the manufacturer… not until
the late 20th century did the development of the internet make it seem possible that
the initiative in the buyer/seller relationship would shift back, out of the hands of
manufacturers and into the hands of consumers.” 58 Improved healthcare also came as
a result of scientific endeavour and the wider circulation of published results. Today
the moniker ‘quality of life’ is understood in terms of independence and autonomy to
participate in everyday life (Dresen 1978; Pennacchini et al., 2011)59

Conclusion

Democratic values have today permeated processes of design and production,
consumption and use of products, machines, systems, devices and services, where
by getting closer - more intimate - with consumer-users, engineers and
interface designers gain insight into how they can improve what they make and how
they make it. Industry 4.0 entails an attempt to ‘listen’ to the ‘voice’ of the customer,
with each exercise aimed at improving design both functionally and aesthetically,
whilst reducing the financial risk of product, service and marketing failures. It
is also a way of learning how best to streamline and make more efficient the provision and
delivery of services [including social and governmental services] and of opening new
innovation possibilities.

58
https://www.economist.com/node/14299820
59
It is actually a slippery term, “an ambiguous and elusive concept, widely used in all fields of
knowledge and human existence.” Barcaccia, B., Esposito, G., Matarese, M., Bertolaso, M., Elvira,
M., & De Marinis, M. G. (2013). Defining Quality of Life: A Wild-Goose Chase?. Europe’s Journal
of Psychology, 9(1), 185-203. doi:10.5964/ejop.v9i1.484; Dresen, S.E. (1978) Staying well while
growing old: autonomy, a continuing developmental task. American Journal of Nursing 78: pp. 1344-1346;
Pennacchini M, Bertolaso M, Elvira MM, De Marinis MG. A brief history of the Quality of Life: its use
in medicine and in philosophy. Clin Ter. 2011;162(3):e99-e103.

There is an argument that life, liberty and the pursuit of happiness, or at least making
everyday tasks and routines, including keeping in touch with each other, easier or
more convenient, is what is core, and that the digital revolution, unlike the French
and American versions, will entail no bloodshed, apart perhaps from decapitations on
YouTube and other unsavoury images, real, staged or imagined. And yet the social,
cultural, technical and economic implications are possibly more far-reaching and
prevalent, such as the power of such media to insidiously engender democratic
political change or channel beliefs in other directions. So the trope goes on for the
digital revolution…

Democratic principles may have waxed and waned since the times of the ancient
Greeks, as has the plight of the common people, who may have been ‘free’ but were
more often than not impoverished. It is clear indeed that comforts, social justice, and
financial equality have each been vastly improved by the social safety nets and taxation-
based welfare systems of modern state institutions, but in the wake of an
apocalyptic style of breakdown arising from a natural or man-made [hacker?]
disaster, populations would again be taken back to a balance based upon what could be
found and cultivated in nature, in the local and immediate environment. As typically
depicted in apocalyptic movie dystopias, the social would again be occupied by only
those we encounter around us in the goings-on of everyday life. Pecking orders
would no doubt come into play, perhaps in the same manner they emerged during the
period when humans were hunter-gatherers and finally settlers and agriculturalists.
Grandiose worldviews, imaginations and perspectives of time and space would again
collapse. The digital post-industrial age, to be useful to us, to offer any sort of value,
rests upon and remains totally reliant upon the auspices of its predecessors, the
industrial age (making things effectively and efficiently, making energy) and the agrarian
age (food and water supply, growing things effectively and efficiently).

Global living standards are on an upwards trajectory. To emphasise the positive is
not to diminish or ignore the immense suffering people all over the world still
experience. After all, it’s not much consolation to a Syrian forced to flee their home
to know that global living standards are on an upwards trajectory. Nor should we be

complacent about the all too real risks of environmental degradation, financial
collapse or military confrontation.

But it is to underline that the story of humanity in the last few hundred years is one of ever-
increasing improvement, and that catastrophic events derive their power from their
rarity. As Rosling argues persuasively: ”Is it helpful to have to choose between bad
and improving? Definitely not. It’s both. It’s both bad and better. Better, and bad, at
the same time.” Rosling's use of UN and World Health Organisation statistics is a rejoinder to the
“us and them”, “rich vs poor” idea of the world. Instead, we should think
about four different levels of income, with increasing numbers of people moving out
of Level 1. This is both a social and economic phenomenon, driven by more literacy,
huge advances in girls’ education and much lower birthrates. That 9 per cent of the
world’s population are still on Level 1, living on less than $1 a day, should give us
all pause for thought. That the number has fallen from 29 per cent in the space of just
20 years ought to be cause for justified celebration.

A recent story about London having the same violent crime rate as New York is illustrative:
a claim based on just two months of data, completely ignoring the fact that the latter has
seen a huge reduction in violent crime since the early ’90s. Those details did not stop
over-the-top headlines about an “epidemic”.

Similarly, one reason that new technology has not been the cause of mass unemployment is that
new kit will only be used when it makes the productive process more profitable.
Higher productivity frees up the resources to buy other goods and services. The rural
workers that Thomas Hardy described in Tess of the D’Urbervilles found work in
factories and offices. What’s more, it was better paid work, and so the upshot was an
increase in living standards. Producers will only automate if doing so is profitable.
For profit to occur, producers need a market to sell to in the first place. Keeping this
in mind helps to highlight the critical flaw of the argument: if robots replaced all
workers, thereby creating mass unemployment, to whom would the producers sell?
Because demand is infinite whereas supply is scarce, the displaced workers always
have the opportunity to find fresh employment producing something that satisfies
demand elsewhere. In the modern economy, the jobs that are prized tend to be the
ones that involve skills such as logic. Those that are less well-rewarded tend to
involve mobility and perception. Robots find logic easy but mobility and perception
difficult. Inequality, without a sustained attempt at the redistribution of income,
wealth and opportunity, will increase. And so will social tension and political
discontent.

On an individual level

Viktor Frankl, the neurologist and psychiatrist, quoted the philosopher Friedrich
Nietzsche when he undergirded his thoughts in Man's Search for Meaning (1946,
p.76): “he who has a why to live for, can bear with almost any how.”60 In his
accounts Frankl draws upon his harrowing experiences in Nazi concentration camps
to make the claim that the most basic human motivation is the will to meaning. He
observed the way that both he and other inmates in Auschwitz coped, or failed to
cope, with the dire experiences they encountered. He noticed that the persons
who comforted others and who gave away their last piece of bread were also
the most resilient and survived longest. For him such people did not
necessarily possess some greater moral sensibility, nor were they necessarily more
altruistic or benevolent. Rather they offered proof to him that all can be taken away
from the person except the ability to choose our attitude or belief in any given set of
circumstances. “Everything can be taken from a man but one thing: the last of the
human freedoms - to choose one’s attitude in any given set of circumstances, to
choose one’s own way.” (p.86) He sees that man (sic): "does not simply exist but
always decides what his existence will be, what he will become in the next moment."
(p.116) “It is this spiritual freedom - which cannot be taken away - that makes life
meaningful and purposeful.” p. 67 He formed the opinion that at some level of
command and consciousness, we retain the power to choose even in the most dire,
adverse and impoverished situations and circumstances. “When we are no longer
able to change a situation, we are challenged to change ourselves.” This then
suggests a deep and innate psychodynamic flexibility which helps us as people to
cope with whatever life presents to us. Would this include a malfunctioning mobile
device, a poorly designed government web site, a new rule, law or dictate, a parking
ticket and a flat tire? Does such flexibility only attend to the lower echelons of

60
Maxim 12 at the beginning of “The Twilight of the Idols” actually reads: “If we have our own
why in life, we shall get along with almost any how.”

Maslow’s Hierarchy, the brutish and primitive fight for survival, or does it feature in
the humdrum of everyday life and its minutiae, goings-on, and routines?

There are resonances of this in other fields such as economics where Ludwig von
Mises drew attention to the importance of purposeful behaviour in the development
of his science of praxeology. This had it that human action, i.e. purposeful behaviour,
is teleologically based on the presupposition that an individual's action is governed or
caused by the existence of chosen ends. Marginal utility is a theory which also
originates from the field of economics. It suggests that every decision we make is
based on getting the greatest potential benefit, known as the marginal utility, from
the choice (the optimal decision) that we make. In other words an individual
actively selects what they believe to be the most appropriate means (from their
previous experience, biography or knowledge of historical antecedents) to achieve a
sought-after goal or end. “It is a peculiarity of man that he can only live by looking to
the future… And this is his salvation in the most difficult moments of his existence,
although he sometimes has to force his mind to the task.” p. 73
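
To make this utility-maximising reading of choice concrete, the following is a minimal, purely
illustrative sketch in Python. The options and the benefit and cost scores are invented for the
example and are not drawn from any of the sources above; it simply scores each available option
and selects the one with the greatest net benefit.

# A toy illustration of utility-maximising choice (all values are invented).
# Each option is scored by expected benefit minus expected cost; the chooser
# simply picks the option whose net benefit is greatest.

options = {
    "repair old phone": {"benefit": 6.0, "cost": 2.0},
    "buy new phone": {"benefit": 9.0, "cost": 7.5},
    "do without a phone": {"benefit": 1.0, "cost": 0.0},
}

def net_utility(attributes):
    """Return expected benefit minus expected cost for one option."""
    return attributes["benefit"] - attributes["cost"]

# The 'optimal decision' in the marginal-utility sense: the highest-scoring option.
best_option = max(options, key=lambda name: net_utility(options[name]))
print(best_option)  # -> "repair old phone" with these illustrative numbers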

Frankl’s advice may be viewed as an optimistic allusion, standing in
contradistinction to the existentialists and absurdists such as Sartre and Camus, who
held a much more passive and fatalistic view of the human condition. From their
perspective we have no choice but to accept the meaninglessness of our existence.
From this perspective the social and material world which acts as the medium for our
existence determines us, and as such we are liable to adapt to or accept our
position or stance. This is critical to any discussion of the notion of free will,
meaning and agency. It rather suggests a much more passive role, a prospect of
changing [oneself], and the power and jurisdiction of interpretation (Camus, 1991)61.

In mundane everyday life at home and in work there is clearly a need to order and
organise our motivations and strategies so as to target subsequent behaviours and to
define just what constitutes a positive outcome. We will consider some choices,
such as when house-hunting, for weeks or even months. But the majority of decisions

61
Camus, Albert (1991). Myth of Sisyphus and Other Essays. Vintage Books.

we make, such as which clothes to wear this morning or what to eat for lunch, are
made routinely without substantial conscious consideration on our part. An
abundance of choice in our everyday lives is regarded as a luxury of modern living.
Some have argued that capitalism offers a false sense of freedom while taking
advantage of opportunities to “imprison” consumers (e.g. Bonsu and Darmody
2008).62

Fashion choices are an expression of our personality and even a person's preferences
for particular brands when shopping may be considered as extensions of their
persona. The option to choose is something that we no longer just value but now
expect - it is a prerequisite to all democratic institutions that people are given the
option to choose between two or more candidates or ideas. Drawing on Gibson’s 63
ecological approach to perception, the experience of meaning has, at its foundation,
the encounter between a lawful physical world and unconscious primitive
mechanisms that are capable of detecting the invariants of that world. This relates to Gibsonian
ambulatory perception, which emphasizes the sensory capacities of the body as the
primary means of engagement with the world (Gibson 1979)64, and to ‘embodied’,
‘extended’ or ‘distributed’ approaches within cognitive science (see Anderson
(2003)65 for a review; also Hutchins (2008)66). We can propose that the world makes
sense and that the subjective feeling of this sense, when noticed by awareness, is the
experience of meaning. For our purposes, then, meaning is a fundamental aspect of
awareness associated with the detection of lawfulness, regularity, and pattern. The
ability to reconceptualise or refigure what and how to target links us to other
cognitive theories and neural concepts, including the executive function (e.g.
Funahashi, 2001), which he suggests is: ‘‘a product of the co-ordinated operation of
various processes to accomplish a particular goal in a flexible manner.”67
The
62
Bonsu, Samuel K. and Aron Darmody (2008), “Co-Creating Second Life: Market Consumer
Cooperation in Contemporary Economy”, Journal of Macromarketing, 28, 355-368.
63
Gibson JJ (1977) The theory of affordances. In: Shaw R, Bransford J (eds) Perceiving, acting, and
knowing: toward an ecological psychology. Erlbaum, Hillsdale, pp 67–82
64
Gibson J.J. Houghton Mifflin; Boston, MA: 1979. The ecological approach to visual perception
65
Anderson M.L. Embodied cognition: a field guide. Artif. Intell. 2003;149:91–130.
66
Review The role of cultural practices in the emergence of modern human intelligence.
Hutchins E
Philos Trans R Soc Lond B Biol Sci. 2008 Jun 12; 363(1499):2011-9.
67
Funahashi. Neuronal mechanisms of executive control by the prefrontal cortex. Neurosci Res 2001;
39: 147–65

executive function is necessary for the cognitive control of all that we do as well as
selecting and successfully monitoring behaviours that facilitate the attainment of
chosen goals.

This idea of choice in what we do and practice is defined through reference to some
purpose, end, goal, or function. It is concerned with a partition of outcomes where it
has been applied to animal and human behaviour.
“Human conduct, insofar as it is rational, is generally explained with
reference to ends or goals pursued or alleged to be pursued, and humans have
often understood the behaviour of other things in nature on the basis of that
analogy, either as of themselves pursuing ends or goals or as designed to
fulfill a purpose devised by a mind that transcends nature.” (Casey, 2008)

Having choice, being able to make choices, is seen as a positive. You are about to
hire a new employee and have several suitable candidates. At the place of choice,
you can choose among any of the available options. One is just as good as another.
However, we need to take into account the company’s needs, each candidate’s skills,
experiences, etc. This is the place of decision — where we have limited our possible
choices by all relevant factors. This is the core distinction between choice and
decision. Choice connects to the place of desired intention, values and beliefs.
Decision connects to the place of behaviour, performance and consequences. You
might say that choices are connected to reasons and decisions are connected to
causes.

A range of work by other researchers has also substantiated this idea. Amongst these,
perhaps most prominent is the theory of planned behaviour (TPB) (Ajzen 1985,
1991; Ajzen and Madden 1986; Armitage and Conner 2001; Hardeman et al. 2002;
see also Rutter and Quine 2002; Munro et al. 2007; Nisbet and Gick 2008; Webb et
al 2010), which remains one of the most widely cited and applied behavioural theories. One
of a closely inter-related family of theories which adopt a cognitive approach to
explaining behaviour, centring on individuals’ attitudes and beliefs, the TPB
evolved from the earlier theory of reasoned action (Fishbein and Ajzen, 1975), which
posited intention to act as the best predictor of behaviour.

Intention to act from this perspective is understood to be in itself an outcome of the
combination of attitudes towards a given behaviour or routine. That is the positive or
negative evaluation of the behaviour and routine and its expected outcomes.
However it also involves subjective norms, such as the social pressures exerted on an
individual resulting from their perceptions of what others think they should do and a
consumer-user’s inclination to comply with these. The appropriation of devices and
technologies which enjoy a wide diffusion, such as televisions and mobile phones,
now carry with them a kind of expectation which goes beyond their basic function
(an idea picked up by Silverstone, Morely and Hirch). There is for instance an
expectation that in entering someone’s living room we will see a television on the
wall or in the corner of the room, or that someone will be carrying their mobile
phone in their pocket. And more than this there is an implicit expectation that these
devices will operate and perform in the same manner as those you are more familiar
with in your own home and your own pocket. All behaviour from the mundane
switching of television channels by the remote control, to rioting and extreme
criminal acts such as murder and rape depend upon processes of expectations and
evaluation.

Zimbardo’s infamous Stanford Prison Experiment (1971) and the equally notorious
Milgram experiment (1961)68, showed how easily and how quickly social relations
and values came to be reconfigured. Both studies suggested that the situation, rather
than individual personalities, caused the participants' behavior. 69 In the first example
(Zimbardo) students given the role of either ‘guards’ or ‘prisoners’ started to take on
these roles and enact what they saw was appropriate role behaviours. Guards became
authoritarian and prisoners raised dissent etc. And in the second example (Milgram)
even when subjects were concerned or apprehensive about giving the person in the
68
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology,
67, 371-378.

Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human relations,
18(1), 57-76.

Milgram, S. (1974). Obedience to authority: An experimental view. Harpercollins.


69
Zimbardo, P. G. (1971). "The power and pathology of imprisonment", Congressional Record (Serial
No. 15, 1971-10-25). Hearings before Subcommittee No. 3, of the United States House Committee on
the Judiciary, Ninety-Second Congress, First Session on Corrections, Part II, Prisons, Prison Reform
and Prisoner's Rights: California. Washington, DC: U.S. Government Printing Office.

next room potentially lethal shocks as they were instructed to do, they still did, given
the acknowledgement and permission of the supervisor. This led Milgram later to
claim: “Could it be that Eichmann and his million accomplices in the Holocaust were
just following orders? Could we call them all accomplices?" (Milgram, 1974).

The theory of planned behaviour added a third set of factors as affecting intention
(and behaviour): perceived behavioural control. This is the perceived or
anticipated ease or difficulty with which the individual will be able to perform or
carry out the behaviour, and is very similar to notions of self-efficacy (see Bandura
1986, 1997; Terry et al. 1993).70 Basically this suggests that if we believe something
is going to be difficult to use, access, or consume, not only will this affect what takes
place, i.e. the experience of consumption and use, but other possibilities may be
sought out in its wake. This has direct implications for the design of interfaces and
devices. If they are seen to be difficult to use, people may ‘switch off’ and avoid
even trying to use them. Thus according to these theories there is a strong argument
for designing and checking for good usability, and generating publicity of such, as
good industrial marketing practice.
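
As an illustration only, the TPB's core claim, that intention is some weighted combination of
attitude, subjective norm and perceived behavioural control, can be sketched as follows. This is a
hypothetical Python example; the weights and scores are invented for the purpose of illustration
and are not taken from Ajzen's formulation.

# Hypothetical sketch of the Theory of Planned Behaviour's core claim:
# intention is predicted from attitude, subjective norm and perceived
# behavioural control. Weights and scores below are purely illustrative.

def predicted_intention(attitude, subjective_norm, perceived_control,
                        w_attitude=0.4, w_norm=0.3, w_control=0.3):
    """Combine the three TPB components (each scored 0-1) into an intention score."""
    return (w_attitude * attitude
            + w_norm * subjective_norm
            + w_control * perceived_control)

# Example: a user likes a new government web service (attitude high),
# feels mild social pressure to use it, but expects it to be hard to use
# (perceived behavioural control low).
score = predicted_intention(attitude=0.8, subjective_norm=0.6, perceived_control=0.2)
print(round(score, 2))  # 0.56 - low perceived control drags the intention down

With these illustrative numbers, a favourable attitude and supportive norms still yield only a
modest intention score because perceived behavioural control is low, which is the mechanism by
which poor expected usability can deter use.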

The role of the material environment

The notion that non-human ‘actors’ play a role in providing the conditions and
environments of, and in causing certain outcomes or ‘behaviour’ draws upon the
actor-network theory of Bruno Latour and others, for instance Elizabeth Shove
(2010a) notes:
“Put simply, roads, railways, freezers, heating systems, etc. are not innocent
features of the background. Rather, they have an active part to play in
defining, reproducing and transforming what people take to be normal ways
of life. The key insight here is that the material world and related systems of
production and provision are important in organising, structuring and
sometimes preventing certain practices”.

The three elements model which derives from Shove’s work consists of the following (an illustrative sketch of how these might be represented follows the list):

70
Importantly for the present study we will also see later how this links to ideas of use, and usage
within the context of usability.

Materials: The physical objects that permit or facilitate certain activities to
be performed in specific ways.
Meanings: Images, interpretations or concepts associated with activities that
determine how and when they might be performed.
Procedures: Skills, know-how or competencies that permit, or lead to
activities being undertaken in certain ways.
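
As an illustration only, a practice described in these three-element terms might be captured in a
simple data structure. The following hypothetical Python sketch, with an invented example
practice, shows one way the elements could be represented:

# Hypothetical representation of a social practice in terms of Shove's three
# elements: materials, meanings and procedures. The example content is
# invented purely for illustration.

from dataclasses import dataclass, field

@dataclass
class Practice:
    name: str
    materials: list = field(default_factory=list)   # physical objects involved
    meanings: list = field(default_factory=list)    # images and interpretations
    procedures: list = field(default_factory=list)  # skills and know-how

video_calling = Practice(
    name="video calling relatives",
    materials=["smartphone", "broadband connection", "charger"],
    meanings=["keeping in touch", "being a good family member"],
    procedures=["installing the app", "framing the camera", "troubleshooting audio"],
)

print(video_calling.name, "-", len(video_calling.materials), "materials")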

If we follow this idea, then we never act independently of some form of planning or
advance thinking regarding materials, meanings and procedures, whether as societies, as
firms, or as individual designers and consumers. We always look before we leap,
even if we know that longer-term strategic thinking is always plagued with
difficulties.

“The internet, global warming, artificial intelligence and genetic engineering
were all on our radar in 1956. But our ideas about how they might pan out
bore little resemblance to how they have actually evolved, particularly when
it comes to their social ramifications. Ubiquitous information has not created
rationalist utopias, ecological catastrophes have not culled our population and
we have neither super-human machines nor people, though we’re getting
there.”71

To proceed otherwise would be reckless, irrational, and potentially irresponsible and
dangerous. The history of industrialisation and mass production is a history of
unforeseen consequences. So we need some ability to reframe and review our
circumstances and situations in new and unique manners, whether this comes through
force of what is understood as changing external conditions (such as social,
technical, economic, geographical etc.), or internally (what we think or feel is
possible or right, the gut feeling, or based upon reflection and memory). This in turn
needs to bind to a power to decide or settle on how to look at life and its situations, to
visualise new outcomes and accommodate hindrances, obstacles and challenges. Jeff
Hawkins in On Intelligence (2004) presented a theory of how the brain works in
which the central premise is that prediction is the fundamental component in our
understanding of intelligence.

“In a flash of insight he had while wondering how he would react if a blue
coffee cup suddenly appeared on his desk, he realized that the brain is not
71
https://www.newscientist.com/article/mg23231000-500-weve-seen-the-future-and-it-will-blow-
your-mind/

only constantly absorbing and storing information about its surroundings - the
objects in a room, its sounds, brightness, temperature - but also making
predictions about what it will encounter next.”

His working hypothesis is somewhat like John Locke’s notion of the tabula rasa: that a
human brain at birth is somehow vacant, a blank canvas. It knows nothing, has no
memories, and has neither sense-making apparatus nor category systems. In such a
model the brain is not predisposed to any ideas, learning entirely from experiential
inputs, from what it senses, registers and records. 72 From the stream of sensory data,
stimulus followed by non-stimulus, it builds a model of the world, and then makes
predictions based upon that model. And just as a child learns that your hand burns in
the candle flame and that you fall when you are on a raised platform, the intelligent
system that Jeff Hawkins builds learns to predict how technologies such as power
networks will fail.
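
The core idea, that a system learns regularities from a stream of inputs and then predicts what
should come next, can be illustrated with a deliberately crude sketch. The following hypothetical
Python example uses a simple frequency table of 'what followed what'; it is emphatically not
Hawkins's hierarchical temporal memory algorithm, only a toy stand-in for the
prediction-from-experience principle.

# A deliberately simple illustration of learning-by-prediction: count which
# event tends to follow which, then predict the most frequent successor.
# This is a toy frequency model, not Hawkins's hierarchical temporal memory.

from collections import defaultdict, Counter

def learn_transitions(events):
    """Build a table of how often each event is followed by each other event."""
    table = defaultdict(Counter)
    for current, following in zip(events, events[1:]):
        table[current][following] += 1
    return table

def predict_next(table, current):
    """Predict the most frequently observed successor of the current event."""
    if current not in table:
        return None  # nothing learned yet: no prediction, only surprise
    return table[current].most_common(1)[0][0]

observed = ["wake", "coffee", "email", "wake", "coffee", "news", "wake", "coffee", "email"]
model = learn_transitions(observed)
print(predict_next(model, "coffee"))  # -> "email" (seen twice, versus "news" once)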

As we flit across ideas concerning conceptual categories of humanness, cognition,
beliefs, identity and attitudes, personality and values, we do witness recurrent notions
of fluidity, fixity and flexibility with respect to the individual person and their
information relations to the world around them. We can hardly argue with the fact
that the smart phone, as personal assistant or extension to what we do or know, or
even how we live, changes and intermediates our relationship to the world. Our
conception of, and to some extent our manifest behaviours towards, the social and
commercial worlds change in ways that were previously difficult to realise or
comprehend. We require flexibility, at times deft agility, to behave in accordance
with these new views and conditions, novel circumstances and situations, and our
systems and technology are often seen by developers as a solution to this. Those who
do not adapt to the new systems are left ‘behind’, left ‘outside’, are the digitally

72
There is a species of tick, a little round creature that gets in your skin. The baby tick when he
hatches, walks up a tree, and remains on a twig until he smells butyric acid, that is sweat. If and when
he smells butyric acid, he falls off the tree and hopes to fall on a mammal.
If in two or three weeks he does not smell any sweat, he falls off the tree and climbs another tree. The
absence of butyric acid, in the end, has the same effect that the presence of butyric acid would
have. He’s able to work at two logical type levels. He is able to deal with the absence of information
as a piece of information. The information which doesn’t come is itself information.

This is dreadfully important in the whole relation of figure relation, and so on. — Bateson
https://archive.org/details/Lecture_by_Gregory_Bateson_part_2_1975_75R004

‘marginalised’. The aim of such systems is to reduce risk of error, or otherwise to
improve convenience, ‘quality of life’ or ‘living conditions’, possibly safety and
comfort, but simultaneously, as they accomplish this [even partially, or even not at
all], they also create new forms of risk and can lead to frustrations when they
break down or fail to perform or deliver.

“Modern life has made it easier than ever before to live on the surface of
existence and apparently without the need for any deeper sense of purpose.
The pace of life for one thing gives us little time for reflection, and this suits a
lot of us very well. In developed countries, we don’t have to worry about
basic survival and we enjoy greater and greater freedom to cut our own
course in life. This is one of the great achievements of Western culture, a
genuine liberation -- for those who can enjoy it. But the down side is that it is
easier than ever before to avoid commitments and to evade dependence --
both the dependence of others on us and the dependence we have on others.
The worst thing you can be in personal relationships, so we are told, is
dependent. The second worst thing is to have a partner who is dependent on
you.”73

Cognitively, however, and probably emotionally as well, we do often prefer things to
at least appear stable. We want them to be constant and fixed to us in order first to
identify and reflect upon what we are apprehending, then to make plans, produce
milestones, to know and become familiar with things, to master tools, devices,
processes, to know institutions, other people, social mores and conventions,
languages, technology, geographical locations. It is only then that we make sense of
what things may be used for, and how they may be used. The idea that a futurist
should also look back is broadly shared by scholars in the futures field (e.g.
Heilbroner, 1960; Galtung & Inayatullah, 1997; Inayatullah, 1999; Kaivo-oja et al.,
2004; Rescher, 1998; Wagar, 1993).74 They argue that future developments cannot be
conceptualised without having history in mind.

73

https://www.mercatornet.com/articles/view/the_ultimate_conversation_stopper_does_life_have_mean
ing/3103
74
Heilbroner, Robert L. (1960). The Future as History. New York, NY: Peter Smith. Galtung, Johan,
& Sohail Inayatullah. (Eds.) (1997). "Macrohistory and macrohistorians." Perspectives on Individual,
Social, and Civilizational Change. Westport, CT & London, UK: Praeger. Kaivo-oja, Jari Y., Tapio S.
Katko, & Osmo T. Seppälä. (2004). "Seeking convergence between history and futures research."
Futures, 36(6), 527-547. Rescher, Nicholas. (1998). Predicting the Future: An Introduction to the
Theory of Forecasting. Albany, NY: State University of New York Press. Wagar, W. Warren. (1993).
"Embracing change: Futures inquiry as applied history." Futures, 25(5), 449-455.

With the rise of modern science during the 16th and 17th centuries, interest was
directed to mechanistic and predictable explanations of natural phenomena, with the
ultimate goal of elucidating efficient causes. Natural teleology, common in classical
philosophy but controversial today, contended that natural entities also had intrinsic
purposes, irrespective of human use or opinion. For instance, Aristotle claimed that
an acorn's intrinsic telos is to become a fully grown oak tree.

We need things to have stability and fixity in order to rely upon them, to trust them,
to begin to depend upon them, to lean upon them. We need language to maintain its
integrity and structure as a system of communication. Imagine a confusing world
where social consensus constantly refreshed the meanings of letters or words, say, on
a daily basis. The need to constantly refresh one's store of names would reduce the
scope and scale of our ability to communicate and what we could and would
communicate about. We need scientific theories and engineering specifications to
have the same persistence and fixity. Thus the more rigid, fixed Newtonian universe
perseveres alongside the more fluid, flexible quantum universe; the oxygen of five years
ago is believed to remain the same at the molecular level as today's, otherwise we
could not use it to certain effects; that genomes also remain stable is likewise useful; and a
million other ‘facts’ are integrated into processes and hierarchies of products and
processes. We need them in order that we know that processes are complete, at least
at some defined [fixed] stage or another. We need them to be stable in order to use
them, to build new combinations, to understand what is new about ‘new’
combinations and so on. We need ATM machines to constantly dish out cash, and contactless
payments to work as well as they did yesterday. New software builds often allude to
notions of stability. Access to the new features or functions of new builds often
entails risk of malfunction, glitches and crashing.

So what we are saying here is that in extreme circumstances, whether physical,
emotional or intellectual, or even in the most mundane ‘everyday’ situations such as
the battery running out on our smart phone, it appears that we do not necessarily have a
choice or option to change externally generated situations or circumstances. The
‘social fix’ is to make sure that you keep the phone charged. A technical fix would
mean a solution that makes batteries obsolete. A change in how we view our social
environment could also render phones obsolete. A radical new policy or law
regarding how communication is governed would also affect what was
communicated.

Other people, the social, the institutional, governments and government departments,
parents, up line managers, police, courts, landlords, banks, and many others
[including Nazi death camp officialdom] have the power to create situations and
circumstances to which we may be compelled to respond and deal with if we do not
adhere to the rules of engagement. Foucault in The Order of Things offered histories
of institutions, disciplines and vocabularies. These pointed to a philosophical moral: that
what counts as ‘science’ and as ‘rationality’ is a matter of rather suddenly formed
‘grids’, ways of ordering things. In these books he showed us how what counted as
‘medicine’, ‘disease’, ‘madness’, ‘logic’, ‘history’, and the like, changed in startling
ways at various periods. He was trying to exhibit the radical contingency of the
concepts used at a given time, the looseness of fit between what went on and what
people said and did about it. But like all man-made phenomena they are also subject
to breakdown and dissolution, just as our houses and cars will break down if they are
not constantly propped up, monitored, maintained and serviced. They will
break down if they are not believed in or held as opinion. This recalls Nietzsche’s claim that faith
in ‘science’ is as hopeless as faith in God – that both are forms of ‘the longest lie’.

Nature and our technology present us with opportunities and constraints which
apparently must be addressed. But we could choose to let them ‘roll over us’, or we can
choose to ‘get uptight’, or we can go and search out a means of charging a battery. In
the case of a smart phone running out of charge, out of signal, or out of pay-as-you-
go credit, you could find a public telephone box [ever more rare]; in a world where
mobile phone diffusion is high, you could ask a passer-by to use their phone; you
could find some means of charging it, get to a better vantage point for a signal, or
find a retail outlet to purchase more credit. In all, to achieve the goal of phoning
home, the options are limited by the technology at hand and by the environment.
In this sense the device and its makers have power over us, at least to affect our
behaviour, emotions and thoughts. The ability to continually think what should be,
what should happen, not only guides us in what we do, but moreover defines us as
persons, and defines us as intelligent, sentient actors, but it can also lead us to
states of frustration and even anger. It also teaches us to check our battery power
before leaving the house, just as losing data offers up a lesson in how we should
regularly back up our data. Biochemically, arousal by adversarial situations leads to
the release of cortisol and to spontaneous behaviours widely regarded as negative,
which include throwing the temporarily non-working phone on the ground so that it
never works again.

Clearly these are incidents of social learning and will carry characteristics of the
society or culture in which people are raised and immersed. There are a number of
commentators (e.g. Stuart-Kotze, 2006) who suggest that while attitudes and
behaviour may change over time, personalities remain relatively fixed and consistent
during our lives. Personality “refers to individual differences in characteristic
patterns of thinking, feeling and behaving.” 75 Otherwise it is taken to mean “a mix of
values, world-views, set responses and characteristics which are relatively enduring
aspects of the person… Current research indicates that we are more flexible than that
but changing one’s attitudes, values, beliefs and aspirations – the substance of
personality – are difficult.” (Stuart-Kotze, 2009) Stuart-Kotze advocates that
personality becomes virtually fixed at about age five (Stuart-Kotze, 2009).76 If this
is true, then coping with, and adapting to, extreme circumstances, whether being
interned in a Nazi death camp or not having mobile telephony when your car
has broken down in the middle of the countryside, is understandably not easy.
Others have proposed that fixity in personality is deferred until about age 30. However, in
contrast to either of these postulates is a significant body of research which now
proposes that change is more varied and persists throughout
adulthood (Srivastava et al., 2003; Blonigen et al., 2008; Caspi et al., 2005; Donnellan, M. B.,

75
http://www.apa.org/topics/personality/ retrieved 20/11/2005
76
R. Stuart-Kotze, Performance: The Secrets of Successful Behaviour, Prentice Hall, Harlow, UK,
2006, p.45.

2007)77. These studies do show that the maturational process impacts upon certain
traits, such as Conscientiousness and Agreeableness, more than others, which remain
more consistent over time.78 Cognitive flexibility, however, has a late onset of
impairment and does not usually start declining until around age 70 in normally
functioning adults. Impaired executive functioning has been found to be the best
predictor of functional decline in the elderly.79 It is also linked to attentional
disorders.

It is clear we are not simply social constructions or socially shaped, open and privy
to any input from any external source. But then neither are we the centre of the
universe, isolated, alienated and separated from all those and all that which surrounds us.
Our available choices or alternatives may be constrained, such as by being brought up in a
particular culture or society with certain taboos and rules, or under certain economic
conditions, available resources, artifices and devices. These also assimilate into the
overall identity or life of the individual. Child behavioural researchers Kagan and
Lamb (1987)80 suggested that children as young as two years old begin to exhibit a
moral sense. That is, values, decisions and attitudes are certainly derived from social
knowledge, in particular shared understandings with adults and peers of what is
‘right’ and ‘wrong’ in ways and manners of thinking, beliefs and behaviours. If we
77
J Pers Soc Psychol. 2003 May;84(5):1041-53.
Development of personality in early and middle adulthood: set like plaster or persistent change?
Srivastava S1, John OP, Gosling SD, Potter J
Blonigen DM, Carlson MD, Hicks BM, Krueger RF, Iacono WG. Stability and change in personality
traits from late adolescence to early adulthood: A longitudinal twin study. Journal of
Personality. 2008;76:229–266.
Caspi A, Roberts BW, Shiner RL. Personality development: Stability and change. Annual Review of
Psychology. 2005;56:453–484
Donnellan MB, Conger RD, Burzette RG. Personality development from late adolescence to young
adulthood: Differential stability, normative maturity, and evidence for the maturity-stability
hypothesis. Journal of Personality. 2007;75:237–263
78
A tendency to be organized and dependable, show self-discipline, act dutifully, aim for
achievement, and prefer planned rather than spontaneous behavior. High conscientiousness is often
perceived as stubbornness and obsession. Low conscientiousness is associated with flexibility and
spontaneity, but can also appear as sloppiness and lack of reliability. A tendency to be compassionate
and cooperative rather than suspicious and antagonistic towards others. It is also a measure of one's
trusting and helpful nature, and whether a person is generally well-tempered or not. High
agreeableness is often seen as naive or submissive. Low agreeableness personalities are often
competitive or challenging people, which can be seen as argumentative or untrustworthy.
79
De Luca, Cinzia R.; Leventer, Richard J. (2008). "Developmental trajectories of executive functions
across the lifespan". In Anderson, Peter; Anderson, Vicki; Jacobs, Rani. Executive functions and the
frontal lobes: a lifespan perspective. Washington, DC: Taylor & Francis. pp. 3–21.
80
J. Kagan, & S. Lamb (Eds.), The emergence of morality in young children. (pp. 277-305). Chicago,
IL: University of Chicago Press.

return to Frankl’s claim that man (sic): "does not simply exist but always decides
what his existence will be, what he will become in the next moment." In order to
decide what he will become, he must first be aware of the possible options and
limitations. Sometimes they are not always obvious, for instance, when a view is
associated with the opinions of clusters of people that generally share ideological
viewpoints [i.e. politics - when a view, research, or fact is used politically in an
opportunistic way rather than a principled way], or someone is acting with ulterior
motives and hidden self-interest.

Hannah Arendt, writing in 1967, presciently explained the basis for this
phenomenon: “Since the liar is free to fashion his ‘facts’ to fit the profit and pleasure,
or even the mere expectations, of his audience, the chances are that he will be more
persuasive than the truth teller.” An issue becomes politicized when it becomes
subject to public funding or decision making and, therefore becomes, "everybody's
business". Arendt reminded us a half-century ago about the inherent tensions
between truth-telling and political power: “No one has ever doubted that truth and
politics are on rather bad terms with each other, and no one, as far as I know, has
ever counted truthfulness among the political virtues.” Conversely, specific issues
can be groomed to appear as "everybody's business" [as can other ideas in media, such
as a piece of writing representing the ‘voice’ of the readership]. That is, making them
appeal to the popular vote and raising the people’s hackles. This politicisation is
opposed to non-political means for addressing social issues such as voluntary
consensus, voluntary association and disassociation, or permitting individuals or
smaller groups to make varying choices. “All things are subject to interpretation.
Whichever interpretation prevails at a given time is a function of power and not truth.”
(Daybreak) Nietzsche’s 1880s notebooks also repeatedly state that “there are no
facts, only interpretations.” He argued that truth is impossible; there can only be
perspective and interpretation, driven by a person’s interests or ‘will to power’.
We could make a machine that determines football-ness, but people could argue
that the criteria (the algorithm) used are subjective and/or arbitrary. Every football
detector builder would do it differently. Thus, a digital model may not even
qualify as "objective". It is merely a codification of projected, subjective or
arbitrary criteria. Repeating arbitrariness in a consistent way does not make it
objective, only predictable. There has been a lot of virtual ink spent recently on the
various implications of presidential politics in the social networking age.81 82 83
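
To illustrate the point about codified criteria, the following is a minimal, hypothetical Python
sketch of a 'football-ness detector'. The features, weights and threshold are invented; the point is
that another builder would choose differently, so the detector's consistency buys predictability
rather than objectivity.

# A toy 'football-ness' detector. The chosen features, weights and threshold
# are arbitrary decisions by whoever builds the detector; another builder
# would pick different ones. Consistency buys predictability, not objectivity.

def footballness_score(is_round, circumference_cm, has_panels):
    score = 0.0
    if is_round:
        score += 0.5                      # arbitrary weight
    if 68 <= circumference_cm <= 70:      # arbitrary 'regulation size' band
        score += 0.3
    if has_panels:
        score += 0.2                      # arbitrary weight
    return score

def is_football(is_round, circumference_cm, has_panels, threshold=0.7):
    # The threshold itself is another subjective choice.
    return footballness_score(is_round, circumference_cm, has_panels) >= threshold

print(is_football(True, 69, True))    # True  - under these criteria
print(is_football(True, 55, False))   # False - a different builder might disagree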

Oxford Dictionaries recently made an entry for post-truth: “relating to or denoting
circumstances in which objective facts are less influential in shaping public opinion
than appeals to emotion and personal belief”. In fact it is the international word of
the year, and for good reason. Pulling upon heart strings and passions in elections is
not new, nor is stirring up the emotions of the public, but it seems now to be accepted.
The human mind is not a smart phone camera snapping pictures of things; it is not a
passive neutral mirror. People are always emotionally engaged and influenced when
we perceive, no matter how detached we try to make ourselves. And that is why we
are never in a position to call something an objective fact. In the final analysis, all we
can ever do is sense and interpret.

Within these events were also the seeds of the scientific view of the problem of change
and its causes, and a recognition of the methodological importance of applying
mathematics to natural phenomena and of undertaking empirical research.
Recognising the "why" in all things was central to Aristotle’s philosophy. For him it
meant science. Science assumes the universe is empirical, that it operates according
to law-like principles, and that human beings can discover these laws. Further, the
reasons to discover these laws are to explain, predict, and control phenomena for the
benefit of humankind. His preference was to start from the facts given by experience.
Hence he endeavoured to attain the ultimate grounds of things by induction; that is
to say, by a posteriori (Lat. "from what comes later") conclusions moving from a number of
facts toward a universal. This differed from a priori knowledge. Logic either
81
https://www.brookings.edu/essay/covering-politics-in-a-post-truth-america/
82
https://www.washingtonpost.com/opinions/welcome-to-the-post-truth-presidency/2016/12/02/
baaf630a-b8cd-11e6-b994-f45a208f7a73_story.html?utm_term=.328625d91193
83
President Obama nailed it when he said:

"The new media ecosystem means everything is true and nothing is true. An explanation of climate
change from a Nobel Prize-winning physicist looks exactly the same on your Facebook page as the
denial of climate change by somebody on the Koch brothers’ payroll. And the capacity to disseminate
misinformation, wild conspiracy theories, to paint the opposition in wildly negative light without any
rebuttal — that has accelerated in ways that much more sharply polarize the electorate and make it
very difficult to have a common conversation."

deals with appearances, and is then called dialectics; or with truth, and is then called
analytics. The people creating this intellectual revolution felt that the use of reason
and logic would enlighten the world in ways that fate and faith could not. The
principal targets of this movement were the Church and the monarchy, and the ideas
central to the Enlightenment were progress, empiricism, freedom, and tolerance.
Freedom in this context draws on Rousseau (1791) and Adam Smith (1776) to
suggest that individual freedom is best achieved through group activism. Immanuel
Kant identifies enlightenment with the process of undertaking to think independently,
to employ and rely on one’s own intellectual capacities in determining what to
believe and how to act.84

Progress in modernity—and thus the intent of modern knowledge—is focused
on two main arenas: technical and social. The technical project of modernity
is generally the domain of science. In science, knowledge is used to control
the universe through technology. While we’ve come to see science as the
bastion for the technical project of modernity, the responsibility for the social
project is seemingly less focused, at least in our minds today. Generally
speaking, the institutional responsibility for the social project rests with the
democratic state. 85

Does the smartphone in the proverbial forest make a noise independent of anyone
hearing it? Or is the noise made only when someone hears? Neither the ideas of
physics, nor heartstrings, nor smartphones work independently of the people who use
them, whether they be designers or end-users. There are those who would swear
that any noise is made only when we hear it: that it is a function of the residual sounds
inside our ears resonating with soundwaves from the ring of the phone. Since this is
what they believe, it is, for them, a fact, which conveniently brings us back to
Nietzsche, and in particular his insight that there can be no apprehension of reality
without an active, perceiving, interpreting mind. A person speaking into a little box
while walking down the road would, not so long ago, have been considered abnormal,
maybe a sign of mental illness. As beings we are perceivers: we must be in order even
to see and judge something a fact. There is no perception without a human subject, but so
long as the filter of human subjectivity is present we simply lack a basis for calling
anything a fact. That does not mean our perceptions are illusory. Nor does it mean
84
https://plato.stanford.edu/entries/enlightenment/
85
https://uk.sagepub.com/sites/default/files/upm-binaries/36028_1.pdf

that we cannot do science, or that we must become antirealists. It just means that the
principles whereby we are realists (consistency, stability, facticity, and so on) are
necessarily our interpretations as humans, as cultures. It just means they will
always be infused with meaning, and we cannot extract that meaning out of the
picture. Trivially, if there is a 'real world', defined as something that exists
outside of our percepts but influences our percepts, then everything in this world is
"truly objective". That is a tautological truth. Such truths are not, as a rule,
particularly useful (unless someone contradicts one, at which point they become
useful for an argumentum ad absurdum to reject one premise).

Less trivially, we can recognize that the material and social 'real world' affects our
senses and those of others in our species in some rather consistent manners. As such,
observations made with our senses, especially observations consistent with others -
i.e. empirical observations of any sort - are acceptably, accessibly, and usefully 'truly
objective'. The same goes for any properties derived systematically from our senses (e.g.
probabilities, derived abstractions like 'rocket' or 'speed', etc.). For rockets, these
could include such things as: does it fly? How fast? Did it explode? What is its
historic mission reliability? More concretely, we can recognize that 'objective' is an
English word and communicates a meaning in English that is likely consistent with
what we are taught in grade school and the dictionary. This happens to include
examples much like those for the rocket listed above - speed, reliability, payload, and
various other empirical and derived properties.

We may consider that all “matter”, whether inanimate (a table) or animate (a human)
is an amorphous quantum soup which only becomes “matter” when we observe it, or
touch it, or hear it. Thus you, the reader, and I the writer, are part of this amorphous
quantum soup. What we believe we look like comes only from our observing
ourselves, and what I expect you to understand from reading this.

Donna Haraway in "Situated Knowledges: The Science Question in Feminism and
the Privilege of Partial Perspective" (1988) opens with the argument that when we
talk about objectivity in science and philosophy, we traditionally understand it as a
kind of disembodied, transcendent "conquering gaze from nowhere" (p.581) in which
the subject is split apart, distanced from and set above the object of inquiry. She
stresses the embodied nature of all perspective making so as to “reclaim the sensory
system that has been used to signify a leap out of the marked body.” (ibid.) Haraway
argues that this kind of objectivity is impossible to achieve; it is "an illusion, a god
trick," (p.582) and instead demands a re-thinking of objectivity in such a way that,
while still striving for "faithful accounts of the real world,"(p.591) we must also
acknowledge and make explicit our perspective and positioning within the world.
She calls this new kind of knowledge-making 'situated': such knowledges are contingent
upon communities rather than individuals, located in particular times and spaces, and
subject to particularised material and functional affordances. For her, "All Western
cultural narratives about objectivity are allegories of the ideologies governing the
relations of what we call mind and body, distance and responsibility" (p.583).
Haraway is not only critiquing the idea that objectivity, as we
have long understood it, is possible; she is also arguing that if we continue to
approach knowledge-making in this way then we wash our hands of any
responsibility for our truth claims.

Taking a materialist view we also live and act within environments composed of
matter and material, and we, as human bodies, ourselves, are of course, more than
psychological and social. We are embodied, we are material and monumental, and
we are flesh and blood, nerves, muscles and sinews. We are movements and
actions, as well as beliefs, predications, and memories. Consider Penfield's homunculi,
cartoon grotesques which highlight how our bodies would look if they were proportioned
relative to the surface area of the cortical region dedicated to the control of specific
functions. For example, the afferent sensory nerves arriving from the hands
terminate over large areas of the brain; as a result the hands of the homunculus
are correspondingly massive in comparison to other body areas. This emphasis only
underscores our interest in developing hand-eye coordination, and the role of the
hands in making things and controlling devices such as pens, mice and remote
controls, and in other activities such as massage and sport.

In contrast, Haraway argues that approaching knowledge-making from an embodied
perspective forces us to take responsibility for our truth claims. As sentient embodied
human actors we can change our response to a situation, our view of it, our attitude
towards it, even when the situation itself cannot be changed; but as stated, this may
not be easy. Perhaps it is more useful in the present discussion to shift to the key
words ‘will be’ in Frankl’s quote above ["does not simply exist but always decides
what his existence will be, what he will become in the next moment."]. What humans
start doing from birth is something computers have never done: they do not just learn, but
learn to generalise. Human activities can be described by five high-level conceptual
components: data, prediction, judgment, action, and outcomes. As machine
intelligence improves, the value of human prediction skills will decrease because
machine prediction may provide a cheaper and better substitute for human
prediction, just as machines did for arithmetic. Computers have been far more
effective as facilitators of communication than as decision makers. The only areas
where computers have even made a dent at approaching human-like levels of
decision making are narrow areas where a large number of well-structured options
have to be considered, and the scoring of those options is relatively simple. We know
that people use both their conscious and unconscious minds, both their logical and
emotional facilities to make decisions. That allows us to cope with under- and over-
specified problems, poorly described problems, and problems that need active
exploring before solution. Computers only mimic our conscious, logical selves. It
might be objected that this is not true for machine learning algorithms, which learn
from actual human behaviour rather than the rules we believe we follow. Yes, but a
machine learning algorithm is still an interpolation
machine. By definition it cannot deal with a new problem. And in fact, it cannot deal
with any problem for which there is not a large data set to learn from. The weakness
of machine learning is that because it makes no attempt to learn rules, but instead
seeks patterns, its hunger for data is insatiable. Machine learning requires a well-
defined learning space and lots of examples (preferably thousands). There is low-
hanging professional fruit where those requirements are met, but many more
examples where they are not. Progress will be slow and challenging. We are very
prone to anthropocentric distortions of objective reality. This is perhaps not
surprising, for to adopt instead the evidence-based viewpoint now afforded by "big
science" and "big history" takes us way outside our perceptive comfort zone.
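
The "interpolation machine" point can be made concrete with a deliberately simple sketch. The following Python fragment (the data and labels are invented for illustration) implements a one-nearest-neighbour matcher: it holds no rules, only stored examples, so within the region its examples cover it answers tolerably well, but for anything far outside that region it can only echo the nearest thing it has already seen - one way of seeing why such systems need large, well-defined example sets and struggle with genuinely new problems.

    # A toy 1-nearest-neighbour "learner", used only to illustrate the point that a
    # pattern-matcher is bounded by its examples. Data and labels are invented.

    examples = [
        (1.0, "small"), (2.0, "small"),   # feature values the learner has seen
        (8.0, "large"), (9.0, "large"),
    ]

    def predict(x):
        """Return the label of the nearest stored example."""
        nearest_value, nearest_label = min(examples, key=lambda ex: abs(ex[0] - x))
        return nearest_label

    print(predict(1.5))    # small  - inside the region covered by the examples
    print(predict(8.5))    # large  - likewise
    print(predict(500.0))  # large  - far beyond anything seen; the learner cannot
                           #          say "I don't know", only reuse what it has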

Jeff Hawkins takes his place in a lengthy pedigree of thinkers who have addressed
the understanding of human potential and functioning, consciousness and behaviour
through the metaphor of the machine and its rational design. In claims to have
unlocked the secrets of the mind and coded them into machines there is a kind of
chicken-and-egg ontology peculiar to work in A.I., where the machine and its codes
are based upon people, and minds and bodies are in turn further understood by building
the machines.

“The assertion that the tortoise, manifestly a machine, had a “brain,” and that
the functioning of its machine brain somehow shed light on the functioning of
the human brain, challenged the modern distinction between the human and
the nonhuman, between people and animals, machines and things.”
(Pickering, p.48)

So thoroughly have such efforts failed that AI researchers have largely given up the
quest for the kind of general, humanlike intelligence that Hawkins describes. There
are many differing tales of how automatons [from the Ancient Greek Talos onwards]
and other beings such as golems were brought to life and afterward controlled. 86

The Greeks maintained in their civilization an animistic idea that statues are in some
sense alive. This kind of art and the animistic belief goes back to the Minoan period,
when Daedalus, the builder of the labyrinth made images which moved of their own
accord. We are social beings and need people around, but increasingly, it seems that
many people are more comfortable dealing with people through machines — through
mobile, messenger, automated check-outs in supermarkets etc. - than in person. As
machines get more intelligent and can better adapt to their "users", people may end up
preferring to deal with machines rather than with people. Of course, this says something
about who and how we are, and raises wider social and emotive issues such as social
respectability.

86
TALOS was a giant bronze automaton - a living statue forged by the divine smith Hephaistos
(Hephaestus). According to others he was instead the last of the ancient bronze race of man.

Inherent in all these claims lie the future and a certain amount of plasticity and
dynamism regarding the way we approach and cope with the world. Such plasticity has clearly
been the foundation for most theories of learning. Most definitions of learning are
taken to involve change in an individual's knowledge and purview, or in their ability
to perform a skill, or in the manner and proficiency that they participate in an activity
with other individuals. Problem-solving and learning are the most essential cognitive
capacities in modern work where there is particular emphasis upon creative and
innovative activities. In the "age of the smart machine", according to Shoshana
Zuboff (1988, p. 309), "successful utilization of intelligent systems requires
maximizing the cognitive capacity and learning ability of the work force". There is
however, considerable variation among the theories about the nature of this change.

After simplistic stimulus-response models of learning, it was not until the late 1970s
that John Flavell and Ann Brown each began to study the nature and place of
metacognition in learners' awareness of their own learning. This includes the ability
to generalise, or learning to learn. In short this represented an ability to reflect on
their own thinking, bound to a capacity to monitor and manage their learning. This
was a point taken up in the mid-1980s, when studies of self-regulated learning began
to emerge (see Stephenson, Zimmerman & Schunk, 2001).87 “Metacognition”
essentially means cognition about cognition; that is, it refers to second order
cognitions: thoughts about thoughts, knowledge about knowledge or reflections
about actions. So if cognition involves perceiving, understanding, remembering, and
so forth, then metacognition involves thinking about one's own perceiving,
understanding, remembering, etc. The processes of planning, monitoring, and
regulating thoughts are generally known as executive processes, which involve the
interaction of two levels: At one level is the creative, associative, wandering mind
and above it is the executive, trying to keep it on task.

During the later 1980s and into the 1990s, and with the rise of computer networking,
these cognitive theories were challenged by theories that emphasized the importance
of social interactions and the sociocultural context of learning. The work of
87
Zimmerman, B. J., & Schunk, D. H. (Eds.). (2001). Self-regulated learning and academic
achievement: Theoretical perspectives (2nd ed.). Mahwah, NJ: Erlbaum

anthropologists such as Jean Lave began to have a major influence on theories of
learning. Individuals were now seen as initially participating in peripheral activities
of a group (known as legitimate peripheral participation) before becoming fully
integrated into group activities.

Apprenticeship became a metaphor for the way people learn in naturalistic settings.
The notion that people learn by observing others, first articulated in social-cognitive
theory, was expanded in a new context. This was the start of an ethnographic turn in
IT system design, and an interest in the contextual aspects of IT use. Many psychologists
and educators currently consider learning to be a phenomenon that is distributed
among several individuals and/or environmental affordances (such as calculators,
computers, and textbooks) or situated (existing or occurring) within a “community of
practice” (CoP - or community of learners). Both a social and a material dimension
are involved in this distribution (Pea, 1993). 88 In such situations, participation or
activity rather than acquisition becomes the defining metaphor (Greeno, 2006). 89 The
evolution from behavioral to social to distributed to situated theories of learning was
accompanied by new conceptions of knowledge (for a good discussion of these
changes, see Schraw, 2006).90

In short, theories of knowledge evolved over time from a
view of it as some sort of commodity capable of being transmitted, more or less
intact, from one individual to another, to it being socially constructed, and socially
meaningful, integrated as a shared resource of a team, group, company or community
effort. According to the old theories, knowledge is something an individual acquires;
such as when a student successfully learns it, or a device owner reads instructions,
and that he or she can reproduce the knowledge in its original form, without
misunderstandings or errors.

88
Pea, R. D. (1993). Practices of distributed intelligences and design for education. In G. Solomon
(Ed.), Distributed cognitions: Psychological and educational considerations (pp. 47-87). Cambridge,
UK: Cambridge University Press
89
Greeno, J. G. (2006). Learning in activity. In R. K. Sawyer (Ed.), The Cambridge handbook of the
learning sciences (pp. 79–96). New York: Cambridge University Press.
90
Schraw, G. (2006). Knowledge: Structures and processes. In P. A. Alexander & P. H. Winne (Eds.),
Handbook of educational psychology (2nd ed., pp. 245–263). Mahwah, NJ: Erlbaum.

A commodity view of knowledge secures its place as the conduit of cultural and
social knowledge passed across not only geographical locations and societies but
through time from generation to generation. But technology transfer - the transfer of
new technology from the originator to a secondary user, especially from developed to
developing countries in an attempt to boost their economies - has highlighted that the
notion that some system which worked in one context, or set of contingencies, will
work in another without any modification to either the technical or the social conditions of
use is erroneous (the classic example being village hand pumps in India; Arnold Pacey).

Communities and cultures are composed of individuals with common
understandings, shared interests or common goals, and these groups provide
opportunities for new members (e.g., children) to construct similar knowledge of the
world through formal (i.e. school or training) and a variety of informal activities.

Knowledge is not immutable: errors may be made in transference, and misinterpretations,
misconceptions, erroneous corrections, and new tastes and fashions lend a
Chinese-whispers quality which promotes both deviation and even innovation.

Misinterpretations and mistakes lead not only to quirks and to outcomes like ‘poor’ design.
They also lead to learning, or as Daniel Dennett would have it – they are the only
avenue to learning: “Mistakes are not just opportunities for learning; they are, in an
important sense, the only opportunity for learning or making something truly new.
Before there can be learning, there must be learners. There are only two non-
miraculous ways for learners to come into existence: they must either evolve or be
designed and built by learners that evolved. Biological evolution proceeds by a
grand, inexorable process of trial and error — and without the errors the trials
wouldn’t accomplish anything.” Dennett goes so far as to suggest that the entire
history of philosophy is based upon mistakes. Mistakes offer diversity.

More recent theories conceive of knowledge as something each learner constructs or
creates afresh rather than something that is assimilated in its preexisting form.
Because each person constructs his or her own understandings, the knowledge they
acquire is, taken as a whole, unique. This also accounts for redundancy in knowledge, as
knowledge is not expected – as it is in school or university exams – simply to
be memorised but rather to be applied in the time and space in which it is needed.

Advances were made in neuroscience and how the brain relates to human behaviour
and learning. The study of how the brain relates to learning is in its infancy (for an
introduction to some of the issues, see Bransford et al., 2006). 91 In all, theories of
instruction have shifted to focus upon the students' perceptions, prior knowledge, and
beliefs, which determine whether and what they learn of something approximating the
instructional goals of the teacher. The bottom line in the teaching-learning process is
the learning activities in which the students engage, not the instructional activities in
which the teacher engages.

Another important note here is the work by Rowsell and Abrams (2011), who
conceptualized I/identity for virtual spaces ... that is, “creating new forms of
subjectivity and new kinds of personality” (p. 92). They serve to remind us that
young people's identities are not stable or fixed ... of 5- to 11-year-olds, reportedly
the “fastest growing demographic of virtual world users” (p.

The digital future is not one which prefers or even offers endless limits and
possibilities, but one which is also grounded in the boundaries and realities, the
phenomenological limits of the here and now, and our ability to perceive and adapt.
“But the future is still not here, and cannot become a part of experienced
reality until it is present. Since what we know of the future is made up of
purely abstract and logical elements — inferences, guesses, deductions — it
cannot be eaten, felt, smelled, seen, heard, or otherwise enjoyed. To pursue it
is to pursue a constantly retreating phantom, and the faster you chase it, the
faster it runs ahead. This is why all the affairs of civilization are rushed, why
hardly anyone enjoys what he has, and is forever seeking more and more.
Happiness, then, will consist, not of solid and substantial realities, but of such
abstract and superficial things as promises, hopes, and assurances.”(Watts,
1951, p.)92

91
Bransford, J., Stevens, R., Schwartz, D., Meltzoff, A., Pea, R., Roschelle, J., et al. (2006). Learning
theories and education: Toward a decade of synergy. In P. A. Alexander & P. H. Winne (Eds.),
Handbook of educational psychology (2nd ed., pp. 209–244). Mahwah, NJ: Erlbaum.
92
Watts, A. (1951/2011) The Wisdom of Insecurity: A Message for an Age of Anxiety New York
Random House Inc; 2 edition

Watts puts forward that it is future-orientated thinking which is the cause of anxiety,
and yet it is the force which Viktor Frankl would suggest also drives us on and
forward. Frankl’s experience in a concentration camp is perhaps an extreme example
of how people cope with life’s adversity. “In the gap between who we wish one day
to be and who we are at present, must come pain, anxiety, envy and humiliation,”
Alain de Botton wrote in his meditation on Nietzsche and why a fulfilling life
requires difficulty. “Nietzsche was striving to correct the belief that fulfilment must
come easily or not at all, a belief ruinous in its effects, for it leads us to withdraw
prematurely from challenges that might have been overcome if only we had been
prepared for the savagery legitimately demanded by almost everything valuable.”

It may seem strange as an opening to a piece on design organisation with particular
regard for new media and digital networks, but it is not. I use it to highlight that there
are limitations to personal control in all domains of life including the production,
design and use of artefacts and all other man-made things. It also highlights that we
can tolerate such extreme threats to our ‘comfort zone’ and that we are more
adaptable than seems to be ‘written into’ or ‘inscribed’ in products and services and
their allowances and idiosyncrasies. Milton Erickson is another who worked with
this notion of plasticity in the human psyche with regard to interpretation of self and
world. Erickson, known for his brilliance in the art of medical hypnosis, fostered a
very different view of the unconscious mind than Freud and other predecessors. He
held that solutions to human problems lie within the person, to be found in their
unconscious mind - a belief that the unconscious mind is a source of strength and
healing. In his famous theory of "utilization" the person is facilitated to become
aware of the strengths and resources within himself - very much like what Erickson
himself experienced in his struggle with polio, which developed when he was a child
and plagued his health over his life. He was unique, though, in his view that conscious
insight did not lead to a person's changing his behaviour. He found that
"speaking" to a person's unconscious, on the other hand, was very effective in
producing change.

“The same situation is in evidence in everyday life, however, whenever
attention is fixated with a question or an experience of the amazing, the
unusual, or anything that holds a person's interest.”

Milton Erickson was not the only practitioner who privileged the experience of the
patient or client in the therapy context. The humanistic psychologist Carl Rogers
perhaps even more famously advocated a particular kind of communication which
ostensibly works by careful and attentive listening, then recapitulating and rephrasing
many of the patient's statements. The aim here was to encourage clients to focus on
their current subjective understanding rather than on some unconscious motive (i.e.
such as in Freudian psychoanalysis) or someone else's interpretation of the situation
(i.e. a prescriptive therapist or expert, or even a gossip who purports to ‘know’ how
other people think). So, listening actively, the therapist samples expressions of feelings
and states and poses them back to the patient [user] as questions. Using Rogers’ own
account of his style (1975) we can understand accurate empathic understanding as
follows:

“If I am truly open to the way life is experienced by another person... if I can
take his or her world into mine, then I risk seeing life in his or her way... and
of being changed myself, and we all resist change. Since we all resist change,
we tend to view the other person's world only in our terms, not in his or hers.
Then we analyze and evaluate it. We do not understand their world. But,
when the therapist does understand how it truly feels to be in another person's
world, without wanting or trying to analyze or judge it, then the therapist and
the client can truly blossom and grow in that climate”.93

The irony of all of this is that perhaps what is the most intimate and humanistic of
communication practices, that of simply listening and responding to another, dealing
with the most humanistic social responses we can make – empathy – is now being
replicated by some ‘cold’ calculating machine, which many have said will never
‘feel’ as humans can do regardless of the sophistication of programming and the
93
Rogers, C. (1975). Empathic: An unappreciated way of being. The Counseling Psychologist, 5(2), 2-
10.

power of its processing and memory. Empathy is the ability to understand what the
other person is feeling, and refers to the therapist's ability to understand sensitively
and accurately [but not sympathetically] the client's experience and feelings in the
here-and-now.
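
The kind of 'cold', mechanical replication of active listening alluded to here has a long lineage, going back to Joseph Weizenbaum's ELIZA, which imitated precisely this Rogerian rephrasing. The sketch below is a minimal illustration in that spirit (plain Python; the patterns and canned responses are invented for this example) and shows how little machinery such surface-level "empathic" mirroring actually requires - which is exactly what makes it unsettling.

    import re

    # A minimal Rogerian-style rephraser in the spirit of Weizenbaum's ELIZA.
    # Patterns and responses are illustrative inventions: the "listening" here is
    # nothing more than mechanical pattern matching and word substitution.

    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

    RULES = [
        (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
        (re.compile(r"i am (.*)", re.I),   "How long have you been {0}?"),
        (re.compile(r"my (.*)", re.I),     "Tell me more about your {0}."),
    ]

    def reflect(fragment):
        """Swap first-person words for second-person ones, word by word."""
        return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

    def respond(statement):
        for pattern, template in RULES:
            match = pattern.search(statement)
            if match:
                return template.format(reflect(match.group(1)))
        return "Please, go on."   # default prompt when nothing matches

    print(respond("I feel anxious about my work"))   # Why do you feel anxious about your work?
    print(respond("The weather has been awful"))     # Please, go on.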

We are perhaps more conscious of the limits regarding how much control we have
over natural phenomena in nature. Weather systems do not have USB slots or any
other interface through which we can control them; Tsunamis are similarly not part
of the internet of things (IoT), neither are asteroids. But with ever more powerful
computers we can now model them, if not predict them, and ask ourselves what would
happen ‘if…’ While we may indeed choose our attitude, or even our reaction,
to our smartphone running out of battery or signal, this does not in any way affect
its use, or lack of it, any more than does our taking a positive and proactive
attitude to being washed out of our homes by a cyclone. Some phenomena, whether a
device or a natural phenomenon, we can affect through the agency of science and
technology, producing replicable or neat responses and outcomes; but in some cases we
have no choice, no options, and must play the passive audience waiting to be spoon-
fed, or shown. It was the philosopher of science Karl Popper who contrasted ‘clouds
and clocks’. Popper wrote extensively on the problem of determinism and free will,
researched many earlier thinkers on the subject, and formulated his own
"evolutionary" model of free will.

Synthetic conditions - that is, material and environmental conditions created by other human
beings - have intended rather than purely found or discovered purposes.94
94
Perhaps I open this discussion with something of a highly provocative statement. The definition of
what is ‘man-made’ and what is ‘artificial’ remains contested. Is a bird’s nest artificial or has it been
built through the craft and agency of merely another species? While it is not debated that we do tend
to set ourselves apart as humans, we also recognize the difference between artificial and natural
habitats in nature. To the first, if we see ourselves as above nature in some sense, it is because we
have found ways to remain "fit" that are many times more complex than anything we've seen from a
non-human species. Also, natural events can be destructive, if not even more
damaging than what man has done to the environment. Leo Strauss suggests that the
beginning of Western philosophy involved the "discovery or invention of nature" and the "pre-
philosophical equivalent of nature" was supplied by "such notions as 'custom' or 'ways'". In ancient
Greek philosophy on the other hand, Nature or natures are ways that are "really universal" "in all
times and places". Nature has two inter-related meanings in philosophy. On the one hand, it means the
set of all things which are natural, or subject to the normal working of the laws of nature. On the other
hand, it means the essential properties and causes of individual things. Science, according to Strauss'

“As soon as we introduce “synthesis” as well as “artifice,” we enter the realm
of engineering. For “synthetic” is often used in the broader sense of
“designed” or “composed.” We speak of engineering as concerned with
“synthesis,” while science is concerned with “analysis.” Synthetic or artificial
objects - and more specifically prospective artificial objects having desired
properties - are the central objective of engineering activity and skill. The
engineer, and more generally the designer, is concerned with how things
ought to be - how they ought to be in order to attain goals, and to function.”

Some aspects of the environment, many personal and social circumstances, and some
predicaments are, even with science and technology, extremely difficult to change;
and particular aspects of a design or, come to that, of any other man-made phenomenon
such as a policy, or even of social movements and solidarity, resist or promote attempts
to change or alter them. Their apparent obduracy shapes actions and practices, or serves
as a problem-space for new innovation (with respect to technology I think of Hughes,
1983). Practices, for instance, have been described as fundamentally institutionalized
ways of doing things in social organizations (such as in Gherardi, 2006). 95 As such
they are relatively stable modes of organizational reproduction which are subject to
various institutional pressures. They lend it homeostasis: stability, the maintenance of
a constant internal environment. This is the conservative streak which, while
fashioned perhaps by economic incentives as well as attitudes towards risk, is a
cousin of the sociological theories of social reproduction, the tendency for actions,
everyday practices and routines, and beliefs to reproduce values (i.e. Bourdieu).
As relatively stable modes of organizational reproduction and institutionalization,
Gherardi suggests in her work that practices order heterogeneous elements into a
commentary of Western history is the contemplation of nature, while technology was or is an attempt
to imitate it. My dictionary defines “artificial” as, “Produced by art rather than by nature; not genuine
or natural; affected; not pertaining to the essence of the matter.” It proposes, as synonyms: affected,
factitious, manufactured, pretended, sham, simulated, spurious, trumped up, unnatural. As antonyms,
it lists: actual, genuine, honest, natural, real, truthful, unaffected. Nevertheless some philosophers
assert that, in a deterministic world, "everything is natural and nothing is artificial", because
everything in the world (including everything made by humans) is a product of the physical laws of
the world. Qinglai Sheng, Philosophical Papers (1993), p. 342. Thanks to new tools and swift
advances in our ability to read, write, and edit DNA, we’re gaining a much deeper understanding of
— and control over — how life works. There is perhaps no more fertile ground for this discussion
than artificial intelligence and simulation and human consciousness. Margolin critiques the
artificial/natural binary assumed by Simon and argues that it is exactly at the border of these two
terms that the work of design (and design criticism) is to be done. "When Simon compared the
artificial to the natural he posited the natural as an uncontested term, ..."[4] This, as Margolin points
out, is a highly problematic position for a designer of new technologies, like Simon, to attempt to
defend. Victor Margolin has pointed out that the artificial/natural boundary is continually under
contention and it is there, at the boundary, that the work of design and design criticism is to be done
95
Gherardi S. 2006. Organizational knowledge: the texture of workplace learning. Blackwell: Oxford.

coherent set; that is, one in which financial and economic as well as technological
opportunities, and social and cultural innovations, can be melded. And so it is with
everyday life and living: its practices can be impinged upon by the introduction of new rules, new
laws, new information, new technology and new economic and competitive
conditions. To be involved or to get involved with making or changing things and
matters is not always valued, primarily due to the potential for imbalances to set
in. "All of us lead two parallel lives: the one we are actively living, and the one we
feel we should have had or might yet have. As hard as we try to exist in the moment,
the unlived life is an inescapable presence, a shadow at our heels. And this itself can
become the story of our lives: an elegy to unmet needs and sacrificed desires. We
become haunted by the myth of our own potential, of what we have in ourselves to
be or to do. And this can make of our lives a perpetual falling-short.” (Phillips)96

Somehow or other, while there is virtue, even virtuosity, in a home-cooked meal,
we may more often than not be delighted when we go to a quality eatery to
dine upon the crafting of a professional chef. D.I.Y. as a practice is often associated
more with tinkering and playing, even when it achieves results, than with attaining a
professional finish. Strictly speaking, a professional in some area is simply someone
who is paid to do it. But informally, we think of a professional as someone who not
only is paid for their services but is also very good at what they do; they are an
expert. This means they know what to do and how to do it, when to do it, and to
what degree to do it. They are well practiced, and through this confidence,
familiarity and expectation are developed. But even with professionals and experts,
research has shown that there are limits in any given situation which will be defined
by the capabilities of all the actors in any given participatory process. These are
defined in relation to the affordances of the operating context in which they operate
(Orlikowski, Fleck, Proctor and Williams).97 Organizational changes cause disruptions,
incoherence and inconsistency in organizational activities. A major component of
any human environment which seems to act as a catalyst for change, if not a ‘force’
of change, is new technology. "We need a revolution instead of a technology
96
Phillips, A. (2013). Missing Out: In Praise of the Unlived Life. New York: Farrar, Straus and Giroux.
97
Orlikowski WJ. 2002. Knowing in practice: enacting a collective capability in distributed
organizing. Organization Science, 13(3): 249-273.

evolution." 98

We must consider that each of us, designer, user, worker, manager, investor, citizen
exists in a lifeworld very much conditioned and characterised by certain permissions,
allowances, and affordances. Some scholars see practices as individualistic, “situated
recurrent activities of human agents” (Orlikowski, 2002: p.253), whereas others
stress the organization’s role in guiding practices (Cook & Brown, 1999). Largely
these are simply matters of sociological scale but it can be important to note just
which outcomes are achieved at each level. For instance, individual practices such as
creative work are also bound by constraints, limitations and controls. We learn of
these as we develop through life, from all our trying, struggling and doing, from
discussion with others, from observations of others, from reflection, from media,
from YouTube. Some of these conditions most certainly come from without, before
being made sense of by the individual and internalised. That is, the socio-cultural,
economic, legal, media and technical worlds we inhabit and our specific interactions
with them make up our lived experience or reality. But some of these permissions
and affordances come and are driven from within - our dispositions, tastes,
imagination, prejudices and predilections and such like which drive us to action. The
dynamic between how and who we are and what opportunities and constraints are
open or available to us define what we can achieve and to a large extent, desire to
achieve.

Although some aspects, structures, institutions, businesses, ways of doing and seeing
things, laws and regulations, seem fixed, cast in stone and permanent, most are in
flux to certain degrees and extents, as are many of our beliefs and opinions.
Homeostasis in this sense is not a passive fixed condition, it is dynamic and sensing.
“It refers to how a person under conflicting stresses and motivations can maintain a
stable psychological condition. A society homeostatically maintains its stability
despite competing political, economic and cultural factors. A good example is the
law of supply and demand, whereby the interaction of supply and demand keeps

98
http://www.wired.co.uk/article/peter-sunde-hemlis-political-apathy

market prices reasonably stable.”99 Somehow we, sentient conscious beings, are
sandwiched between that which comes to us from without and that which we
share with others, and that which drives and informs us from within - that which lends
us the impression that we are the centre of things, the locus of the action, the executive
function.
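
The supply-and-demand example just quoted can be made concrete with a toy feedback loop. In the sketch below (Python; the demand and supply functions and every constant are invented for illustration) the price is not fixed anywhere in advance: stability emerges from continual sensing of the imbalance and small corrections to it, which is the dynamic, sensing character of homeostasis intended here.

    # A toy illustration of homeostasis as dynamic equilibrium, using the
    # supply-and-demand example. All functions and numbers are invented.

    def demand(price):   # buyers want less as price rises (illustrative linear form)
        return max(0.0, 100.0 - 10.0 * price)

    def supply(price):   # sellers offer more as price rises
        return 20.0 * price

    price = 1.0
    for _ in range(25):
        imbalance = demand(price) - supply(price)  # the "sensed" disturbance
        price += 0.01 * imbalance                  # a small corrective adjustment

    # The price settles near the equilibrium (about 3.33) without ever being fixed:
    # stability here is maintained, not given.
    print(round(price, 2))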

I offer this preamble to emphasise the fact that we are psychological and we are
social beings. And along with these facts, we are also embodied and physical
beings. Mind-body dualism was first formally stated in modern philosophy by
Descartes. His famous cogito ergo sum ("I think, therefore I am") implies that only
through personal consciousness can one be certain of one's own existence. He
identified consciousness with mind or soul, which to him was a substance as real and
as concrete as the substance he called body. Descartes defined body as extended
(space-filling), physical material and defined mind as "thinking thing" (res cogitans),
which was unextended (did not take up space) and was not made of any physical
material, but was purely spiritual. He also posited that these two substances mutually
affect each other, giving the name interactionism to his position. Western philosophy
continued to work upon such musings, Kant in The Critique of Pure Reason (1781)
argues that there can be no doubt that all knowledge begins with experience, and
notes that thinking is ‘awakened’ by people’s interaction with objects, which then
affect the senses. We work with things – whether they be symbols, materials, devices
or artefacts – and we do so on physical, symbolic and perceptual levels. Cognition is not
an abstract symbolizing process but fundamentally structured by the inescapable fact
that the biological processes constituting ‘mind’ are part of a body which is
constantly interacting with the world. This opens the possibility of new designs and
new ways of looking at and doing things, and lies at the ontological root of how
things change and how we change and how we can change things. Our ability to
exploit opportunities and cope with adversity, mulishness and constraint largely
defines us as persons and contributes to our biographies and life story. But even
material objects can possess biographies. Igor Kopytoff (for instance in Kopytoff and
Appadurai, 1986) drew out how things, inanimate objects, artefacts and commodities have
99
What is homeostasis? Scientific American: Jan 3, 2000
http://www.scientificamerican.com/article/what-is-homeostasis/ [accessed 21/05/2002]

biographies as well, imbued within them socially, as they move and persist through
time and space and circumstances. Coming from the field of cultural studies Johnson
(1986) viewed the diffusion of technologies as a diverse process of semiotics,
meaning-making and representation straddling objective aspects and subjective
beliefs – for instance a house can be a home, an investment opportunity, a
configuration of bricks and other materials, a job, a prison and so forth. Others, such
as Du Gay et al. (1996), have elaborated on this work to explore how meanings
congeal or gather through the transition from early development to diffusion into
consumer markets; they established the notion of a 'cultural circuit' that designed products
travel in their passage to commodification and commercialisation. And the actor-
network perspective of Latour and Callon et al. famously does away with traditional
binaries such as human and technical, replacing them with actors and actants. Freed
of the typical positivist and reductionist obsession of breaking things down into
essential components we can consider social and technical systems and their
development as symbolic and semiotic matters as much as the procurement and
fabrication of nuts and bolts, and wires and screens. We can focus upon outcomes
rather than praise ingenuity and invention. "At its heart, the crucial question is
whether there is ultimately such a thing as the Truth, as opposed to cognitive
constructions creating relationships between coordinates that are always true." (Lent,
2017; p. 353) In other words, facts are not enough. Yes, they often are lacking,
especially in these times of ‘fake news’. But even if science and scientifically-
minded citizens were able to present us with perfectly true facts about the world, our
instinct for patterning would demand that we weave those facts into a web of
meaning.

We can privilege use rather than praise design and its subjects and processes. This is
viewed as perpetuating the development of, and subsequent diffusion of, both ideas
and actual and actualised products between various institutions and actors. 100 More
100
However the du Gay et al. model is a social and culturally-based perspective of products and their
diffusion. It neglects somewhat the concept of individual and pragmatic instances of use. In this
formulation use is somehow sublimated into the concept of consumption. Nevertheless, the elements,
institutions and forces which comprise the model, each plays a distinctive role in shaping one another

recently, Pollock and Williams (2010) have addressed this issue in relation to how
the implacability of generic software packages aimed at firms and their practices
integrate, or not, often causing upheaval and organisational issues. 101 Pollock and
Williams note the relative “boundedness” of tangible commodities as contrasted with
the malleability of software. In their case studies looking at CMS and ERP systems
this is more than a question of the lack of stability in software code and capabilities,
the temporal provisionality of a technological artefact.

Most new products, either of the hardware or software persuasion, even products of
an ephemeral or chiefly artistic nature, such as digital art and video, television
programming and graphics and interface design, are usually designed by
modification to existing products and typically the reconfiguration of materials
and/or components.102 This means that most aspects are in some way familiar.

“How does a designer formulate a role as a change agent and determine a
course of action? To do so means to consider both the past and the present,
which have been embodied or are embodied in concrete activities and
artifacts. From the dialectic of past and present come the situations that
determine the possibilities for the future. To plan effectively in the present
requires a vision of what the future could and should be. I use both the
conditional “could” and the prescriptive “should” to suggest, in the first case,
that the future is always based on the contingency of human choices and, in
the second, to assert that these choices need to be driven by a consideration of
what ought to be done.” (Margolin, 2007; p.5)103

Design according to this idea, then, is always future-orientated as a practice, although it

and driving what has been termed either ‘evolution’ or ‘revolution’ in technology and business
practice, and ultimately culture and the wider society.
101
POLLOCK, N. & WILLIAMS, R. 2010. The Business of Expectations: How Promissory
Organizations Shape Technology and Innovation. Social Studies of Science, 40, 525- 548
102
‘TRIZ’ is derived from the Russian “teorija rezhenija izobretatelskih zadach” which translated
roughly means the “theory of inventive problem solving”. TRIZ holds that innovation is not a random
process, but governed by a number of ‘laws’ (Gadd, 2011) Souchkov, 1997, Savransky, 2000). The
premise is that the understanding and application of these laws shape the evolution of technology. The
theory is significant as it supports the idea that most inventions are in fact ‘recombinations’ or
‘reconfigurations’ of existing principles. Its originator Genrich Altshuller is said to have studied some
400,000 technology patents searching out regularities and basic patterns governing the process of
solving problems. Reification of these principles was viewed as serving as the source of new ideas and
new innovations, and led to the creation of a systematic process for the invention of new systems and
the refinement or innovation of existing ones.
103
Design, the Future and the Human Spirit Victor Margolin Design Issues: Volume 23, Number 3
Summer 2007 pp.4-15

most certainly entails reflection upon all that which exists, and reflexion through
consideration of past and present experiences of things (Johansson-Skoldberg et al.
2013),104 “drawing on a world … that had not yet been realized” (Lepinay, 2007, p.
270).105 This draws firms and even individuals to seek out that which they are
familiar with; in other words, there is a kind of path-dependency.

Recognition of this, consciously or implicitly, features in product development, where
it involves the steady, incremental evolution of an initial design interspersed with the
introduction of new technologies of fabrication and manufacture such as the recent
addition of 3D printers [which draw upon principles from 2D printers], mobile
phones [which draw upon principles from landline phones] with cameras or faster
microprocessors [which relate to those previously developed].106 Others can be
overcome by the exercise of human initiative, ingenuity, learning and persistence,
although not necessarily in that order, but certainly we have to develop and learn the
usefulness of these traits in the design and usage of things. It is not natural or a given
that we produce or use in certain manners, although perhaps in the earlier age of
industrialisation there was more confidence in manufacturing the low-hanging fruits
of the simpler mass-produced commodities such as soap, cutlery, and other relatively
easy to understand everyday items. In today’s diversified markets with information-
intensive products discerning how to better make and design relies on a much
broader set of knowledges than only your product range. In order to explore this in
relationship to design we need to consider the need for designers to be able to work
with a range of other practitioners in interdisciplinary and transdisciplinary contexts,
which means that the designer must become a negotiator of the constraints imposed
on him or her by often conflicting and contradictory arrays of social, economic and
technological agendas. There is an increasing onus upon them to learn to evolve and
104
Johansson-Skoldberg, Ulla, Jil Woodilla, and Mehves Cetinkaya. "Design Thinking: Past, Present
and Possible Futures." Creative and Innovation Management, 2013: 121-146.
105
LEPINAY, V.-A. 2007b. Decoding Finance: Articulation and Liquidity around a Trading
Room. In: MACKENZIE, D., MUNIESA, F. & SIU, L. (eds.) Do Economists Make
Markets? On the Performativity of Economics. Princeton and Oxford: Princeton
University Press
106
Clarkson, P. J., Simons, C., Eckert, C. 2004, Predicting Change Propagation in Complex
Design, Journal of Mechanical Design, 126(5): 788-797. Gerst, M., Eckert, C., Clarkson, J.,
Lindemann, U. 2001, Innovation in the Tension of Change and Reuse, Proceedings of the 13th
International Conference on Engineering Design: Design Research – Theories, Methodologies
and Product Modelling, Professional Engineering Publishing, 371-378

articulate problems and their contexts before they attempt to provide anything
that may constitute a ‘solution’.

Maurits Ertsen (2005, p. 148) argues that as both the context and the issues relevant
to any ‘problem’ are necessarily variable, so each design capability has the potential
to result in new technologies or approaches every time it is employed. However, the
danger is that once a specific apparatus has been evolved, ‘this technology starts to
speak its own “language.” It starts to figure as the latest and thus the best solution for
all sorts of problems’, resulting in the capability becoming ‘frozen’. This then calls
for an endlessly mutable and responsive process, one which constitutes a shift in the
design discourse from the detached and rational concept of humanity, to the more
familiar and inclusive concept of people, and ultimately towards a people-oriented
design. It should be made clear that in such a situation, as Elizabeth Sanders observes
in another chapter, ‘new rules call for new tools’. (op cit, 2005, p. 2)

Over the last 30 or so years there has been growing stress on having the dynamics of
people’s behaviour at the centre of the designer’s activity (users and operators). This
is because devices, and even simpler products such as food packaging in
supermarkets, have become more information-intensive. We can consider the array of
nutritional data which now adorns tins and plastic wrappers. But as ICT devices
claim to meld with everyday life and practices, there is a need for ever more granular
understandings of how people consume, use and manage their time and spaces.
This has called for more opportunities for collecting and analysing system data, as
well as for ethnographic studies and other interpretative social science methods to
explore people’s experience. The role of the designer is increasingly to act as
one who adds or enhances value, as they are much more involved with the strategic
planning stages of projects and the wider analysis of their purpose and effectiveness,
and thus need more knowledge than simply market trends, new software, and
new materials. At the same time, technological and social change is altering the
conditions in which the designer must operate. The broad gamut of science and
technology studies has not only addressed issues to do with supply, production,
design or manufacture, but has also extended towards exploring the manner in

which people learn of, learn to desire, receive, appropriate or use the finished product
(Silverstone). This has extended the discussion of innovation across all the
sociological categories to consider how it is to be fostered at macro levels - i.e.
national and regional systems of innovation (Chris Freeman); at meso-, institutional
or organisational levels (this has been the main thrust of many of the books on the
subject, the authors almost too many to mention but including Drucker,
Christensen, ); and at the personal, domestic or consumer-user level (i.e. Eric Von
Hippel, Roger Silverstone). It can be considered that these latter contributions extend
that which could be seen as simply marketing or consumer research, or user research
(i.e. usability, user- experience etc.). The difference has largely to do with the core
issues of knowledge generation, in particular the ‘why’ and ‘how’ – namely what
does innovation mean at this level, what measures can be successfully
implemented to foster it at this level, and what are the benefits of the particular
outcomes to all those concerned at the various levels. Research helps reduce doubts
and increase certainty. It helps make informed choices. To conceive the best strategy
to confront a complex problem, we need to go beyond existing models and see the
wider picture, working interdisciplinarily and with intelligence. Intelligence (from
the Latin intus-legere, ‘to read inside’) is the ability to find differences between
things that are normally seen only as similar, and to find similarities between things that
are normally seen as only different (Hofstadter). It is the ability to make distinctions and
find connections, particularly between disparate things or apparently isolated events
(Bateson). There are four conditions that must be met to develop useful advanced
research in design: the problem should belong in the design discipline; the methods
used should be a model for the profession; the topic should be socially relevant; and the
process should involve the users. Intelligent research is a way to acquire dependable
and useful information: a way to discover answers to meaningful questions by
applying scientific procedures to attain reliable new knowledge (Ary et al).107
Researchers are permanent doubters.

We may typify research within the firm as falling loosely into three areas of
prime interest. These in turn map conveniently to major areas of innovative
107
Ary, D., Jacobs, L.C., Razavieh, A. & Sorensen, C. (2006). Introduction to Research in Education.
Toronto, Canada: Thomson/Wadsworth

activity, the first of which would be knowledge produced to innovate new products
(radical or breakthrough) or otherwise add to, develop or adapt existing products
(incremental innovation). This would consist of the appropriation, production and
use of knowledge mainly of a scientific or engineering technology nature, i.e. to help
in the incorporation of new materials and components, tolerances, specifications,
malleability, interoperability and so forth in existing or new products and services. A
big decision here is what can be sourced ready-made and ‘off the shelf’, that is,
ready to be incorporated as-is or needing only slight modifications or adjustments.
This contrasts with what will have to be developed more or less from scratch or
from very basic building blocks.

Structure versus agency

One persistent dichotomy within the social sciences and relevant to this preamble is
the binary of structure versus agency. "Structures" are said to be the objective
complexes of social institutions within which people live and act. "Agents" are said
to be human deliberators and choosers who navigate their life plans in an
environment of constraints. The contrast goes back to the founders of sociological
theory, including Marx, Durkheim and Weber; more recent authors on this
trope include Anthony Giddens, Maurice Godelier and Pierre Bourdieu. What we
were saying at the opening of this work is that this debate need not be relegated to
complexes of social institutions. Taking a materialist view, we also live and act
within material environments, and we ourselves are of course more than
psychological and social beings: we are embodied, we are material, we are flesh and blood.
Being an embodied intelligence provides allowances for me to interact with the
world, to, say, watch television (a mainly cognitive practice but still requiring
movement of the eye and hand), but not the unbridled agency that may be implicitly
suggested in certain theoretical approaches (agency-structure, actor-networks). I will
not pretend to be simply a disinterested observer and student of consciousness, but a
participant as well. My entire slant on reality is coloured by my own experiences.
Much in psychology and cognitive science suggests that both the social and
technical aspects of my existence, and others such as my past experience and

memory restricts how I may interact or make sense of what I experience.
“People cannot but be directed at the world around them: they are always
experiencing it and it is the only place where they can realize their existence.
Conversely, the world can only be what it is when humans deal with it and
interpret it. In their interrelation, both the subjectivity of humans and the
objectivity of their world take shape. What people are and what their world is,
is co-determined by the relations and interactions they have with each other.”
Verbeek

Our bodies are our interface to the world and to the social; the physical brain creates
mind, and both have an objective as well as subjective existence. “Being present in this
world means being literate in its structures, signs and symbols – this is as much a social
process as it is biological” (Kathy Hall, 2016). “Western
culture is a mediated culture. Mediums, more than direct personal experience, define
people’s world picture” (Van Der Weel, 2011, p.1).108

Our bodies and sensory gear are our interface to this material world. Our movements,
actions and activity in the world are conceivably of some interest to those who make and
sell things, or those charged with the responsibility of security. Human factors and
ergonomics often refer to the physical limits of what we can do, what we can see,
hear, touch and feel through our sensory apparatus, and to the limits and
manner in which we can move, reach, press, lift and so forth. Ergonomics or human
factors is defined as: “ . . . the scientific discipline concerned with the understanding
of interactions among humans and other elements of a system, and the profession
that applies theory, principles, data and methods to design in order to optimize
human well-being and overall system performance.” 109 Its focus has three main areas
of interest, physical, cognitive and organisational. How far can we see? What range
of tone or sound can we hear? It can also refer to purely cognitive phenomena as
well. This is more diverse and disparate as we are aware that human thought and
ideation is complex and can cover anything from how much we can remember, to the
style and manner in which we make decisions, to what we trust, believe, love, enjoy
and so forth. Warren Sack (1997) argued that:

108
Adrian Van Der Weel (2011) Changing Our Textual Minds. Manchester and New York: Manchester
University Press; distributed exclusively in the USA by Palgrave Macmillan.
109
http://www.iea.cc/whats/index.html

“AI investigations into "human nature" are oftentimes reminiscent of older,
modernist, humanistic, philosophical debates about the same subject. For
example, the philosophy of rationalism has been cited in critiques of
"reasoning" in AI, phenomenology in critiques of AI's conceptions of
"perception" and "the body," and romanticism has been named in defense of
"human emotion" over machines' supposed inability to experience
emotion.”110

It is generally understood in the social sciences that the relationship between
structure and agency is less a binary of opposites than an ever-evolving,
unfolding dialectic. In the simplest sense, it is a dialectic in which agency and structure
each have the ability to influence the other, such that a change in one requires a change in
the other. If structure and agent are considered to be ontologically distinct levels,
then we have a series of difficult questions to confront at this point in the discussion.
For example: Which has causal priority? Are structures determinative of social
outcomes, with agents merely playing their roles within these structures? How do we
account for taste and prejudice? Social relationships are densely intertwined with
reasons, emotions, commitments, beliefs, and attitudes - the aspects of consciousness
that make up agency and action - even though these traits and dispositions are often
ignored for the sake of ‘rational’ or ‘disinterested’ positions which privilege a
‘detached’ scientific and positivistic view. The more recent stance deriving from
reflexive social science is to emphasise the very concrete ways in which each of
these traditions succeeds in identifying the agent, the social actor, as both subjective
and objective.

The technology-society dialectic is another ‘co-shaping’ process brought to attention
in science and technology studies (STS) (i.e. MacKenzie and Wajcman, 1999). The
introduction of new innovations changes our world in some way, sometimes very
noticeably as with the introduction of industrial machinery and concomitant
developments of cities, workers’ homes, schools, working men’s clubs, sewage
systems, labour relations, and other kinds of architectural and institutional entities.
Or it can be quite small and discrete such as a new machine for cutting one’s
toenails. In every case the argument is that no innovation arises in a vacuum, all

110
Design Issues, March 1997.

business and technical innovations are, remember, largely social in nature (i.e. they
must be meaningful and relevant, they are often based upon the ideas and techniques
devised by others, etc.).

The sociology of Ulrich Beck (in his thesis of the risk society; 1992, 2000), Anthony
Giddens (1991), Richard Sennett (1998) and Zygmunt Bauman (2001) identifies
processes of individualization and risk which characterise late modernity and which
have implications for lived experiences and for the ways in which we represent social
divisions, some of which seem almost entrenched. The basis for this change rests
upon more pluralistic present day perspectives on science and technological change,
and the global economy at large. A gross example is that the first industrial revolution
was either a huge success or a massive failure, depending on who you ask. On the
one hand, it led us to a world amply supplied by skyscrapers, airplanes, insurance,
cheap food, and factory-made everything, where those of us in developed countries
get to live long and relatively comfortable lives. On the other hand, it wrecked the
environment and left us dependent on resources and systems that are inherently
unsustainable. ‘Dirty’ industry was moved to locations where labour is cheap and
where health and safety is less rigorous. The archetypal worker in an advanced
economy used to be a man on a production line or a salaryman in a city office — a
secure, yet repressed, cog in a machine. 111
It relies upon a new global underclass that
fulfils the menial factory work once done by the industrialised countries’ underclass, now
that that work has been outsourced overseas. But the latest generation of specialised
labour platforms – the so-called web-driven ‘gig economy’ also raises the spectre of
greater social inequality. We currently live in a world where our mobile phones now
have apps through which providers will park your car (Luxe), buy and deliver your
groceries (Instacart), and get you your drinks (Drizly). These firms suggest a risk that we
might devolve into a society in which the on-demand many end up serving the
privileged few. In many countries, key slices of the social safety net are tied to full-
time employment with a company or the government. As a consequence of these
changes, people have increasingly come to regard the social and economic world as
unpredictable and filled with risks which can only be negotiated on an individual

111
http://www.ft.com/cms/s/2/ab492ffc-3522-11e5-b05b-b01debd57852.html#axzz4LTbXcD00

level, even though chains of human interdependence (Elias, 1978, 1982 in his works
on civilising)112 i.e. manners, civilities, remain intact.

Efficacy researchers (i.e. Bandura, 1997, Rothbaum et al., 1982, Heckhausen and
Schulz, 1995) 113
have emphasised internal processes of the 'acting individual' in
relation to the external environment. An agentic perspective has it that we are not
merely reactive organisms shaped by environmental forces or driven by inner
impulses. People are rather self-organizing, proactive, self-reflective and self-
regulating as times change. What is important here is the capacity of human beings to make
choices in the world and commit to action. All interactions with designed artefacts
entail a risk of losing personal control, with effects dependent on biography and on
material and social situation i.e. whether one is a designer, investor, salesperson,
consumer or user. According to Carol Dweck, individuals can be placed on a
continuum according to their implicit views of where ability comes from. Some
believe their success is based on innate ability; these are said to have a "fixed" theory
of intelligence (fixed mindset). Others, who believe their success is based on hard
work, learning, training and doggedness are said to have a "growth" or an
"incremental" theory of intelligence (growth mindset). If people believe strongly in
something, if they have a strong conviction regarding their abilities, it inspires them
to take on difficult tasks, put themselves in danger, or even support action that may
not be in their immediate best interest. Individual and collective agency may serve to
reaffirm social order by reproducing norms and existing social relationships, or it
may serve to challenge and remake social order by going against the status quo to
create new norms and relationships. More extreme examples here would include
social uprisings such as the Arab spring which serve as potent examples of action

112
Bourdieu extended Elias' concept of habitus to include beliefs and preferences, identifying how
objective social structures are incorporated into subjective, mental experience of agents. In this way
objective and subjective are combined, thus resolving the dilemma of a person being either or both an
object and subject. Elias, N. (1978). The History of Manners. The Civilizing Process: Volume I. New
York: Pantheon Books

Elias, N. (1982). Power and Civility. The Civilizing Process: Volume II. New York: Pantheon Books
113
Bandura, A. (1997). Self-efficacy in changing societies. Cambridge: Cambridge University Press
Rothbaum, F. M., Weisz, J. R., & Snyder S.S. (1982). Changing the world and changing the self: A
two process model of perceived control. Journal of Personality and Social Psychology, 42, 5-37
Heckhausen, J., & Schulz, R. (1995). A life-span theory of control. Psychological Review, 102, 284-
304.

driven by a belief that was shared across otherwise disparate groups even in the face
of violence and disruption to the homeostasis of everyday life. The power of shared
beliefs to drive behaviour is certainly not restricted to politics or religion, which tend
to carry very strong, if not radical beliefs and views ‒ people also share, or express,
beliefs through their choice of brand, their ‘likes’ on social media, or their ability to
cope with the poor usability or lack of ‘fit’ of certain devices, interfaces and services.

We are also more than psychological and social, we are embodied and we learn
things through tactile manipulation as much as we do by watching, conversing and
listening to others. In the late 17th century, the British philosopher John Locke
noticed a friend's children playing with wooden blocks inscribed with letters of the
alphabet. The oldest child, Locke discovered, had learned to spell by simply playing
with the blocks. Intrigued, Locke described the blocks in Some Thoughts Concerning
Education (1693). ‘Locke’s blocks’, as they became known, were the first toys
designed to draw out a child’s abilities and interests. There has since been a
succession of such toys. As children many of us played with Lego or other creative
play systems, variations including Meccano or Connex. These kits, like alphabet
building blocks are finite sets of bits and pieces most of which can connect together
to form wholes in some way. Construction play encourages children to problem-
solve, connect socially, understand and be inquisitive in their play. Such play can be
individual or social. They learn through physical manipulation regarding the
attributes of different materials, they engage their curiosity and imagination, and
experiment with applying concepts and exploring new ways to use physical
materials. They share knowledge and can influence, or even inspire the behaviour of
others. This play also highlights certain cognitive traits: it could show at an early
age that some of us were sticklers for rules; we followed the instructions and put
effort into making an almost identical copy to the picture on the box. When we did
this we felt a sense of accomplishment, similar to those who successfully complete
jig-saw puzzles, or recite poems or play written passages of music precisely. There
will always be a premium in this world given to accurate emulation or recitation. 114
114
This is the basis for the use of evocative objects and physical teaching materials in Montessori
education. The typical technique involves the teacher showing the child how to stack or put together
blocks – the child then emulates and explores the toy. Lessons in subject domains such as maths are
picked up not abstractly or symbolically but tactilely as well. So rather than saying two plus two is

Meanwhile some of us digressed and chose to make something different with the
pieces given. We moved and assembled according to what we considered our own rules
or maps, or we simply connected bits, making it up piecemeal as we went along.
Some of us struggled, some of us were inspired, some of us gave up, and some
of us were persistent and patient. Some of what we built was better than others for
the purpose at hand.

If what the device, interface, or service offers or does outweighs the problems of using
it, if it is something that we cannot obtain or find easily elsewhere, we will likely
persist, accommodate or cope with its shortcomings. Carol Dweck notes that
individuals may not necessarily be aware of their own mindset, but their mindset can
still be discerned based on their behaviour. It is especially evident in their reaction to
failure. Fixed-mindset individuals dread failure because it is a negative statement on
their basic abilities, while growth mindset individuals don't mind or fear failure as
much because they realize their performance can be improved and learning comes
from failure. These two mindsets play an important role in all aspects of a person's
life. Dweck argues that the growth mindset will allow a person to live a less stressful
and more successful life. In the Lego example some of us took pride in our ability to
emulate and some of us took pride in our ability to innovate.

Some of us realised that our own creations were not of the calibre aesthetically or
functionally of the design laid out in the plans or detailed in the box, but we used our
imagination nevertheless to create something we would not only be happy to show
off, but perhaps more importantly were able and happy to play with. The idea of
showing off is to some extent linked to the notion of curation, a term now applied to
bricolage of web sites, ideas and material artefacts which essentially define our taste.
“Under neoliberalism, every individual is his own capitalist, his own world-maker.
“Freedom” isn’t security in a just society, but the ability to shop - for a healthcare
plan on market exchanges, for primary schooling, for stocks in your retirement plan
(if you’re lucky enough to have one of those). We’re all masters of our tiny, curated

four, or 2 + 2 = 4, or showing how a picture of two black swans and two white swans equals a picture
of four swans together, the child may move two counters from a section of a box marked ‘2’ then
another two to place them in a section marked ‘4’.

realms.”115 This highlights the social and intrinsic nature of both making and using. It
may be wrong to peg a child as a follower or innovator by this behaviour, but it is
considered a good thing to learn resilience, that is, the ability to keep going, the
ability to successfully adapt to life tasks in the face of social disadvantage or highly
adverse conditions, and to invoke intrinsic motivation and a sense of contentment.
Richard Sennett refers to this in The Craftsman (2008) where he acknowledges a
basic human impulse to do a job well for its own sake. In some ways it links to
Csikszentmihalyi’s notion of flow: a psychological state which occurs when we are
immersed in an activity and lose all sense of time. 116 Although the use of craftsman
does suggest a way of life that waned with the advent of industrial society, Sennett
put forward that the craftsman’s realm is far broader than skilled manual labour;
the computer programmer, the doctor, the parent, and the citizen need to learn the
values of good craftsmanship today.

We may even take solace that others around us cope or persist as well when they
handle difficult and unwieldy systems. Just because mobile phones run out of battery
power, signal or credit, and perhaps at the wrong moment, we do not discard them
and throw them away (except perhaps in a fit of rage!). We act in the manner that an
amputee or a blind person learns gradually to adapt or substitute, we tolerate or we
do not, we get lost in the minutiae of everyday life and its conductance, or we don’t.
In fact this state of affairs paves the way for entrepreneurs and innovators to seize
upon weaknesses in already existing products and create new ones which better align
with our needs and wants, or operate more cheaply, faster, more quietly, more safely or
more loudly, or come in smaller sizes or better styles. That is, in a sense, designs which
facilitate or operate invisibly (i.e. see Norman, 1998, in his discussion of the ‘invisible
computer’).117

The aspects of things that are most important for us are hidden because of
their simplicity and familiarity. (One is unable to notice something – because
it is always before one’s eyes.) The real foundations of his enquiry do not
115
https://newrepublic.com/article/122589/when-did-we-all-become-curators
116
Csikszentmihalyi M. (1991). Flow: The psychology of optimal experience. New York:
Harper & Row
117
Norman, D. A. (in press: Fall, 1998). The Invisible Computer. Cambridge, MA: MIT
Press.

strike a man at all. Unless that fact has at some time struck him. - And this
means: we fail to be struck by what, once seen, is most striking and most
powerful. (Wittgenstein, 1968, p. 50)118

We may dispose of our mobile phone if one which never runs out of power, signal or
credit is produced, or one that even has better sound or camera, or style.119

The original position of rational choice theorists such as Goldthorpe (1998) has
emphasised the overriding importance of analysing the conditions or contexts under
which actors come to act from the sociological perspective. 120 But we should add to
this the technical and mediated allowances and affordances which characterise the
modern landscape of everyday life and living. All of these feature in any complete
story of innovation, and in instances of the use of new innovations.

In Lego and other creative children’s toy systems like Meccano or Connex there is a
necessary [designed?] kind of slack in our interaction with the structures and
components which has long made for diversity in what we can do and what we can
achieve every day. In a sense they are like an open question in social science
interviewing or surveys. The idea is to capture as naturally as possible the voice of the
recipient, their purview or opinion, unimpeded by any forecasted reply on behalf of
the researcher. For most of us misfit or precision is not important or required, but
coping with it, or an ability to cope with it would seem to vary between individuals.
Sarah Lewis (2014) notes that: “discoveries, innovations, and creative endeavors
often, perhaps even only, come from uncommon ground” and why this “improbable

118
Wittgenstein, L. (1968). Philosophical Investigations [G. E. M. Anscombe, Trans.]. Oxford,
England: Basil Blackwell.
119
This whole issue of difficult or unwieldy systems is exacerbated as government services including
state benefits, tax, identity and nationality, employment become virtual. It is seductive to employ the
kinds of cost saving that occur by virtualising these administrative processes; however, the risk of
something going wrong, or of the ‘off button’ being switched, becomes greater. Societies need to ensure
that the greatest demands to 'take control of their lives' do not fall on those who are the least
powerfully placed in the ‘social landscape’ they inhabit. Agentic:

1) A social cognition theory proposed by Stanford University psychologist Albert Bandura that views
people as agents of change: “We see the world as agents of change. We believe that we have choice over
our actions and we strive to enable others to make informed, responsible decisions online.”

120
Goldthorpe, J. H. (1998). Rational action theory for sociology. British Journal of Sociology, 49(2),
167-192.

ground of creative endeavor”121 is an enormous source of advantages on the path to
self-actualization and fulfillment, brought to life through a tapestry of tribulations
turned triumphs by such diverse modern heroes as legendary polar explorer Captain
Scott, dance icon Paul Taylor, and pioneering social reformer Frederick Douglass.
Just like some people’s preference for 8-bit music or images, or a preference for the
vivid expressionistic and non-naturalistic use of colour and shapes by artists
embracing art styles such as impressionism, not all that is hi-fi, hi-resolution,
photorealistic, not all that perfectly fits, is appreciated. Bourdieu's (1993) emphasis
on social reproduction foregrounds the subjectivities of the acting individual and explores
agency in relation to 'habitus' and 'field'. The underpinning ontology of lifelong
learning is that there is always something to be learned at any or all life stages and
this applies to devices and services, there is always room for improvements and the
accommodation of new knowledge. This also enters current discussion regarding
participatory culture and how ordinary persons create as well as consume; Ida Hattemer-Higgins
describes beautifully the simultaneous creation and consumption of the “curated” self
on social media: “Through Facebook, I had what one might call in Lacanian terms, a
late-onset mirror stage. As my own spin doctor and publicist as well as the single
most important consumer of the brand I was trying to launch, I bought into
myself.”122

We accept that neither "structure" nor "action" can occur without the other as both
happen simultaneously. Nevertheless the Lego example does place stress causally on
the role of the maker, their already-held beliefs and knowledge of the material, even
though the user is, to paraphrase Paul Dourish (2001), “where the action is.” In a
sense, however, in ideation and early prototyping the user is not typically where the
“action is”; it is the designer or innovator themselves who is where the “action is”.
And this is very concrete in terms of geographical location [workshop, design studio
or lab] or in terms of who is using [typically they themselves are the only users].

121
Sarah Lewis (2014) The Rise: Creativity, the Gift of Failure, and the Search for Mastery. Simon & Schuster.
122
https://nplusonemag.com/online-only/online-only/facebook-ade/ [For Lacan, the mirror stage
establishes the ego as fundamentally dependent upon external objects, on an other. As the so-called
"individual" matures and enters into social relations through language, this "other" will be elaborated
within social and linguistic frameworks that will give each subject's personality (and his or her
neuroses and other psychic disturbances) its particular characteristics.]

This is because the conundrum of production and use resolves in the fact that most
artefacts, devices or services are designed teleologically, for others to use in the future.
All effort, from ideation through prototyping, design, fabrication and manufacturing,
marketing and advertising, shapes what is possible and acceptable to use in the first
place, and does so at every juncture, from the discussion of new projects and prospects right
the way through to manufactured, shipped and sold products.

But whatever was designed and produced also reflected the makers’ style of working and
other personal traits, and their reflexive self-awareness of these. Margaret Archer argues
using Jean Piaget and Maurice Merleau-Ponty that, even before the acquisition of
language and even independently of it, the differentiation of the self from the world
occurs through the embodied engagement with the world. In design this would refer
to the tactile and cognitive expertise of working with components, code, soldering,
screwing, fixing etc. Expertise and mastery begins in infancy and continues over
one’s life. Just like most other traits there is a bit of both sides of the dichotomy –
designer and producer and consumer and user - in most conditions of use, whether
we are experimenter and/or rule follower. As maker or customer, for example, we
expect robust, easy-to-understand and easy-to-use devices and services. We expect that there
is a solution to the problem or puzzle. Now, returning momentarily to our discussion
of agency, structure and efficacy, how does the rule follower proceed when a piece is
missing, or when robustness and ease of understanding and use are challenged?

As a maker, you have to be absolutely sure, if the goal is to build the whole
model, that all the pieces are there to begin with. In school, everyone taking the test
should have been exposed to the essential knowledge through personal life experience
(attending the class) or through information made available in the room or in the test
paper. But being obvious does not make it knowable. The thrill of such a chase rests upon
knowing that there is an obtainable solution, that the gaps or weaknesses in a
technology system, what Thomas P. Hughes (1983) termed reverse salients, can not
only be fixed but perhaps more importantly, identified in the first place. Hughes
defined the term 'reverse salient' as having its roots in the military where it referred
to a section of an advancing military force that had fallen behind the rest of an attack.

Typically, this section was referred to as the weakest point of the attack, a lagging
element that could prevent the rest of the force from accomplishing its mission. Until
a reverse salient could be corrected, the force's progress came to a halt. Reverse
salients, then, should act as a focus for innovation activity for new system builders,
and for those who seek to think holistically about what they and their partners are creating.
Nicholas Carr writes about the phenomenon of the reverse salient in The Weakest
Link. 123 It can be applied to business strategy as much as technology system building.
Simply put, if you want to succeed in business the identification of a product's
vulnerabilities can point to new business opportunities and markets.

When a solution stops being obtainable, or is not picked up or recognised by the
developers, makers or system builders, the policy makers or exam writers, it becomes
a failed event and highly unentertaining. The designer, the teacher, the government
policy and authority become an idiot and a bully. Too many novice designers,
teachers, and newly appointed officials are on an ego trip to see people more
intelligent than themselves get frustrated over a puzzle they created. Otherwise a
kind of myopia begins with respect to weaknesses in design:
“As people become accustomed to a particular product or process, they often
begin to take its flaws for granted - and hence become blind to the possibility
for improvement. That’s especially true of people who had a hand in creating
a prevailing system and thus have a direct stake in its perpetuation.”124

This means that designers have to exhibit, if not ‘force’, if they can, a conscious
awareness of how their device or service will be encountered by real people in
real-world situations, and address this in design where possible, necessary or appropriate.
Alternatively, managers need ‘third parties’ to check the design for usability and
correct functioning. It means benchmarking it against traditional or emerging
conventions so that they can adapt their innovations to at least some of what will be
familiar and recognisable to the new and novice user while also permitting seamless
use by more experienced users. This mitigates the risk of expensive manufacturing
recalls. Changes and incorporations of innovations always create an increased

123
The Weakest Link - A product’s vulnerabilities can point the way to lucrative new business
opportunities. November 30, 2006 / Winter 2006 / Issue 45 by Nicholas G. Carr http://www.strategy-
business.com/article/06403?_ref=http://www.roughtype.com/%3fp=603&gko=8b829-1876-20606092
124
ibid

potential for failure in the design. 125 The failures can affect the reliability and availability
of a product and can cause profit loss to both manufacturer and user. This is
particularly true in some industries more than others. For instance, in the automobile
industry many studies have shown that, besides financial harm, disclosure of product
defects (such as recalls) can damage the automakers’ reputation 126, with consequent
losses in stock market
valuation [8] [9] and product sales [8] [10]127. Thus the need for an objective,
‘third-party’ set of quality and functional systems and teams which can check that a
product is up to general industry standards and works and functions as intended.
In the software industries there has long existed the practice of releasing beta versions to
the public. Users then use the software and feed back to the developers any faults or
defects in expected performance.
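
As an illustration of the kind of structured, 'third-party' checking described above, here is a minimal sketch of the risk-priority scoring conventionally used in failure modes and effects analysis (FMEA, the technique cited in footnote 125); the failure modes and scores below are invented for illustration only.

    # Minimal FMEA-style sketch: each failure mode is scored 1-10 for severity,
    # occurrence and detection; the risk priority number (RPN) is their product,
    # and the highest-RPN modes are reviewed and designed out first.
    # All entries below are invented.
    failure_modes = [
        {"mode": "battery drains prematurely",       "severity": 6, "occurrence": 5, "detection": 4},
        {"mode": "touchscreen unresponsive in cold", "severity": 7, "occurrence": 3, "detection": 6},
        {"mode": "casing cracks when dropped",       "severity": 8, "occurrence": 4, "detection": 2},
    ]

    # Compute the RPN for each mode.
    for fm in failure_modes:
        fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

    # Review the riskiest modes first.
    for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
        print(f"{fm['rpn']:>4}  {fm['mode']}")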

A design theory of constraints

Often we consider ‘constraints’ to limit possibilities and therefore, by extension,
creativity and opportunities to innovate. "If design can be considered "the conception
and planning of the artificial" then its scope and boundaries are intimately entwined
with our understanding of the artificial's limits." -- Victor Margolin 128 However, like
the rules of a card game, which lend it its structure, character and identity,
constraints can stimulate creativity and are key to understanding complexity
(Kamarudin et al. 2016) 129.

But while it may be that the number of things we can do as individuals is potentially
125
Chao, L. P., Ishii, K. 2007, Design Process Error Proofing: Failure Modes and Effects
Analysis of the Design Process, Journal of Mechanical Design, 129(5): 491-551.
126
Rhee, M., Haunschild, P. R. 2006, The Liability of Good Reputation: A Study of Product
Recalls in the U.S. Automobile Industry, Organization Science, 17(1): 101-117. [8] Bates, H.,
Holweg, M., Lewis, M., Oliver, N. 2007, Motor vehicle recalls: Trends, patterns and emerging
issues, Omega, 35(2): 202-210.
127
Barber, B. M., Darrough, M. N. 1996, Product Reliability and Firm Value: The Experience
of American and Japanese Automakers, 1973-1992, Journal of Political Economy, 104(5): 1084-
1099. [10] Haunschild, P. R., Rhee, M. 2004, The Role of Volition in Organizational
Learning: The Case of Automotive Product Recalls, Management Science, 50(11): 1545-1560
128
Victor Margolin, "The Politics of the Artificial." Leonardo 28 (5 1995): 349-356.
129
Khairul Manami Kamarudin, Keith Ridgway, Mohd Roshdi Hassan, Modelling Constraints in the
Conceptual Design Process with TRIZ and F3, Procedia CIRP, Volume 39, 2016, Pages 3-8, ISSN
2212-8271, http://dx.doi.org/10.1016/j.procir.2016.01.034.

infinite, it often appears to makers that the number of building blocks is finite, as is
the way they can connect or be connected in order to copy something or make
something new. At the most gross level this could refer to atoms, or compounds and metals,
of which the world has a finite supply. And this is true when it comes to operating
and exploring the navigation possibilities in interactive forms of television or media,
or playing around with an application or a machine.

“Designing is seen as the exploration of alternative sets of constraints and of
the regions of alternative solutions they bound. Designers with different
objectives reach different solutions within the same set of constraints, as do
designers with the same objectives operating under different constraints.
Constraints represent design rules, relations, conventions, and natural laws to
be maintained. Some constraints and objectives are given at the outset of a
design but many more are adopted along the way. Varying the constraints and
the objectives is part of the design process.” (Gross, 1986) 130
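
A minimal sketch of this view of designing, assuming an invented design space and constraint set (it illustrates the idea rather than reproducing Gross's own formalism): candidate designs are filtered by whichever constraints are currently adopted, and varying the active constraints changes the region of acceptable solutions.

    # Illustrative sketch: design as constraint exploration.
    from itertools import product

    # Hypothetical design space: enclosure width/height in mm, and material.
    candidates = [
        {"width": w, "height": h, "material": m}
        for w, h, m in product(range(40, 121, 20), range(40, 121, 20), ["ABS", "aluminium"])
    ]

    # Constraints as named predicates; some are given at the outset, others adopted along the way.
    constraints = {
        "fits_pocket":  lambda d: d["width"] <= 80 and d["height"] <= 100,
        "rigid_enough": lambda d: d["material"] == "aluminium" or d["height"] <= 80,
        "low_cost":     lambda d: d["material"] == "ABS",
    }

    def solution_region(active):
        """Return the candidates that satisfy every currently active constraint."""
        return [d for d in candidates if all(constraints[name](d) for name in active)]

    # Different designers, different constraint sets, different solution regions.
    print(len(solution_region(["fits_pocket", "rigid_enough"])))
    print(len(solution_region(["fits_pocket", "rigid_enough", "low_cost"])))

Adding or relaxing a constraint here shrinks or widens the printed solution counts, which is one way of reading the 'varying the constraints' that Gross describes.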

As a developer, product designer or manufacturer, it makes simple economic sense to
recycle that which you have developed already: to repackage what has already been
produced, albeit, say, in a more attractive manner, or one considered more
fashionable.131 This has long held commercial sway in a design-orientated world. A
‘university of creative technology’, a business and design school I worked for in
Asia, had a demonstration gallery showing how agricultural produce such as rice,
sold loosely in sacks in traditional-style markets, increases greatly in value as it
comes to be neatly packaged in designer cardboard boxes which most of us would
recognise as familiar on modern supermarket shelves. How things look and are
presented, even where they are placed on supermarket shelving, are now the subject
of exhaustive studies which draw upon neuro-imaging and painstaking analysis
of observations of consumer behaviour.132 They shift emphasis from quantity and
quality, to ‘look’. Design and use of designed artefacts and products is a way of
reasoning and making sense of things that is ultimately involved in the creation of
meaning, as much as it is an analytical tool or practical method for solving problems.

130
Gross, M. (1986) "Design as exploring constraints," Unpublished MIT PhD dissertation
http://hdl.handle.net/1721.1/15036 25/01/2010
131
Consumer electronics have a kind of fashion which saw a proliferation of metal or chrome casing
replaced by matt black, ornate versus minimal and so forth.
132
The way the brain buys Retailers are making breakthroughs in understanding their customers’
minds. Here is what they know about you Dec 18th 2008 http://www.economist.com/node/12792420

Just as limitations and allowances provide the physical constraint to making in terms
of quality and quantity, they also characterise products in every way as much as
intentional design features including ‘look’. Constraints of capacity, tolerance and so
forth provide an important frame to the exigencies of design and production, as do
economic and strategic elements, and of course to consumption and use. But if we
take a generic view of the automobile, we can see a wide variety of ‘looks’: different
cars all powered by an internal combustion engine. Again, for the business person,
creating the next new thing with the minimum of investment in labour and new
components - just a new ‘look’ - is the ideal situation. It is a golden loom.

There has been something of a moral crusade since the age of Silicon Valley neo-
liberalism regarding individualism and individual expression - that we develop our
own styles, our own means of expressing ourselves. It would seem that to do
anything else reeks of plagiarism and copying and betrays the God-given freedoms
with which we have been endowed.133 But regardless of the evanescent battle cries for
more creativity and innovation, each of us, rule follower or game changer, has met
the hard limits of both the quality and quantity of the pieces, or of the opportunities we
have had to hand. In some cases our hands are tied not only by economic
considerations, but by regulatory concerns and licencing, although ‘open source’ and
‘participative culture’ have made significant inroads into opening development to a
much wider cadre of persons than ever before.134 Digitalisation and ready availability
of digital tools and online applications have also contributed significantly to lowering
barriers to entry. Online courses in many subjects should lay the foundation for a
more equitable society in terms of cultural capital if not perhaps directly in terms of
income. This situation has even impacted jobs with the notion of the

I propose that design opportunity lies in the slack. In the way in which current
devices, services and ways of doing business do not really fit individuals, are not
really customisable, are not really smart. They still aim to ‘satisfy’ homogenised
mass markets, largely comprised of homogenised individuals. Their aim is to create a generic
133
Some have suggested this is the influence of Ayn Rand
134
However the ‘sharing economy’ has been hit by more truculent realities as of late.
http://www.salon.com/2016/03/27/good_riddance_gig_economy_uber_ayn_rand_and_the_awesome_
collapse_of_silicon_valleys_dream_of_destroying_your_job/#topOfPage3

opportunity, i.e. to satisfy the need for transport, the need for communication
over distance, and the need to power machinery and control mechanisms
which aim to satisfy even more granular needs and requirements, or even shifting
needs, including those for health, wealth, food, shelter and entertainment. The
tendency today is clear - to consider perpetual optimisation of business processes,
logistics and interface design. Continuous improvement – what quality circles in
Japanese manufacturing termed Kaizen - has a legacy as a management concept
which suggests that all members of an organisation consider from their working
perspective how things can be positively developed. The onus is to compete upon
quality and not just quantity. In a sense, during a time when there were no ‘internet
of things’ or smart sensors, it essentially used employees as ‘intelligent sensors’ or
‘probes’ to report upon where, and where relevant how, processes or materials could be
improved. In a sense they were an in-firm facility which could be viewed as a
correlate of the antenna shops used in technical market research (TMR), another
Japanese organisational innovation premised on the observation that customers frequently need
assistance in knowing what they want or might need, especially since they are unaware of the
array of new products that are still in the R & D labs. The prevailing view was that if
prospective customers could ‘see it and experience it’, they could then better decide
if they needed the new product. Thus, creating the experience helps to create the
need. Towards this end, technical-market research is an innovative, technology-
driven process to understand the mind of the consumer, so that the ‘voice of the
customer’ can be better translated into new products by designers, engineers, and
marketers. The use of planning processes such as technical-market research and
‘antenna shops’ were aimed to encourage innovative, collaborative ways for R & D
organizations to become a higher value-added resource in the customer satisfaction
process, resulting in more timely actions with fewer ‘false starts’.135

The prevailing view currently is that ‘big data’ is the means through which to
uncover hidden patterns which can guide a plethora of design and strategic decisions
with the ultimate aim of increasing sales, customer satisfaction and profitability. It
follows the general path laid down by the special sciences as they attempted to utilise
135
Antonio S. Lauglaug, (1992) "Why Technical‐Market Research?", Journal of Business Strategy,
Vol. 13 Iss: 5, pp.26 - 35

techniques from the natural sciences. According to Herbert Simon: “The central task
of a natural science is to make the wonderful commonplace: to show that complexity,
correctly viewed, is only a mask for simplicity.” (Simon, 1987; p.1) The real world
of choices and movements has long seemed chaotic, but once recorded it can now be
smoothed by statistical sampling and analysis of button pressings and navigation, and of the
billions of sensors in the internet of things. Where data sampling does not occur,
other techniques, some of which may now be considered traditional, i.e. focus
groups, and more sophisticated and deep methods drawn from the
social sciences such as ethnography, will fill in the gaps. This is the virtue and place
of qualitative exploration of experience: exploring the subjectivity of the user or
consumer, or ‘getting closer’ to them, to more fully understand what can be offered or
how existing goods and services can be improved.
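
By way of illustration, a minimal sketch, assuming an invented event log and field names, of the kind of pattern-finding such analysis involves: aggregating recorded navigation events to see where users' sessions most often end, which may hint at a poor 'fit' between a design and its users.

    # Illustrative sketch only: hypothetical clickstream data.
    from collections import Counter

    # Hypothetical event log: (user_id, screen, action), in time order.
    events = [
        ("u1", "home", "open"), ("u1", "search", "open"), ("u1", "results", "open"),
        ("u2", "home", "open"), ("u2", "search", "open"), ("u2", "search", "abandon"),
        ("u3", "home", "open"), ("u3", "search", "open"), ("u3", "search", "abandon"),
    ]

    # Record the last screen seen in each user's session.
    last_screen = {}
    for user, screen, _action in events:
        last_screen[user] = screen

    # Screens on which sessions most often end; repeated exits on one screen
    # suggest a candidate design weakness worth qualitative follow-up.
    exit_points = Counter(last_screen.values())
    print(exit_points.most_common())   # -> [('search', 2), ('results', 1)]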

And more than this, building a model house with Lego did not have the resolution of
a craft model or the kinds of detailed miniaturised model that we see in architects’
offices. Whatever you build is ‘low-fi’ ‘blocky’, ‘approximate’, ‘8-bit’ - it is not a
detailed perfectly scaled representation, but more like a grainy or pixelated image.
But just as, when one tries to make sense of a degraded or imperfect image, a
Rorschach blot, one’s mind and perception accommodate and compensate
for the lack of detail, gestalt or coherence, so it is when we consider the
shortcomings of all sorts of everyday designs, whether devices, interfaces and
prototypes, or even the conditions and situations in which they will be used. There
is a kind of slack which is granted, a misfit, an ability to adjust or compensate or
accommodate for design flaws or weaknesses in function and operation. This human
factor is important as it is the space in which innovation can happen. The difficulty
arises when the system becomes over rationalised, over produced, over designed. We
are familiar with the term social engineering to refer to an act of psychological
manipulation. For instance the creation of a simulacrum of ‘free choice’ which is in
fact fostered, created and managed for the purposes of sales, advertising, and
achieving ulterior motives or ends.

The idea owes much to the development of sociology as it turned to scientific theory

at the turn of the 19th to 20th century. It was expressed by German sociologist
Ferdinand Tönnies in his study The Present Problems of Social Structure, where he
proposed that society can no longer operate successfully using outmoded methods of
social management. To achieve the best outcomes, all conclusions and decisions
must use the most advanced techniques and include reliable statistical data, which
can be applied to a social system. According to this, social engineering is a data-
based scientific system used to develop a sustainable design so as to achieve the
intelligent management of Earth’s resources and human capital with the highest
levels of freedom, prosperity, and happiness within a population.

But the things we consume and use are not only social in nature; they are not only soft
ideas and concepts, they have shape, they have mass, they have features and
characteristics. Of course removing the social element from Facebook or the phone
network would render it anodyne and inert, useless, but using also involves actions
and doing things in a characteristic manner. If we follow this proposition then the
physical constraints and limits of things not only affect design and production, but
also consumption and use at all levels and stages. In a gross sense scarcity refers to
the basic economic problem: the gap between the limited – that is, scarce – resources of
designers and producers, in terms of expertise, manpower, technology or even raw
materials, energy and components, and the theoretically limitless wants of consumers.
As consumer and user we may have a clear idea of what we want, but it is simply not
possible, not available, or not affordable at the present time. There is a paradox inherent
here. If it is the case that the rarer and more uncommon a thing, the more people want it, what
happens in the case of a new, radical innovation that hardly anybody, bar a few
‘lead users’, knows about, courtesy of a blog written by someone else ‘in the know’?
We know the familiar examples, such as the holiday season frenzies over the latest
faddish toy or the freezing urban campers waiting overnight for the latest iPhone.

We even know the bargains in the post-Christmas sales which can include that
faddish toy or iPhone. But this situation requires that firms, designers and producers
and all others involved with making things, not only allocate resources efficiently in
order to satisfy basic needs and as many additional wants as possible, but that

marketers, advertisers and public relations also perform their role in working out the
best possible means and channel through which to introduce the product and make
the product known. Whether in music or movies or television or books, digital
technology has given artists the tools to strike out on their own, enabled audiences to
avoid paying for anything they don’t want to pay for and denied media companies
the ability to control audience behaviour.

Debates over fundamental meanings, essential definitions, or connections between
seemingly unrelated phenomena are the elements of cognition that persist where
consciousness challenges concrete reality and vice versa.

The persistence of needs and issues and the impetus to make

Instead of ideas today for the large part we have "issues," a discursive mechanism
which is leveraged to convince us that things never really change, perhaps like the
basic human need for shelter, procreation, water and food. Basic needs are always
there. In rich affluent societies they may be buried under a mountain of other
interests, issues and needs and wants, but the basic needs are still there. The digital
society is built upon the industrial society, which is built upon the agricultural
society and so forth. But because an issue has two sides to it, both sides will still be
there whichever one prevails. When Archimedes said, "Give me a lever that is long
enough, and I will move the world," he was talking about how you can think and feel
your way into a new actuality. Limited by our imagination, beliefs, attitudes, our
formal and informal education, our momentum, our political will and ability to act
or do, and of course the base material concerns suggested above, the impetus to make is fuelled by what
we desire: we consume, we collect and create, perhaps in that order. Do designers
design things, objects, artefacts or devices that they themselves would not desire or
buy? The answer is of course yes. They can work for business owners and clients. Do
producers of goods and services make things that they themselves would not, could
not, buy or use themselves? Do consumers buy things that they themselves did not
design? Again, banal as it seems, the answer is yes. I state the obvious here to
emphasise that the relationship between design and production and consumption and

use is not symmetrical. However it is also a truism that designers and producers of
one type of product or service are themselves also consumers and users of other
design products, even similar design products [i.e. cars] made by others in some
other place. Japanese cars do not look oriental, and like cars arising from other
locales they have taken on a more generic, global, universal appearance. This is of
course at some level the effect of globalisation and globalised markets. It can be
difficult today to judge the origin country of manufacture based upon appearance,
taste or even style of a given product. Greece may remain the largest consumer of the
traditional Greek delicacy of Feta cheese. However, other European countries -
including Denmark, France, and Germany – far surpass Greece in worldwide Feta
cheese product exports.136

Our imagination and ability to act is shaped and linked, constrained, by our on-going
interaction with existing everyday objects, objects of desire, institutions, various
media and information and communications technologies (ICTs) and their messages.
It is also shaped by everyday interaction with other persons, their products and their
institutions, the physical and its affordances, and of course the built and natural
environments and forces that we all must encounter. The products of other people
and institutions can be viewed as communications of various sorts, and they can take
vastly different forms – physical artefacts, prepared and built environments, physical
media devices, language in its various modes, or they can be responses, actions and
behaviours. Most broadly communications can be anything from a phone call from
your mother, to a brand logo or a symbol, to a bill being presented in a restaurant or on
a web page, to the building layout we currently inhabit, to the manner in which you
walk, or to what you make of how another is speaking to you. All of this is seamlessly
woven together to create not only our worldview, Lebenswelt, but also the terms and
conditions of life.

For Jürgen Habermas, communicative action is governed by practical rationality –
ideas of social importance are mediated through the process of linguistic
communication, and according to the rules of practical rationality. The subject

136
http://www.wipo.int/ipadvantage/en/details.jsp?id=5578

matter of practical rationality is mainly concerned with the present, in particular with
action; it can be abbreviated to concerns such as ‘what are we to do?’
Comparatively, theoretical rationality is more to do with the past and future, with
explanation and prediction.137 Technical rationality governs systems of
instrumentality, like industries, or on a larger scale, the capitalist economy or the
democratic political government. Ideas of instrumental importance to a system are
mediated according to the rules of that system (the most obvious example is the
capitalist economy's use of currency). Related ideas would include the semiotic
theories of Uexküll and Sebeok’s umwelt ("environment" or "surroundings") referred
to as the "biological foundations that lie at the very epicentre of the study of both
communication and signification in the human [and non-human] animal". This term
is usually translated as "self-centred world". Uexküll theorised that organisms can
have different umwelten, even though they share the same environment. This would
bolster notions of individual and social realities. The world as immediately or
directly experienced in the subjectivity of everyday life, can be sharply distinguished
from the objective “worlds” suggested in the hard sciences and engineering, which
employ the methods of the mathematical sciences of nature; although these sciences
originate in the life-world, they are not those of everyday life. One other concept,
central to interpretative social science and with a pedigree going back to Kant, Hegel
and others is worldview. These complex meshes of cultural influences lend us
comparisons of religious, philosophical or scientific worldviews, discourses and
arguments. Worldviews start from different presuppositions and cognitive values.
Clément Vidal has proposed metaphilosophical criteria for the comparison of
worldviews, classifying them in three broad categories:

 objective: objective consistency, scientificity, scope
 subjective: subjective consistency, personal utility, emotionality
 intersubjective: intersubjective consistency, collective utility, narrativity

In the modernist perspective we can see the march towards the objective, the proven,
the rational, the societal, the mass, the homogenised at the expense of the subjective, the

137
Wallace, R. Jay, "Practical Reason", The Stanford Encyclopedia of Philosophy (Summer 2014
Edition), Edward N. Zalta (ed.) [source: http://plato.stanford.edu/archives/sum2014/entries/practical-
reason/]

diverse, the emotional, the self, the individual. While Leo Apostel and his followers
clearly hold that individuals can construct worldviews, other writers regard
worldviews as operating at a community level, or in some unconscious way. For
instance, if one's worldview is fixed by one's language, as according to a strong
version of the contentious Sapir–Whorf hypothesis, one would have to learn or
invent a new language in order to construct a new worldview. According to Apostel,
a worldview is an ontology, or a descriptive model of the world. It comprises six
elements:

 An explanation of the world
 A futurology, answering the question "Where are we heading?"
 Values, answers to ethical questions: "What should we do?"
 A praxeology, or methodology, or theory of action: "How should we attain
our goals?"
 An epistemology, or theory of knowledge: "What is true and false?"
 An etiology. A constructed world-view should contain an account of its own
"building blocks," its origins and construction.

Worldviews, Lebenswelt, are thus built not only from the collective, social and
individual world, but also from what is made sense of: the umwelt of television, or a
computer, or a digital media device.

"The printing press, the computer, and television are not therefore simply
machines which convey information. They are metaphors through which we
conceptualize reality in one way or another. They will classify the world for
us, sequence it, frame it, enlarge it, reduce it, argue a case for what it is like.
Through these media metaphors, we do not see the world as it is. We see it as
our coding systems are. Such is the power of the form of information."

There is always a correspondence between that which comes to our senses and cognition
from without, and that which we reflect upon and deliberate upon within. But there is
also much more information in the world which we may not register consciously or
deliberate or reflect upon. Some of these communications and interactions within
this arrangement are intentional, i.e. they are made or committed to purposively,
intended by whoever produced, acted, designed, planned, uttered, authored,

illustrated, broadcasted or showed. Some are unintentional, they are a by-product
such as an instruction given by one person to another on the mobile phone overheard
by a passer-by who makes a value judgment on what they hear, or the flickering light
produced by a television set in someone’s room seen through the window on a dark
night. Some things we pick up very consciously, such as when we go fact finding on
Google, some things are coincidental, such as looking up a book in the library, but
finding a more appropriate book on the adjacent shelf or the next link in the Google
search. These are the ‘background’ knowledges which frame and provide for
meaning. Some things are obvious or ‘in your face’ and some liminal and some are
subliminal. Some things that provide context have been created or introduced as
such; others are there or appear serendipitously, by chance operation, or by default.
coincide and coalesce with memories to furnish, make and lay the interpretative
foundations of what we experience. “Phone,” “mat,” and “sitting” are items which
are perhaps not as simple as they seem. If we strip these words, these concepts, and the
mental images they evoke, of their relationship to real-world objects, and all of the
complexity that entails, we doom the project of making anything resembling a human
thought and thus an experience.

Communications and action, conscious and unconscious generation and use, now
imply the generation and realisation of data, and data in the commercial sense
implies tendencies and trends. To find results from exploring the data, to discover
relationships, there must be values and rules made, structures, programs and policies
that, like an individual’s background knowledge, frame and lend meaning to the
numbers.

In this sense we can begin to see that data and its interpretation can be linked to
notions of a hidden self, an unconscious mind: a light cast upon the Freudian darkness
from within which true feelings and motivations well up and are sublimated.
The unconscious mind, understanding it and how it can be manipulated,
has been something of a preoccupation not only for mental health professionals, but
for some time those in power and in fields such as advertising and public relations. 138

138
Edward Bernays, regarded as the father of public relations, was Sigmund Freud’s nephew.

The idea of manipulating and modulating public opinion, motivation, beliefs and
attitudes, has since ancient times been understood as a major underpinning
component of human communication and rhetoric. It perhaps reaches its apotheosis
in notions of brainwashing, political manipulation, and propaganda; however, less
dramatic forms of persuasion also characterise everyday conversations. Applying
understanding of why someone clicks or why they retweet requires you to look at the
underlying cognition, the possible contexts, and consideration of neurobiology. To
understand persuasion and social media influence, to get at the heart of conversion
and likes, it becomes imperative to understand how your audience thinks and feels.

These hidden depths could also be the more optimistic collective unconscious of Jung. This was
viewed as shared structures of mind, shared at the level of species. The human
collective unconscious for Jung was populated by instincts and by archetypes:
universal symbols which cut across cultural conventions. For instance one would
find correlates between superstitions, religions and their material and pedagogical
components between Islam, Buddhism, Christianity and Hinduism. Jung contrasted
the collective unconscious not only with the personal unconscious but also the
collective conscious, the sociologically set systems of shared beliefs, ideas and moral
attitudes which operate as a unifying force, social glue within a society. This has
been a prevalent idea in sociological theory up to the structuration theories of
Anthony Giddens, but has a pedigree extending at least back to Durkheim: “The
totality of beliefs and sentiments common to the average members of a society forms
a determinate system with a life of its own. It can be termed the collective or common
consciousness.”

Materially, current science evolves in two ways: by discovering new elements (smaller
particles, new materials, etc.), the most basic practical units being "atoms" and "rules
of the game"; or by combining elements and obtaining new technologies, or improving
existing ones. We can call this "playing with atoms", or innovation. The notion of
supervenience is an ontological relation that is used to describe cases where (roughly
speaking) the upper-level properties of a system are determined by its lower level
properties. So the manner in which building blocks are played with, the manner in

which materials, atoms, components, etc. are connected or put together to create new
substances and new innovations are bound by the way in which individual scientists,
engineers and designers frame problems. This framing is contingent on their
experience and expertise:
“At any given time in the life of a profession, certain ways of framing
problems and roles come into good currency… they bound the phenomena
to which [professionals] will pay attention. Their frames determine their
strategies of attention and thereby set the directions in which they will try to
change the situation, the values which will shape their practice.” (Schon,
1983, 309-313).

There are many ways to skin the cat and there are similarly many ways to achieve the
same ends in everyday life. We can clean clothes by taking them to a local river, or
we can use a washing machine. We can visit each other in offices, in cafes and in
homes, or we can speak on the phone or over the internet. At the time of writing
android TV boxes are providing access to some 1181 online TV channels. 139 The
numbers seem to suggest it too: From 52m smart TVs sold in 2011, sales rose to
141m last year - and that figure is set to increase to 173m in 2016. So to paraphrase
the opening sentence, the number of explosions, romances, break-downs, comedy,
horror, documentary, facts, fiction or propaganda we can see on television is
potentially infinite; the amount of time we have to view all this is, however, limited. To
get some scale of this the Global Information Industry Center (GIIC) at the
University of California, San Diego, has been tracking ICT use and consumption and
estimated that in 2008 the average American consumed 100,500 words and 34
gigabytes of information for 11.8 hours on an average day. This contrasts with 1980,
where Americans averaged 7.4 hours consuming information. In the study television
was still the dominant media. "Information" is defined as flows of data delivered to
people and is measured in the bytes, words, and hours of consumer information.
Reading, which was in decline due to the growth of television, tripled from 1980 to
2008, because it is the overwhelmingly preferred way to receive words on the
Internet. Despite the rise of the Internet as the dominant source of information in
two-way communications, the Web as a source of information still trails far behind
TV. Based on bytes alone, however, computer games are the biggest information

139
http://wwitv.com/ (accessed 30/7/2016)

source totalling 18.5 gigabytes per day for the average American consumer, or about
67 percent of all bytes consumed. Approximately 80 percent of the population plays
some kind of computer game, including casual games such as Bookworm, Tetris and
social networking games.140

In modern data systems the only layers whose workings people can readily discern
are the input layer, where data is fed to the system, and the output layer, where the
work of the other layers emerges to be reported to the human world. As potential
data streams become more profuse it remains problematic to discern what is going to
happen, or even why something happened in the past. The complexity of such systems
is ‘emergent’ and depends on complex interactions between millions of parts, and
individual human beings do not know how to make sense of that. Humans did not always
make the same categorizations as the machine. But when shown what the machine had done,
people could see the connection between image and concept. Unlike the static-is-a-cheetah
type of judgments, these kinds of machine judgments could lead people to see
strawberries in a new way, or think about the category “strawberry” differently.
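To make the point about layers concrete, the following minimal sketch (plain Python with NumPy, using made-up layer sizes and random weights rather than any particular trained system) shows why only the input and the output of such a system are readily legible to a person: the intermediate ‘hidden’ activations are just arrays of numbers with no obvious human meaning.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy three-layer network: 4 input features -> 8 hidden units -> 3 output classes.
# The weights are random stand-ins; a real system would learn them from data.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def forward(x):
    hidden = np.tanh(x @ W1 + b1)                     # intermediate representation
    scores = hidden @ W2 + b2
    output = np.exp(scores) / np.exp(scores).sum()    # softmax over classes
    return hidden, output

x = np.array([0.2, -1.0, 0.5, 0.3])                   # the input layer: legible to a person
hidden, output = forward(x)

print("input:", x)                                    # meaningful to humans
print("hidden:", hidden.round(2))                     # opaque vector of activations
print("output:", output.round(2))                     # legible again: class probabilities
```

Even in this toy case the hidden vector has no self-evident interpretation; in systems with millions of such parameters, the opacity described above is correspondingly deeper.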

The proposition is that technology enables, opens options, curates, constrains and
channels. It curtails, enables and shapes what designers, manufacturers, film
directors, television producers, editors and authors can realise, craft, present and offer.
And in concordance with individual or group will, what we can do, what we are able
to do, what we even wish for or desire to be able to do, is inspired, as well as limited,
by our senses and bodies or what is referred to as our embodiment (i.e. Dourish,
2001). Norbert Wiener in The Human Use of Human Beings (1954) laid out his
arguments in favour of automation. He analyses the meaning of productive
communication and discusses ways for humans and machines to cooperate, with the
potential to amplify human power and release people from the repetitive drudgery of
manual labour. He envisaged that the time freed in people’s days would be spent on
more creative pursuits in knowledge work and the arts.141
140
http://ucsdnews.ucsd.edu/archive/newsrel/general/12-09Information.asp
141
In this he anticipates later commentators. In 2014, serious voices from Pope Francis to Thomas Piketty, in his book Capital
in the Twenty-First Century, lamented ever-widening inequality. Others have expressed concern
that “the second machine age” of digital technologies will entail the massive elimination of jobs.
A universal basic income (UBI), its advocates argue, cures the disease of most social ills once and for all and does not just treat its symptoms. The
positive outcome of a UBI is manifold: it gives respect to all; it includes all; it liberates human
creativity; it eliminates poverty; it empowers women; it reduces unnecessary government bureaucracy.
The UBI will have a drastic effect on reducing crime since poverty, lack of work, meaning and
education, and unhealthy social environment and family structures, are its root causes. Even more, the
UBI, in creating a fairer society, will help advance environmental issues as people are no longer stuck
in the poverty trap and can now make their community and environment a priority.

"It is the thesis of this book that society can only be understood through a
study of the messages and the communication facilities which belong to it;
and that in the future development of these messages and communication
facilities, messages between man and machines, between machines and man,
and between machine and machine, are destined to play an ever-increasing
part." (p. 16)

He also made an observation regarding the body, seeing it less as fixed architecture
than as flowing movement and pattern. "Our tissues change as we live: the food we eat and the air we
breathe become flesh of our flesh and bone of our bone." The body and mind are
caught in a cycle of continuous change. “We are not stuff that abides, but patterns
that perpetuate themselves." A pattern which maintains itself is said to have reached
homeostasis.

It is true that one can only see as far as their eyes (and devices), and imagination,
allow them to. Or what they are permitted or financed, to look at, or consider.
Mortimer Adler's book "Intellect: Mind Over Matter" essentially recounts the
thinking of the old Scholastics, which in turn dates back to Aristotle. He examines
the mind's ability to produce what used to be called a 'universal idea', a concept, an
abstract thought, a classification, a category, a generality, a product of mind that has
no physical properties and so cannot be observed by any of the senses. Every
common noun in the language is a universal: 'dog' is not this particular dog or that, or
all the dogs, but the concept describing the attributes every individual dog possesses,
which is to say, ‘dogness’. Adler often used the word triangle as an example: the
physical brain takes sensory notice of the individual material triangle, but cannot
sense the principle of triangularity because that is not a sensory thing. In nature
things produce effects of a nature like their own, never one radically different in
kind: the brain is made of atoms, but the immaterial concept of triangularity is not.
Therefore, the mind that produced the concept cannot be made of atoms and must
itself also be immaterial. The ancients said that all knowledge enters first through the
physical senses. Adler was fond of saying that "We do not think with our brain, but
we cannot think without it."

A civilization can continue improving its sensors and methods, up to the point
where it cannot come up with anything better. But does the fact that nothing new
is discovered mean that there is nothing else to discover? A civilization has the
potential to know everything there is to know, but be unaware of that (since it will
continue looking because there is no guarantee it knows it all). Our bodies and
sensory gear enable and limit our ability to interact. What we are drawn to or
choose to interact with is also limited by cultural values, beliefs, prejudices,
education and even class, our cultural capital (i.e. Bourdieu, 1977a, 1977b, 1984)142.
Our bodies and senses, and how we interpret what we sense and do, are in
effect our interface to the world. Physical and graphical interfaces to systems and
technologies also interface, enable and limit what we can do. So can access to
signals and reception, electrical power, censorship, and the ability to afford
subscriptions and so forth. A mobile device which displays text which is too small to
read, provides buttons which are too small to accurately press, that has poor signal
strength, has run out of battery, or run out of data allowance or pay-as-you-go credit

142
“Bourdieu’s concept of cultural capital refers to the collection of symbolic elements such as skills,
tastes, posture, clothing, mannerisms, material belongings, credentials, etc. that one acquires through
being part of a particular social class. Sharing similar forms of cultural capital with others—the same
taste in movies, for example, or a degree from an Ivy League School—creates a sense of collective
identity and group position (“people like us”). But Bourdieu also points out that cultural capital is a
major source of social inequality. Certain forms of cultural capital are valued over others, and can help
or hinder one’s social mobility just as much as income or wealth.” [source:
http://routledgesoc.com/category/profile-tags/cultural-capital accessed 8/2/2016] Bourdieu
distinguishes between three subtypes of cultural capital: objectified, embodied, and institutionalized
(Bourdieu 1986, p. 47). Objectified cultural capital consists of material items which provide
advantages. In order to utilize objectified cultural capital, it is necessary to have prior skill or
‘competence.’ This knowledge is embodied cultural capital (Kapitzke 2000, p.51). The embodied
form of cultural capital, which can be acquired both consciously and passively, requires an investment
of time and energy from the learner. It is captured in one’s accent, or use of vocabulary or being
widely read, for instance. Further, embodied cultural capital can take on an objective value in the
shape of a library, expensive car, or clothing. This relates to a third type, institutionalized cultural
capital, which occurs when an institution, such as a university or workplace, acknowledges an
individual’s credentials and qualifications such as degrees or titles that symbolize cultural competence
and authority by formally recognizing skills (Bourdieu 1986). Connected to the types of cultural
capital is the concept of habitus, or a set of dispositions gained in response to objective encounters
which can affect perceptions and decisions (Bourdieu 1977b). Bourdieu, P. (1977a) ‘Cultural
Reproduction and Social Reproduction.’ Power and Ideology in Education 63 (1): 487-511. Bourdieu,
P. (1977b) Outline of a Theory of Practice. Cambridge: Cambridge University Press. Bourdieu, P.
(1984 [1979]) Distinction: A Social Critique of the Judgement of Taste. Translated by Richard Nice.
Cambridge: Harvard University Press.

is only good for taking photographs or perhaps telling the time.

Ergonomics and human factors

A laptop keyboard, a smart phone touch screen, or a television remote: what can they
offer, and how is this limited and challenged (by abilities to manipulate and navigate,
but also by ageing, eyesight, ill-health, or use in extreme, non-typical or non-ideal
environments, something as simple as touch screen use in the rain, and so forth)?
Perception, memory, reasoning and other cognitive elements to do with the user also
have limits which can be measured, averaged and factored into design, and continuing work in
neuroscience, and in particular neuroimaging, is casting new light on brain
functioning and confirming, modifying or refuting what has previously been put forward
by experimental studies of cognition. Organisational ergonomics is concerned with
the optimization of sociotechnical systems, including their organizational structures,
policies, and processes: how do they create, foster or hinder effectiveness and
efficiency?143

Actions are by and large socially defined and also socially relevant when they are
witnessed or reflected upon. Actions carry meaning that is derived from being
embedded in social and physical environments, environments which are themselves
laden with meaning. This gives rise to social learning. How much have we learned
by watching and listening as others do, and if not learned simply by watching and
listening, have nevertheless been inspired and motivated to try for ourselves? Also
the sites where we participate in media use and consumption can affect use and
usage. Take for instance the home. This is the definitive personal space which carries
with it rich social and symbolic connotations as does much of the stuff which
inhabits it. How does the use of television vary when one is watching from the
familiar place of sitting on a comfortable sofa against watching something while
waiting for a bus on a freezing night? To explore such questions is the beginning of
technology being understood as a complex interweaving and layering of the

143
“Relevant topics include communication, crew resource management, work design, design of
working times, teamwork, participatory design, community ergonomics, cooperative work, new work
paradigms, virtual organizations, telework, and quality management.”

influencing and influenced, and what the philosopher Albert Borgmann hints at
when he offers us his notion of the device paradigm. Succinctly put, the device
paradigm describes the tendency of machines to become simultaneously more
commodious and more opaque, or, in other words, the easier they are to use the
harder they are to understand. By shortcutting the experience for what we are doing
in the world, say washing clothes by hand, and supplanting this activity by the
incorporation of a washing machine to do the same task, the act of washing clothes
while becoming more automated becomes something more of a mystery. And
especially so if for some reason we are forced to start doing it by hand again. In the
same respect we may be made to feel we truly understand everyday life for people in
a distant land which we have been informed is troubled, by watching snapshot
videos of them on the computer or television. However this familiarity disguises or
hides the entire process which gave rise to us seeing these events in the first place,
a process which may be quite complex and remain quite opaque. Most of us, perhaps more
than ever, have some insight into the complex relationships that exist globally
between governments, military, media and finance. Ease of use comes at the expense
of physical [embodied] engagement, which, in Borgmann’s view, results
pessimistically in an impoverishment of experience and a rearrangement of the social
world.

Technology in this view delimits or intensifies our ability to interact with, and also
our dependence upon, other people, and the world of ideas, nature and objects.144 It
delimits or intensifies our ways and manners of doing things, everyday chores,
making sense of our lives and other peoples, and other technologies [such as in the
internet connected device replacing broadcast television as a means of entertainment
and news]. So does Pierre Bourdieu’s notion of habitus which refers to the physical
embodiment of cultural capital, to the deeply ingrained habits, skills, and dispositions
that we possess due to our life experiences. Bourdieu often referred to habitus as
the “feel for the game.” Just as an expert designer has a feel for what will work
to stabilise the video signal without necessarily drawing more processing power
144
Over the past 20 years there has been a surge of interdisciplinary interest in the body and
embodiment, and it is now a key topic of study and theorisation across the social sciences. [source:
Farr, W and Price, S and Jewitt, C (2012) An introduction to embodiment and digital technology
research: Interdisciplinary themes and perspectives. NCRM Working Paper. NCRM. (Unpublished)]

from the CPU, so habitus means making quick, informed decisions without thinking consciously
about them, like riding a bike once you have learned, or without even having to tinker much.
Each of us has an embodied type of “feel” for the social situations or “games” we
regularly find ourselves in. This is something like what Goffman (1959) was speaking of
when he alluded to ‘The Presentation of Self in Everyday Life’. In these cases habitus
allows us to successfully navigate social environments. Habitus also fashions our
“taste” for cultural objects such as art, food, and clothing. In Distinction (1984)145
Pierre Bourdieu dedicates much to the discussion of how tastes in art often refer to
social class, thereby forcefully arguing that aesthetic sensibilities are shaped by the
culturally ingrained habitus. So when we look for clues to understand the deeper and
wider relevance of how we use designed objects such as television sets, mobile
devices and smart phones, and the media messages that they convey, if we wish to
comprehend something of the contexts of use, we also seek to build on sociological
approaches such as this to understanding everyday life especially in terms of how
emergent technologies, as functioning devices and also possessions, stand to reshape
our experiences of spatiality, temporality and embodiment. Rather than a focus only
upon how companies and their designers attempt to make more usable products
(design, improving ergonomics, technology, purposes and tasks), or only upon how
consumers and users of these products adapt and accommodate for shortcomings in
their design (consumer and user research, media, communication and cultural
studies), both of which relate to improving the chances of profits, the focus here will
be an attempt to explore use and its multitudinous contexts.

Using things

Use is itself used in everyday language as a noun and a verb. As a verb the Oxford
dictionary definition is: “Take, hold, or deploy (something) as a means of
accomplishing or achieving something” or “Take or consume (an amount) from a
limited supply.”146 Using applies to both consumption and the employment of a
means towards an end. It also inherently refers to becoming familiar with something,
145
Bourdieu, P. (1984). Distinction: A Social Critique of the Judgement of Taste. London, Routledge.
146
Source: http://www.oxforddictionaries.com/definition/english/use accessed 1/8/2016

becoming used to it. It has negative connotations when we speak of a drug addict as a
user. Indeed there have been those who have described television and all of media as
a ‘plug-in drug’147 or an ‘opium’148. But my issue here is what precisely do we use
television for? Whether it is viewed on a large LCD screen in the home, or on a
smaller screen on the move and on the phone, if I ask you, “what did you use
television for last night?” The question remains jarring and awkward. Certainly we
use the phone to call our friends, or use SMS to send a message, or use the laptop
to write a letter, or use the computer to ‘play’ games. The phenomenology of
entertainment and information is that it removes me from my environment, from my embodied
relations in the here and now, to take me into another world or even another’s world,
with all its concomitant predicaments, decisions and dilemmas. “What I am claiming
here is not that television is entertaining but that it has made entertainment itself the
natural format for the representation of all experience.” (Postman, 1986, p. 87) Don
Ihde and Peter-Paul Verbeek, also philosophers of technology, see television, as they do all
mediating technologies, differently from Borgmann (Rosenberger, 2014) 149. For them
these technologies are multistable; that is, they could be embodied differently in
different contexts, and thus should not be considered inherently negative. The mind
does not like uncertainty and will push perception into the closure of seeing
something familiar. This is why problems of a foreground/background nature always
have us settling on either/or but never both in the same instance. Contexts are
necessary, they give rise to and lend meaning to object and subject, and subject and
object. Object and subject in turn then give rise to new contexts and so forth.

The central claims associated with this research movement may be summarized in
the thesis that cognition depends not just on the brain but also on the situation or
context in which it occurs. Now, the dependence of cognition on situation or context
147
Winn, M. 1977: The Plug-in Drug. New York: Viking.
148
Moore, W. Television: Opiate of the Masses Vol. 2, Issue No. 2 pages 59-66 The Journal of
Cognitive Liberties 2001 CENTER FOR COGNITIVE LIBERTY AND ETHICS Also Edward
Murrow, the noted American news reporter, said in an article in Time Magazine: "if those who
control television and radio would sit still for a bit and attempt to discover what it is they
care about. If television and radio are to be used to entertain all of the people all of the time,
then we have come perilously close to discovering the real opiate of the people.” Television:
Opiate of the People Monday, July 15, 1957
149
Rosenberger, R. "Embodiment and the Problem of Television" Paper presented at the annual
meeting of the 4S Annual Meeting - Abstract and Session Submissions, Hyatt Regency Crystal City,
Crystal City, VA . 2014-11-28 from http://citation.allacademic.com/meta/p380471_index.html

may be conceived in different ways, which roughly correspond to the ways this
dependence is conceived on the ground of the established paradigms of cognitive
science. Therefore, rather than being a new paradigm of cognitive science, the
“situated cognition” movement is really a symptom of an ongoing paradigm crisis in
cognitive science. In fact, the question where and how cognition is situated
eventually amounts to the question how a cognitive system is to be defined. Given
this background, the aim of this research project is to develop a comprehensive
understanding of the situatedness of cognition. In order to realize this aim, the
research project ties in with two paradigms that are considered to be important
antecedents of “situated cognition”: philosophical phenomenology and general
system theory, i.e. the paradigm that allows a programmatically non-naturalistic
“first-person” analysis of situatedness and the paradigm that allows the most generic
naturalistic “third-person” analysis of situatedness. Tying in with these two
paradigms, it attempts to provide both a non-anthropomorphic phenomenological
analysis of situatedness and a general system-theoretic analysis of situatedness, and
to integrate both into a comprehensive understanding of how to define a cognitive
system, by means of a non-anthropomorphic phenomenological analysis of
intersubjectivity.

Is interaction a game?

Some years ago a discussion took place in a taxi between the mathematicians John
von Neumann and Jacob Bronowski, both of whom had worked on the Manhattan
Project during WWII. Von Neumann declared that chess was not a game, but “a well-
defined form of computation. You may not be able to work out all the answers, but in
theory there must be a right procedure in any position.” He went on to say that real
games are different, and that real life “consists of bluffing, of little tactics of deception, of asking
yourself what is the other man going to think I mean to do. And that is what games
are about in my theory.” (in Bronowski, 2011, p.324) 150 Game theory developed to be
applied to analyse and evaluate situations where individuals and organizations

150
Bronowski, J. (2011) The Ascent of Man. Random House.

possess contradictory objectives, as typified by the Cold War between the Soviet
Union and the Western powers. It analysed the strategic logic of threats, credibility
and conflict. By postulating that players think ahead not just to the immediate
consequences of making moves, but also to the consequences of countermoves to
these moves, counter-countermoves, and so on, we have a scenario loosely like the
arrangement when firms develop and design new kinds of interfaces and services. I
say loosely here as real life and real innovation can be a very messy and
unpredictable affair with no ‘true’ or ‘straight’ path from early inklings to do
something, to the burning desire to do it, to working out tasks and what you need to
reach the goal of making it and selling it. It is not completely open: there is a finite
number of options at any given time, and there is a finite number of building blocks.
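As a rough illustration of this ‘moves and countermoves’ logic, and not of any model used in the cases discussed here, the following Python sketch applies plain minimax search to a deliberately trivial game (take one or two sticks; whoever takes the last stick wins): each player is assumed to answer every move with their best countermove, all the way down the game tree.

```python
# A minimal minimax sketch: each player assumes the opponent will reply with
# their own best countermove, and so on down the game tree. The toy game is
# "take 1 or 2 sticks; whoever takes the last stick wins".

def minimax(sticks, maximising):
    if sticks == 0:
        # The player who just moved took the last stick and won.
        return -1 if maximising else 1
    values = [minimax(sticks - take, not maximising)
              for take in (1, 2) if take <= sticks]
    return max(values) if maximising else min(values)

if __name__ == "__main__":
    for n in range(1, 8):
        outcome = minimax(n, maximising=True)
        print(n, "sticks:", "first player wins" if outcome == 1 else "second player wins")
```

For chess the same skeleton would need legal-move generation, a terminal test and a position evaluation in place of the stick-counting rules, which is precisely why von Neumann could call it a well-defined computation even if the full tree is unworkably large.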

The well-recited histories of technology invention and innovation, the Bells, Edisons,
Marconis, Bairds, Fords, Jobs and so forth, show that intention does not mean a
foregone conclusion. Resilience and perseverance count, even intransigence and
mulishness. Most of us will remember Thomas Edison’s quote or one of its variants:
“None of my inventions came by accident. I see a worthwhile need to be met and I
make trial after trial until it comes. What it boils down to is one per cent inspiration
and ninety-nine per cent perspiration.” (Edison, quoted in Newton; 2004, p.24) 151 The
historian of technology Thomas Hughes suggested forthrightly that “Technology is
messy and complex. . . . In its variety, it is full of contradictions, laden with human
folly, saved by occasional benign deeds, and rich with unintended consequences.”
(Hughes, p.1) 152 So in many, if not most cases of innovation, it is no game of chess,
there is more than simply strategy and intention as its engine, there are serendipitous,
randomising aspects. This is particularly so at the start, when everything is more
fluid and in flux and this can mean anything from selection of components, to
funding, to enlisting expert support and opinion, to nurturing, fostering and forming
creative partnerships.

Highly innovative projects and any innovative project in its early stages are more
151
as quoted in Uncommon Friends: Life with Thomas Edison, Henry Ford, Harvey Firestone, Alexis
Carrel & Charles Lindbergh (1987) by James D. Newton, p. 24.
152
Hughes, T.P. (2004) Human-Built World, 1. Human-Built World: How to Think about Technology
and Culture. Chicago: University of Chicago Press, 2004

open, poorly fitting, ‘buggy’, open to interpretation and re-interpretation, vaguely
thought through and thought out, possibly even difficult to evaluate, present and
articulate. They are a higher risk to investors. Here, then, innovation activity is more
like a game where a dice or spinner is involved, not entirely open but not entirely
random. Finite outcomes can randomly click or be clicked into place but nevertheless
make or break success. In the case outlined later we can see how a firm responded to
a series of chance opportunities and alternative innovation pathways that arose
during the development of their project. They opened multiple streams of social and
technical development, some preferred by some managers over others, thus
producing tension in the firm. The maxim ‘jack of all trades, master of none’ comes
to mind as they struggled to balance and satisfy competing options and values.

A graphic that captures the meandering of the design and innovation process from its early
stages, where it can take many shapes and forms according to a wide range of
influences and pressures, to the time when it settles into a socially recognisable set of
features and functions, has been advanced by Damien
Newman and looks like this:

Fig 1 The Process of Design Squiggle by Damien Newman, Central Office of Design, is licensed under
a Creative Commons Attribution-No Derivative Works 3.0 United States License.

What has now become known as Newman’s ‘squiggle’ emphasises how design
activity and the stability of the functional and aesthetic aspects of the design smooth
towards implementation and materialisation as a design moves through
development from concept analysis to synthesis, from uncertainty to clarity, through
research and prototype to design, from ‘softness’ and software to ‘hardness’
and hardware, from fluidity to crystallisation as a concept, service, interface,
middleware, system, device or artefact. One can see that it is only suddenly, at the
prototype stage, that real stabilisation in component shape and form appear. What
performs the smoothing is the myriad strands of activity which design demands:
the search and research for materials, components and meaning, the exposure and
the learning, the decision making, discussion and argumentation, and of course the
material tinkering, fixing, playing and building. A significant part of the learning is
purely to do with getting the technology to work, and part is to do with what I have
described as ‘meaning’ - how it comes to be seen and valued by relevant others
which can include, partners, senior managers, possible consumers and users, and
regulators, media, and the more general public.

If we think about it, Newman’s graphic is socially and organisationally scalable. It
can be related to the efforts made by a single designer, or it can refer to the efforts of
a team of designers. It can refer to a cross-functional team within a larger industrial
concern and even those comprised of virtual members globally distributed. It can
refer to a group of managers arguing around the table about the characteristics,
attributes, features and functions of a new system or a new component, the costs and
the difficulty of design and interoperability with other devices and software. It can
even refer to a consortium of firms coming from either related industries [as is the
case here where local and related firms produced different component parts of the
overall technical system] or potentially completely different industry sectors [as was
the case here in the development of the services and content]. It can impact upon
entire industries and through the agency of standards committees and regulatory
structures influence the dominant paradigm in terms of the way things are done.

At the largest possible social level it could even relate to the entire world of haves
and have-not economies, those developed post-industrial economies which design
and create advanced media devices, and those others who are only in a position to

consume them. One benefit of such an arrangement is that developing economies can
‘leapfrog’ the costs and times required to develop the new technologies such that one
can see the latest smart phones on sale in shops in Cambodia sometimes even before
they are on sale in developed nations.153 And of course the internet has been a great
leveller when it comes to the availability of software, software tools and
applications globally. When we consider the vast diffusion of mobile phone telephony
across the developing world we can often similarly note the unique manner in which
it supports everyday life at work and domestically, indicative of where an advanced
technology encounters a completely different sociocultural environment to where it
was developed. In a recent survey of developing economies the U.S.-based Pew
Research Center highlighted that internet and mobile technology have become
ubiquitous in the emerging and developing world.

Telephony, as it is almost entirely user driven in terms of what it does or delivers, is
by nature a bit of a blank slate. It matters little if it is someone relaying the latest
gossip or farmers in Africa relating market prices to each other. But in cases
where systems and technologies deliver more specific or specialised goods,
programmes or services, niche markets arise with their own legacies and
conventions. McDonalds is received very differently in terms of attitudes towards its
offerings globally.154 Contradictory, misunderstood, or poorly informed
objectives and arguments arise where alliances and consortia of different kinds of
partners, from quite different industry sectors, with quite different agendas and
corporate cultures, join together to research, build, test and prototype a new system.
Not only are they involved for the purposes of learning how to work with the system,
but how they may cope with others who may wish to use it for quite different
purposes. Also, the motivation that drives such social arrangements varies and can be
described as firms aligning themselves by a shared sense of purpose, or through other
motives such as competitive fear, or even fear of losing out on the next big thing
(Molina).

153
The limits of leapfrogging The spread of new technologies often depends on the availability of
older ones Feb 7th 2008 http://www.economist.com/node/10650775
154
https://hbr.org/2004/09/how-global-brands-compete How Global Brands Compete,
Douglas Holt, John Quelch, Earl L. Taylor,
Harvard Business Review, September 2004 issue

This can lead to outcomes such as ‘design by committee’ or ‘groupthink’ which are
disparaging terms related to the proverb of the blind men and an elephant. A group of
blind men (or men in the dark) who touch an elephant to learn and report upon what
it is like. Each one feels a different part, but only one part, such as the side or the
tusk. When they compare notes they learn that they are in complete disagreement
about what it is and arguments ensue. This kind of outcome contrasts with the
‘wisdom of crowds’ notion which has seen renewed interest in the digital age where
vast aggregates of online users and audiences exist. The undergirding argument is
based upon the natural sciences, probability and statistical analysis, and has it that the
aggregation of information in groups results in decisions that are often better or faster
than could have been made by any single member of the group. There are economic
factors at play as well. The point is that holding digital copies of items that sell singly, or
to only 10 or 100 people, is cost effective for a business when compared with
having them all available in a retail store. Amazon and iTunes still make a huge
profit from amalgamating all those small numbers, just as Google does with
Advertising, and Facebook, Twitter and other social networks are attempting from
consumer-user data. But the point is that Amazon or iTunes have a large enough
collection of those individual copies sold, which aggregated makes it worthwhile.
Just the same as the average Google Adsense advert makes almost nothing, but
Google makes a lot of money from the aggregation of search results pages and
websites. When it comes to digital, the cost of storing and making an additional file
available is tiny compared to the cost of stocking a physical copy, which allows that
economy of scale to happen. It’s always been the case that the majority of any
product type won’t sell – it’s the Pareto principle in action. But the idea of The Long
Tail is that you can now make a decent return by collecting all the misses together.
The success stories are for the businesses doing the aggregating, not a creator trying
to make a living. They still need to find a way out of The Long Tail to be able to
survive, although various mechanisms mean that the number they need to reach
could be potentially a lot smaller than in previous eras. The idea of the 1000 true fans
that Kevin Kelly wrote about is closer to reality across a range of businesses as well
as the creative arts. There’s no mechanism that means you can make a living from

selling a handful of copies of any work at the normal market price, digital or
otherwise, and no-one sensible has ever claimed otherwise…

By throwing things out there, by beta releases, by open source, the chances of finding
solutions, or at least someone finding a solution improves. Thinking and information
processing such as ‘market judgment’ Surowiecki argues can be much faster, more
reliable, and less subject to political forces than the deliberations of experts or expert
committees. Surowiecki examines how common understanding within a culture
allows remarkably accurate judgments about specific reactions of other members of
the culture. How groups of people can form networks of trust without a central
system controlling their behaviour or directly enforcing their compliance. This
section is especially pro free market. Surowiecki breaks down the advantages he sees
in disorganized decisions into three main types, which he classifies as cognition,
coordination and cooperation.155 The wisdom of crowds relies not only upon
democratic processes but also upon the individual diversity and differing opinions and
views of participants. In his book Turtles, Termites, and Traffic Jams (1994) Mitchel
Resnick introduces the ideas of decentralized systems that result in different layers of
emergent phenomena. He mostly addresses the question of why decentralized
systems are important, and how we can teach decentralized thinking to children, and
does not really write about how emergent systems work, or why. Iain Couzin 156
studies animals that move in shoals, flocks and swarms. He has found an example of
a more exciting type of collective intelligence - where a group solves a problem that
none of its members are even aware of. Simply by moving together, the group gains
new abilities that its members lack as individuals. The individual fish aren’t tracking
anything. That would involve realising, for example, that it’s getting darker over
there compared to over here, and swimming over there. Instead, they obey one very
simple rule - swim slower when it’s dark. Each fish just reacts to how bright it is in
its current position. How bright or dark is it right here? That’s a scalar measurement.
It’s the shoal that converts these local readings into a vector.
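The shoaling rule Couzin describes, each fish simply swimming slower where it is darker while sensing only the light at its own position, can be caricatured in a toy simulation. The sketch below is drastically simplified (one dimension, invented parameters, and none of the social attraction that matters in the real study), but it shows how purely local, scalar readings can still concentrate the group in the dark region.

```python
import random

random.seed(2)

LENGTH = 20.0     # a one-dimensional "tank"; the dark end is at x = LENGTH

def darkness(x):
    return x / LENGTH          # 0 = brightest, 1 = darkest (a made-up light field)

# Each fish obeys one rule only: swim slower where it is darker.
fish = [random.uniform(0, LENGTH) for _ in range(200)]

for step in range(5000):
    for i, x in enumerate(fish):
        speed = 2.0 * (1.0 - 0.95 * darkness(x))   # slow right down in the dark
        x += speed * random.choice([-1, 1])
        fish[i] = min(LENGTH, max(0.0, x))          # keep fish inside the tank

in_darker_half = sum(1 for x in fish if x > LENGTH / 2) / len(fish)
print(f"fraction of the shoal in the darker half: {in_darker_half:.2f}")
# No individual measures a gradient, yet the population ends up concentrated
# where it is dark, simply because fish spend more time wherever they move slowly.
```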

155
The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom
Shapes Business, Economies, Societies and Nations, published in 2004, is a book written by James
Surowiecki
156
Berdahl, Torney, Ioannou, Faria & Couzin. 2013. Emergent Sensing of Complex Environments by
Mobile Animal Groups. Science http://dx.doi.org/10.1126/science.1225883

So returning to Newman’s graphic, it can also refer to a group of individuals or
enthusiasts or designers of various hues who come together online to discuss
tinkering with and making things, basically learning by doing things and sharing
knowledge. The nature of mammalian development is the ability to learn things
vicariously in addition to exploring and doing. The viral spread of cell phones in the
developing world did not happen, one would guess, by Nokia airdropping phones in
remote villages. It likely began with rumours that farmer X was gaining some
advantage in his efforts to sell his crops by being able to phone an agent in the
town or city to check on prices, or to make or receive payments.

The recent release of a range of small microprocessor boards such as the Raspberry Pi,
Arduino, and Intel Galileo is nothing short of a material rallying call for ‘grassroots’
development of a range of IT-led devices, most notably robotics and the internet of
things (IoT). In a sense this is a social and technical consumerist phenomenon which
has a very distinctive pedigree in technical and media development. It characterises
the origin of many domestic media technologies, and shows them to be, in varying
respects user-led innovations, in particular radio and the personal computer. The
development of these, while relevant overall to the technology theme of the current
work, also serve as a comparison which I will outline here.

Shaun Moores (1988, 2000) drew our attention to the role of enthusiasts in the dawn
of radio broadcasting when he and colleagues interviewed people regarding the early
beginnings of radio in the UK:

“Audiences for radio tended to reflect the technological novelty of the
medium. Consumers concerned themselves above all with the means of
reception as opposed to programme contents – and the interview material
suggests it was mainly young men, caught up in the play of experimentation,
who were listening to broadcast transmissions. There are many recollections
of male relatives or neighbours constructing their own radio receivers at
home.” (Moores, 2000, p.43)

At the time that radio began there was an abundance of articles devoted to the
construction and operation of such gadgets in dedicated periodicals. In the years
following the First World War many former military radio operators became amateur
radio enthusiasts, tinkering with their home-made sets to pick up transmissions, and
transmitting their own talks or music. They used radio to share their discoveries,
forming a community of fellow experimenters. These periodicals also advertised the
sale of ‘do-it-yourself’ sets and components, which one could order and have
delivered by post. “A home-built receiver could be bought at a vastly cheaper price
than the already manufactured sets which were on sale in shops.” (Moores, ibid,
p.44) The BBC itself had, since its inception, evolved rapidly as an institution, even
in terms of its own financing. In 1922 the BBC transmitted its first radio programme,
marking the beginnings of official, state broadcasting, and a new era for listeners at
home. On 14 November 1922, Director of Programmes at the British Broadcasting
Company, Arthur Burrows, launched Britain’s first national radio broadcasting
service from Marconi House in the Strand, London. It was financed by a royalty on
the sale of dedicated wireless receiving sets only available from approved
manufacturers, and also by a licence fee paid via the British Post Office, which
was also the government department responsible for communications. It
was they, then, who granted licences to broadcast. The BBC was the only body to
be granted a licence, and in effect then had a broadcasting monopoly, dependent
upon a few criteria, most prominent of which was a prohibition on advertising.
Another was no news bulletins before 19:00. They were also required to source their
news from external wire services. 157 This was to consciously steer British
broadcasting away from the more open ‘wild west’ style of free market which was
seen happening in the United States. However this financial model broke down as
‘approved’ set sales were disappointing as amateurs started to make their own
receivers and listeners bought rival unlicensed sets. This placed greater emphasis on
garnering revenues from the license fee, a legacy which continues to this day where
UK broadcast [and online and mobile] television viewers must pay by law, an annual
licence fee which funds the BBC, whether they watch their channels or not.

157
The newswire services influenced the flow of international news first along the imperial routes,
then through the news agencies Havas, Reuters, Wolff and Associated Press. Rhodes, L ‘International
communication and News Networks’ Journalism and Mass Communication Vo. 1

Moores (p.47) noted that early radio receivers had previously used battery
power, but by 1932 they were increasingly running off mains electricity. Being
nested in among the proliferation of electrical devices for the home, this was
positively affecting sales and helping them ‘domesticate’ – that is, become more an
integrated part of home life than a fringe hobbyist pursuit. Manufactured mains sets
were dropping in price, making them affordable to those who were never drawn to
tinkering. Moores further notes that the home environment was also changing
considerably; it was becoming a 'more comfortable’ place to live in, with soft
furnishings and pleasant décor. Moores quotes a winter issue of the Radio Times in
1935 that captures something of this idea; “‘To close the door behind you, with the
curtains drawn against the rain, and the fire glowing in the hearth – that is one of the
real pleasures of life’. ‘And it is when you are settled by your own fireside’,
continued the copy, ‘that you most appreciate the entertainment that broadcasting can
bring’.” (quoted in Moores, op cit., p.51).

Radio sets [and later also television sets] were fashioned with the home environment
in mind, with ornate wood finishing so as to ‘fit in’ and become [an essential] part of
the furniture and home deco. This brought the devices much more into favour with
female family members who were somewhat excluded previously by headphones, by
lack of knowledge or interest, or by the mess of wires, and components, and even
danger of radio materiality, i.e. battery acid, solder, the industrial and workshop style
smells of the previous contraptions. Technically, loudspeakers were replacing
headphones as the means for listening, which now orientated the practice of radio
listening towards being a social, rather than an individual and exclusory, event for the set
builder. There was also significant development in amplification, tuning and
interference reduction. But it was programming, the art of broadcasting, which was
formalising, formatting, evolving and proliferating. Broadcasting signalled that: “The
days of the boffin, experimenting with set construction, were coming to an end.
Programme content was fast gaining in importance for audiences, over and above the
actual means of communication and the pleasures of technological ‘tinkering’.”
(Moores, ibid, pp.47-48).

The story of the early radio pioneers compares and contrasts with those who built
their own small computers even before the first microcomputer kits appeared. There
are also similarities, certainly in the gendering of the activity. Perhaps most famously
in the case of the personal computer we can see the rise again of male-dominated
enthusiast groups, and the most famous of them all would be the Homebrew
Computer Club of 1970s Silicon Valley which gave rise to Apple Computer Inc.
Stephen Wozniak, the partner of Steve Jobs in the founding of Apple, has consistently
remarked that “Without computer clubs there would probably be no Apple
computers.”158 Leslie Haddon has also remarked from a British perspective regarding
the necessity of open-minded tinkerers and lead users who make up the enthusiasts
and who help take nascent technologies “from a marginal, esoteric, hobbyist item, to
yet another addition to the familiar product range called 'consumer electronics.”
(1988, p.7)159 Most importantly Haddon (Ibid.) notes that there was a difference in
terms of the shaping of use with respect to personal computers over the origins of
radio. The difference was with respect to the agency which went into developing
programming. Early listeners certainly built their own radios, but the benchmark for
their success was being able to pick up signals from the broadcasters.

“However, once computer hobbyist communities started to appear these
enthusiasts were encouraged by peers, clubs, and the hobbyist press to
explore the possibilities of this technology. The hobbyists developed a range
of innovatory projects and ways to use the microcomputer. The legacy of this
period of hobbyist inventiveness was the development of what became the PC
industry. Hobbyists gave the product area early visibility that eventually
helped it to become a consumer electronic, developing, in particular, the
numerous computer games that helped shape the interactive games industry.”

This is significant, as the suggestion here is that the use and pursuit of the personal
computer came from the grassroots, rather than being led by a large corporation
(such as broadcasting was with the BBC). In this enthusiast group the question still
remains: are all their objectives the same; is their vision of what they are doing,
what is going on, and what they have to do to make it work satisfactorily, aligned,

158
159
Haddon, L. The home computer: the making of a consumer electronic. Science as Culture, 1988; 2:
7–51.

contradictory, conflictual or complementary?

One can understand that the slack for diverging objectives shifts with the scale of the
enterprise and the heterogeneity of the actors. How much room, how much slack, is there for
contradictions, misinterpretations, misunderstandings or even misrepresentations?

 Consortia built with representatives from divergent industry sectors [i.e. media organisations, regulators, manufacturers, technology, retail, developers, distribution, video card manufacturers etc.]
 Consortia built with management representatives from divergent but complementary industry sectors [i.e. chip manufacturing, software, marketing, finance etc.]
 Design teams within a firm built with representatives from different and complementary functions [i.e. technology, R & D, retail, distribution, finance etc.]
 Members of a design team with different and complementary expertise [i.e. software development, programming, hardware development, interface design etc.]

I want to mention one more area that is pertinent to this discussion and which offers
a different take on Newman’s ‘squiggle’. It could also refer to the way ‘smart’
or ‘intelligent’ devices are supposed to support everyday functioning. Lacking any
direct user input, they require a design and planning intelligence that not only
presupposes what a technology will be used for, but why, how, where, when, and for
how long it will be used. In the absence of any form of direct user control or input
they draw their operational parameters from three aspects. These can be related to
human theories of learning. Indeed according to John Dewey (Dewey, 1938, p.69), in
his ideas regarding the formation of purposes there are three dimensions,
observation, knowledge, and judgment, through which smart products [and humans]
can learn (a simple sketch of such a loop, in code, follows the list below):
(1) Observation of surrounding conditions; this could involve sensory data
from a variety of sensors responsive to movement, light, heat etc.
(2) Knowledge of what has happened in similar situations in the past. This
would relate to a corpus of data [system-logging] committed to memory and
developed from previous use and environmental episodes and particularly
noting patterns; any learning machine needs sufficient representative
examples in order to capture the underlying structure that allows it to
generalize to new cases. There is room for breakdown here as knowledge is
somewhat finite.160
(3) Judgment, which puts together what is observed and what is recalled to see
what they signify. This is the critical aspect of smart or intelligent
functioning. It requires the input of values, norms, beliefs, laws, constraints
which denote relevance and initiate action. These serve as the hidden
curriculum, the ethical dimension in design.
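The following sketch is a hypothetical rendering of how these three dimensions might map onto a ‘smart’ device’s control loop; the sensors, rules and thresholds are all invented for illustration and do not describe any actual product.

```python
# A hypothetical sense -> recall -> judge loop for a 'smart' device, mapping
# loosely onto Dewey's observation, knowledge and judgment. All names, rules
# and thresholds here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Observation:            # (1) observation of surrounding conditions
    movement: bool
    light_level: float        # 0.0 dark .. 1.0 bright
    hour: int

history: list[Observation] = []   # (2) knowledge: a log of past situations

def similar_past_situations(now: Observation) -> list[Observation]:
    """Recall past episodes recorded at roughly the same time of day."""
    return [o for o in history if abs(o.hour - now.hour) <= 1]

def judge(now: Observation) -> str:
    """(3) Judgment: combine what is observed with what is recalled,
    filtered through built-in values (here: save energy, do not disturb)."""
    past = similar_past_situations(now)
    usually_occupied = past and sum(o.movement for o in past) / len(past) > 0.5
    if now.movement or usually_occupied:
        return "lights on" if now.light_level < 0.3 else "lights off"
    return "lights off"       # the hidden curriculum: default to saving energy

# One pass through the loop.
now = Observation(movement=False, light_level=0.2, hour=21)
print(judge(now))             # -> "lights off"
history.append(now)           # the episode is then committed to memory for next time
```

The values baked into judge() are where the ethical dimension described above lives: a different designer could just as easily have defaulted to keeping the lights on.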

That is, they are designed to respond ‘naturally’ or ‘intuitively’ to the contexts of
use, and, just as importantly, to the expectations, objectives possibly even delight of
their ‘users’, or perhaps more precisely, in concordance with the habits and
conventions of their ‘users’. Such devices without manual controls or forms of
interface can only complement if they are congruent with short-cutting the task whilst
remaining experientially invisible to the ‘user’.161 They are an invisible hand.

If the effects or outcomes, the functioning of such technologies do not align with
whatever the person, the user, perceives ‘should happen’, or what they hold as true,
normal, or ethical then they run the risk of serious breakdown. They will jar and
become an obstacle with ‘no user serviceable functions’ and therefore alien,
160
On 12 February 2002 at a press briefing, The United States Secretary of Defense, Donald Rumsfeld
famously referred to ‘unknown unknowns’. He was referring to the absence of evidence linking the
government of Iraq with the supply of weapons of mass destruction to terrorist groups. He bewildered
many people when he went on to say: “There are known knowns; there are things we know we know.
We also know there are known unknowns; that is to say we know there are some things we do not
know. But there are also unknown unknowns – the ones we don’t know we don’t know.”
161
I use inverted commas here to emphasise that the nomenclature ‘user’ and ‘use’ becomes somewhat
awkward and redundant with this technology. It is clear that these terms link and refer to instances of
‘purpose’ and ‘task’ which remain solely in the realm of those commissioning or designing the
technology. An architect can designate and argue for the installation of automatic doors and windows, perhaps to solve an aesthetic, functional or safety problem. A designer can build a sensor apparatus which triggers the door or window to open or close. But the 'user' is a passive player in these scenarios; they are not required to interface with these systems with respect to their operation. They are imagined as going about their daily business, which the technology supports. In other words, to quote Sharrock and Anderson (1994), these users become 'scenic' in design. We must realise that even in the 'use' of television, awkward as the term is, the 'user' can be described in terms of television's use as an entertainment device, a pastime, a source of information and so on.

adversative and antithetical. There are at least a couple of famous Disney skits which show the disaster of automation as magic; perhaps the most famous is Mickey Mouse as the sorcerer's apprentice in Fantasia, whose enchanted broom, set to carry water for him, goes berserk and floods the place. Mickey falls asleep and dreams that he is a powerful wizard controlling the mighty seas and starry skies, awakening to find that the basin is overflowing and the broom is still filling it up. Mickey races to the wizard's spell book looking for a counter-spell, but to no avail. After nearly drowning in a giant whirlpool, Mickey is rescued by the wizard, who magically halts the flood and causes the brooms to vanish. A more ominous example is the HAL 9000 computer in 2001: A Space Odyssey, which, instead of doing its job of supporting the crew, decides to act in its own favour and according to its own 'will'. In other words, such technology will not only be in your face, but will be a liability and a nuisance. Dystopian science fiction has given many variations on this theme, and on every occasion what is being questioned is the ethical dimension of a human-made device, robot or intelligence which no longer serves the direct interest of people, but instead moves to perpetuate its own kind of being, its own logics.162 In a sense it is the 'judgement' aspect, the ethical domain, which comes to the fore in design and functioning.

Imagine a lift designed to work out which floor a person wants to go to. This could be based upon some sort of historical data, such as the average speed at which they enter the lift when they go to a certain floor. As it turns out, it is far more difficult to design robots that can navigate a 'messy' environment than it is to design robots that can memorize more than a human brain. One of the early proponents of autonomous driving, Pomerleau (1993), noted that: "Many real world problems require a degree of flexibility that is difficult to achieve using hand programmed algorithms." He used a neural network to train a robotic vehicle to drive on multiple types of roads (single lane, multi-lane, dirt, etc.). A large amount of his research is devoted to (1) extrapolating multiple training scenarios from a single training experience, and (2) preserving past training diversity so that the system does not become overtrained (if,
162
This is also the view of sociobiology and evolutionary psychology, where Dawkins's selfish gene idea proposes a supra-human or trans-human view of evolution: that even the most far-flung human activity, law or creation is aimed, one way or another, at perpetuating the species.

for example, it is presented with a series of right turns – it should not learn to always
turn right). The pursuit of autonomous driving vehicles continues today.
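To make point (2) above concrete, here is a minimal, hypothetical sketch, not Pomerleau's actual method, of one way a learning system can preserve training diversity: a long-term pool is maintained by reservoir sampling so that a recent run of near-identical experiences (a long series of right turns, say) cannot crowd out everything seen before.

```python
import random
from collections import deque

class DiverseTrainingBuffer:
    """Mix recent experience with a representative sample of everything seen so far."""
    def __init__(self, capacity=500, recent_size=100):
        self.capacity = capacity
        self.pool = []                      # long-term, diverse examples
        self.recent = deque(maxlen=recent_size)
        self.seen = 0

    def add(self, example):
        self.recent.append(example)
        self.seen += 1
        if len(self.pool) < self.capacity:
            self.pool.append(example)
        elif random.random() < self.capacity / self.seen:
            # reservoir sampling: every example ever seen has an equal
            # chance of occupying a slot in the long-term pool
            self.pool[random.randrange(self.capacity)] = example

    def batch(self, size=32, recent_fraction=0.25):
        """Training batches deliberately blend old and new experience."""
        n_recent = min(int(size * recent_fraction), len(self.recent))
        n_old = min(size - n_recent, len(self.pool))
        return random.sample(self.pool, n_old) + random.sample(list(self.recent), n_recent)

buffer = DiverseTrainingBuffer()
for i in range(1000):
    buffer.add({"image": f"frame_{i}", "steering": "right" if i > 800 else "varied"})
print(len(buffer.batch()))  # a batch that is not dominated by the recent right turns
```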

Program or be programmed

We have spoken somewhat about how objectives align or diverge between different actors involved with the production of systems, services and technologies. We have also noted that the drive towards automation and digitisation, towards making things smart, which has long origins in the advent of algorithms, assembly lines, computers and robots, persists. But what of users? Are designer and user objectives the same? This is the big question in innovation. How about producer and consumer objectives? In which ways do they align, and in which ways do they oppose and contradict each other? When users encounter a new device, service or tool we can recognise the foundation for a second adaptation of Newman's 'squiggle'.

On this occasion it can refer to the purchasing research and decisions made until one is sure that this is the right thing for them, or to how one explores and tinkers with a new smart phone before becoming adept at its use. In a sense here we see that the device or software programs the user, just as the user uses it to accomplish some task. Exploratory use becomes expert use as time progresses and practice continues. Certain features may become redundant, or relegated. There is a trade-off between expertise and flexibility. Research suggests that as one acquires domain expertise, one loses flexibility with regard to problem solving, adaptation, and creative idea generation. Expert bridge players struggled more than novices to adapt when the rules were changed; expert accountants were worse than novices at applying a new tax law (Dane, 2010).163

It can refer to a new web site, or an interactive television service, or an application such as Microsoft Word. But the most pressing analogy Newman's 'squiggle' offers
with respect to use is where it denotes what may be described as the resolution of
feedback which can be garnered from the device, from product concept testing,
through paper mock-ups, through ever more substantial and manifest prototypes,
through to an almost fully operational and functioning system. As the development
passes through its various design phases, not only does the methodology change, but
the opportunity to get ever more reliable and pertinent feedback improves relative to
what will eventually roll out into the consumer market. Perhaps the epitome of user
trials is that which is detailed here where there is a concerted effort made to situate
the product and services in their intended naturalistic place of use, and produce a
simulacrum of their commercial functioning.

By the 1990s this was a well-established practice with respect to industrial deployments of ICTs, and it can be read in the work of people such as Wanda Orlikowski, who advocated a practice-based understanding of the recursive interaction between people and technologies over time. In a sense it is true to say that there is no replacement for the market, or for the place where the technology will be used in reality. Orlikowski (2000) argues that emergent structures offer a more generative view of technology use.164 This idea also appeared in Fleck's (1988, 1993) notion of innofusion (a blend of innovation and diffusion), where he suggests that the innovation of particular technologies (his example was industrial robots) would only be fully realised through their diffusion into sites of use, existing expertise, and processes of implementation.

There are so many examples and they will form much of the discussion in the earlier
thematic part of this work. But the issue which will be dwelt upon is how producers

163
Dane, E. (2010) 'Reconsidering the Trade-off Between Expertise and Flexibility: A Cognitive Entrenchment Perspective', Academy of Management Review, 35(4), pp. 579–603.
164
Orlikowski's more recent work argues that our primary ways of dealing with materiality in organizational research are conceptually problematic, and proposes an alternative approach that posits materiality as constitutive of everyday life.

and designers try, in their process of design and technical system building, to put themselves in another's shoes, to think from the other's perspective or worldview. This has become ever more necessary as devices boast more 'smart' features; it is also the kind of thinking necessary when a designer designs applications, creative tools and systems that allow user-created, user-generated or user-controlled content. This includes various forms of written, audio, visual and combined media created and shared by internet users.

Ucc and users doing all the work

Time magazine has a tradition, begun in 1927, of selecting a "Man of the Year", which now reads as "Person of the Year". Whoever is chosen is whoever the editors consider the newsmaker of the year. The first was the aviator Charles Lindbergh, and subsequent entries included Gandhi, Roosevelt, Truman, Khrushchev, Nixon, Gorbachev and Clinton. But the award has also gone to groups of people. In 1960 'American Scientists' featured as Man of the Year; 'The Inheritors' (the baby boomers) in 1966; 'The crew of Apollo 8' in 1968; 'The Computer' in 1982; 'The Whistleblowers' in 2002; and on the 13th of December 2006 'You' was featured as Person of the Year.165 This award recognized the millions of people who anonymously contribute user-generated content to wikis (including Wikipedia), YouTube, MySpace, Facebook and the multitudes of other websites featuring user contribution. The origins and legacies of user-generated content go back to the early broadcast experiments by radio enthusiasts ('ham radio'), and perhaps even earlier to the millions of unpublished authors, artists and musicians who have dabbled creatively in their lives and have remained 'undiscovered'. The internet, and in particular creative digital tools and sites dedicated to the creation of ucc, have opened the prospect of publishing to all. It is not without criticisms, however, and accompanying the Time accolade to the user was this impassioned response:

“For those times when the 900 digital options awaiting us in our set-top cable
box can seem limiting and claustrophobic, there's the Web. Once inside, the
doors swing open to a treasure trove of video: adults juggling kittens, ill-fated
dance moves at wedding receptions, political rants delivered to camera with
165
Time Person of the Year: You, 13 December 2006. http://www.time.com/time/magazine/article/0,9171,1569514,00.html

venom and volume. All of it exists to fill a perceived need. Media
executives--some still not sure what it is - know only that they want it. And
they're willing to pay for it … The larger dynamic at work is the celebration
of self. The implied message is that if it has to do with you, or your life, it's
important enough to tell someone. Publish it, record it ... but for goodness'
sake, share it--get it out there so that others can enjoy it. Or not. The
assumption is that an audience of strangers will be somehow interested, or at
the very worst not offended … While the media landscape of my youth, with
its three television networks, now seems like forced national viewing by
comparison, and while I anchor a broadcast that is routinely viewed by an
audience of 10 million or more, it's nothing like it used to be. We work every
bit as hard as our television-news forebears did at gathering, writing and
presenting the day's news but to a smaller audience, from which many have
been lured away by a dazzling array of choices and the chance to make their
own news … It is now possible - even common - to go about your day in
America and consume only what you wish to see and hear. There are
television networks that already agree with your views, iPods that play only
music you already know you like, Internet programs ready to filter out all but
the news you want to hear … The danger just might be that we miss the next
great book or the next great idea, or that we fail to meet the next great
challenge ... because we are too busy celebrating ourselves and listening to
the same tune we already know by heart.”166

Brian Williams is an anchor and managing editor of NBC Nightly News, and what he refers to here is the echo chamber: the dangers of cocooning oneself only in one's own interests and predilections. Not only does this run the risk of making us ever more myopic and by default narrow-minded, with all the implications regarding prejudicial worldviews that that entails, but from the perspective of democratic debate, which relies on exposure to a plurality of views, we are impoverished. Even the 'wisdom of crowds' suffers, as it has been clearly indicated that making good decisions through the crowd relies on variety in opinion and worldview, in other words on diversity.

Most user-created content activity is undertaken without the expectation of remuneration or profit. Motivating factors include connecting with peers, achieving a certain level of fame, notoriety or prestige, and self-expression. A recent addition to
166
Brian Williams, 'Enough About You: We've made the media more democratic, but at what cost to our democracy?', Time, Monday, Dec. 25, 2006. Interestingly, while Williams was an outspoken critic of the dangers of user-generated content and an advocate of professional journalism and media, he nevertheless became embroiled in a controversy which came to a head in 2015 regarding erroneous and false reporting of Iraq: Williams was suspended for six months without pay following revelations that he exaggerated tales from an Iraq War mission in 2003. Considering that 43 percent of young people recently said they rank authenticity over content when they're consuming news, and are just as likely to pick user-generated content as they are traditional media, does it really matter?

Facebook has been Facebook Live. Those with accounts can essentially make their own broadcasts from their smart phones. Each broadcaster can be viewed as a dot on a world map, and by simply gliding the mouse cursor over the map one can glimpse into another's world or space. Mixed with local radio and television stations, one is exposed, or rather individuals are exposed, to the world dancing, pulling faces, playing guitars, eating food, or simply staring back at you from the screen. While Facebook Live is not the first service of this kind, apps like Periscope and Meerkat provided some evidence that the next 'big thing' is live broadcasting from your smartphone. Facebook originally offered this functionality to celebrities, but it has now been rolled out globally as a bolt-on extra to the Facebook brand. It is made easy and convenient for members to broadcast at the touch of a button. Most ucc sites have been start-ups or non-commercial ventures of enthusiasts, but commercial firms are now playing an increasing role in supporting, hosting, searching, curating, aggregating, filtering and diffusing ucc. Indeed, these activities define much of these sites' functioning.

It is easy to see the concentrations of use of this feature. It is also a commentary on the place, relevance and role of 'professional' production values and ethics. We are all tutored, indirectly, in what is 'right' and 'good' television. Given the paucity of compelling programming, of narrative, of story, it becomes more like the experience of a human zoo than what we commonly relate to as the television experience, as citizen broadcasters struggle to go beyond the statement "I am here, this is my room, what can I do now that someone is watching me through my phone?" We begin to realise something of the creative power of all those who are listed in a film's credits, the vast creative, economic, social and technical undertaking which translates into the latest blockbuster or broadcast television show. Defining an economic value-chain for ucc is thus more difficult than for the typical media distribution model.

Most models are still in flux, and revenue generation for content creators or commercial firms (e.g. media companies) is only now beginning. Different ucc types (e.g. blogs, video content) have different although similar approaches to monetising ucc. There are five basic models:
i) Voluntary contributions,
ii) Charging viewers for services, via pay-per-item or subscription models, including bundling with existing subscriptions,
iii) Advertising-based models,
iv) Licensing of content and technology to third parties, and
v) Selling goods and services to the community (monetising the audience via online sales).
These models can also remunerate creators, either by sharing revenues or by direct payments from other users. As mentioned, in contrast to the traditional media producers such as broadcasters and production companies, which were set up specifically to work in the manufacture and sale of creative goods, the easy availability of broadcasting, video and sound recording and editing, and latterly of website design, reflects the democratisation of media production. It is made possible through new technologies that are accessible and affordable and include digital video, blogging, podcasting, mobile phone photography and, of course, wikis. Prominent examples of websites based on user-generated content include Flickr, YouTube and Wikipedia. The advent of user-generated content marks a shift among media organisations from creating online content to creating the facilities and framework for non-media professionals (i.e. the public) to publish their own content in prominent places. The four drivers of this are:
1. Technological (e.g. more widespread broadband uptake, new web technologies which facilitate the posting, rating and aggregation of data).
2. Social (e.g. demographic factors, attitudes towards privacy).
3. Economic (e.g. the increased commercial involvement of internet and media firms in the hosting of ucc).
4. Legal (e.g. the rise of more flexible licensing schemes).
User-created content is a phenomenon with major social implications.

Changes in the way users produce, distribute, access and re-use information and entertainment give rise to increased user autonomy, increased participation, increased diversity and increased creativity. The use of such technology also provides people with improved ICT skills. As an open platform, ucc increases the free flow of information and freedom of expression, as well as enriching political and societal debates and broadening diversity of opinion. User-created content has also emerged as an important area of economic activity. Ucc is redefining the role of those industries that traditionally supply content. Some major media and internet companies have been quick to see the potential financial opportunities. This is indicated by some high-profile acquisitions: YouTube by Google for USD 1.65 billion (2006), and MySpace by News Corporation for USD 580 million (2005). Different business models have emerged for ucc, some of which are based on advertising, voluntary contributions, or charging viewers for services. Finally, ucc raises a host of new policy questions. Apart from standard policy issues such as ensuring widespread broadband access and innovation, questions are emerging around whether and how governments should support ucc. It is important, for example, to maintain a competitive playing field among ucc hosting companies. Other policy issues of concern relate to intellectual property rights, content quality and accuracy, issues surrounding privacy and identity theft, and regulatory questions arising from the increasing prevalence of virtual worlds. One report notes that ucc sites have exploded in popularity but have limited potential for online advertising.

Drawing on both online and offline participant observation, as well as experimental ways of writing culture, my doctoral project seeks to open new ways of conducting sociological research that firmly position our work within the embodied and situated practices of everyday life. Through recombinant textual strategies that encourage listening over telling, and often description over explanation, my dissertation presents a multivocal and multiperspectival account as a pleated or layered text. Weaving together theoretical and analytical discussion with scholarly quotes, questionnaire and interview excerpts, blog posts, news stories and personal reflections, it invites readers to join in these entangled differences, as active producers of their own knowledge rather than as passive consumers of academic wisdom. I argue that the validity and value of such an approach may be found precisely in its ability to avoid presenting a single voice or point of view, and to ask more questions than it provides answers.

Just as there are no platonic solids found in nature, so there are no perfect products. Material and functional constraints and limitations always have to meet with human abilities to compensate, compromise and generously accommodate design shortcomings. While I may love my smartphone and what it does for me, I hate it running out of power, signal, credit (if you are pay-as-you-go), data, memory, or processing ability. I may dislike the way I cannot see the screen in the sun; I may dislike the fact that no matter how small they make them, it is still too bulky in my pocket. But despite all that, to reiterate what was stated in the opening to this chapter, I would still rather have one than not. I have become accustomed to its use.

In developing new solutions, contradictions may arise. The TRIZ problem-solving method views a contradiction as a simple clash of solutions (Gadd, 2011). Either we want opposite solutions, or, by introducing a new solution, i.e. an improving change to one feature in a system, another feature in our system has got worse. Engineers recognise contradictions as familiar situations: when we improve strength by adding more material, for example, we find that this solution often makes weight worse.

“Contradictions can also be the need for opposite benefits which are achieved
with opposite features or functions - an everyday item such as an umbrella
has the benefits of being both large and small. there are many situations of
wanting opposites such as white and black - TRIZ shows us all the way to
achieve such opposite benefits.” (Gadd, ibid, p.98) 167
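TRIZ practitioners typically tabulate such clashes in a contradiction matrix that maps an improving feature and a worsening feature onto candidate inventive principles. The sketch below is purely illustrative: the feature pair and the principle numbers shown are placeholders, not an excerpt of the published 39-by-39 matrix.

```python
# Illustrative only: placeholder excerpt, not the real TRIZ contradiction matrix.
INVENTIVE_PRINCIPLES = {
    1: "Segmentation",
    15: "Dynamism",
    40: "Composite materials",
}

CONTRADICTION_MATRIX = {
    # improve strength without making weight worse (placeholder principle ids)
    ("strength", "weight"): [1, 15, 40],
}

def suggest(improving, worsening):
    """Look up candidate inventive principles for a clash between two features."""
    ids = CONTRADICTION_MATRIX.get((improving, worsening), [])
    return [INVENTIVE_PRINCIPLES[i] for i in ids]

print(suggest("strength", "weight"))  # ['Segmentation', 'Dynamism', 'Composite materials']
```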

The traditional area of application of TRIZ is in technical and engineering problems, i.e. technical systems and technological processes. However, it is also now being applied to 'softer' non-technical problem areas such as management, public relations and investment (Savransky, 2000). Gummesson (1995, pp. 250–51) states the following: "…The shift in focus to services is a shift from the means and the producer perspective to the utilization and the customer perspective."

Some decisions and avenues of design or user activity seem to defy logic, certainly technical or design logic. Many are subject to other influences, i.e. what is to hand, prior beliefs, experience using similar or related devices, lack of finance, the desire to
167
Gadd, Karen (2011) TRIZ for Engineers: Enabling Inventive Problem Solving. John Wiley & Sons.

show off etc. This is not easy to design for, but it can be incredibly easy to create.

For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer. John von Neumann stated flatly in The Computer and the Brain (1958) that the function of the human nervous system is 'prima facie digital', drawing parallel after parallel between the components of the computing machines of the day and the components of the human brain. This has permeated through design to what we use. Entrenched in the backwaters of designer "models of the user" is therefore the "mind as computer" metaphor. It derives largely from the information processing (IP) metaphor of human intelligence in cognitive science, wherever that entails the reduction of human meaning creation to some computational mechanism. ELIZA is a famous dialogue program written in 1966 by Joseph Weizenbaum at MIT168 largely as a hoax, a bit of fun. It is often used as an example either of the gullibility of people or of how intimate relations are not based upon complex ideas and ethics, such as empathy.

ELIZA's name is derived from the famous working-class flower girl in George Bernard Shaw's play Pygmalion, who, to our amusement, is taught to speak with an upper-class accent. More than a take on the class system, or some vague kind of social experiment, Weizenbaum's program was functionally and communicatively based upon the psychotherapeutic style of Carl Rogers.

Empathy, from a client-centred standpoint, is the counsellor's ability to understand the client's emotional experience 'as if' they were the client, that is, from the client's own understanding of the world. The 'as if' is important. As counsellors we need to be aware of our own standpoint, our own view of the world, and of the emotional and logical response that an

168
Weizenbaum, J. (1966) 'ELIZA – A computer program for the study of natural language communication between man and machine', Communications of the ACM, 9(1), pp. 36–45.

experience similar to the client's would provoke in us, in order to be able to put that response to one side and respond to and from the client's experience and emotional and logical
response. Indeed, ELIZA requires remarkably little code (it can be written in 200-270 lines of the BASIC programming language). By simple pattern recognition and the substitution of key words into canned phrases, it could evoke responses in this Rogerian style. According to Weizenbaum, ELIZA's language abilities could be "incrementally improved" by various users. It was so convincing, however, that there are many anecdotes about people becoming very emotionally caught up in dealing with ELIZA. As it probed and prompted with open-ended questions, it encouraged the user to discuss his or her emotions. All this was due to people's tendency to attach to words meanings which the computer never put there.169 Weizenbaum reported that "some subjects have been very hard to convince that ELIZA (with its present script) is not human."
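To show just how little machinery is involved, here is a minimal, hypothetical sketch in Python (not Weizenbaum's original script) of the keyword-and-canned-phrase technique described above.

```python
import random
import re

# Swap first- and second-person words so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "your": "my", "you": "I"}

# A few illustrative rules: a keyword pattern and some canned responses.
RULES = [
    (r"\bi need (.+)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"\bi am (.+)",   ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"\bmy (\w+)",    ["Tell me more about your {0}.", "Why does your {0} concern you?"]),
]
DEFAULTS = ["Please go on.", "How does that make you feel?", "I see."]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def reply(utterance):
    """Pattern-match keywords and substitute the user's own words into a canned phrase."""
    for pattern, responses in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return random.choice(responses).format(*(reflect(g) for g in match.groups()))
    return random.choice(DEFAULTS)

print(reply("I am unhappy about my job"))  # e.g. "Why do you think you are unhappy about your job?"
```

The pronoun reflection and the open-ended defaults are what give the illusion of a listener; no model of the person, or of the world, is involved.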

We can contrast this with this quote:

"I got my first glimpse of artificial intelligence on Feb. 10, 1996, at 4:45 p. m.
EST, when in the first game of my match with Deep Blue, the computer
nudged a pawn forward to a square where it could easily be captured. It was a
wonderfully and extremely human move. … Later I discovered the truth.
Deep Blue's computational powers were so great that it did in fact calculate
every possible move all the way to the actual recovery of the pawn six moves
later. The computer didn't view the pawn sacrifice as a sacrifice at all. So the
question is, if the computer makes the same move that I would make for
completely different reasons, has it made an "intelligent" move?" Garry Kasparov170

Hubert Dreyfus takes even Weizenbaum to task for not going far enough to find the ultimate dividing lines between everyday understanding and natural language communication, on the one hand, and a context-free logicality and formalizable intelligence, on the other.171


169
In Computer Power and Human Reason: From Judgment to Calculation, Weizenbaum explains the limits of computers; he wants to make clear his opinion that anthropomorphic views of computers are a reduction of the human being, and of any life form for that matter.
170
Garry Kasparov, in response to Deep Blue as a chess partner: 'The Day That I Sensed a New Kind of Intelligence', Time Magazine, March 25, 1996, p. 55.
171
Hubert L. Dreyfus, What Computers Still Can't Do: A Critique of Artificial Reason (Cambridge, Mass.: MIT Press, 1992), pp. 65ff., 219ff., and 314.
“The whole script [for ELIZA's transformations] constitutes, in a loose way, a
model of certain aspects of the world. The act of writing a script is a kind of
programming act and has all the advantages of programming, most
particularly that it clearly shows where the programmer's understanding and
command of his subject leaves off.” 172

This is why, as Sir Frederic Bartlett demonstrated in his book Remembering (1932),
no two people will repeat a story they have heard the same way and why, over time,
their recitations of the story will diverge more and more. No ‘copy’ of the story is
ever made; rather, each individual, upon hearing the story, changes to some extent –
enough so that when asked about the story later (in some cases, days, months or even
years after Bartlett first read them the story) – they can re-experience hearing the
story to some extent, although not very well (see the first drawing of the dollar bill,
above). Steven Rose pointed out in The Future of the Brain (2005), a snapshot of the
brain’s current state might also be meaningless unless we knew the entire life history
of that brain’s owner – perhaps even about the social context in which he or she was
raised. Anthony Chemero of the University of Cincinnati, the author of Radical
Embodied Cognitive Science (2009) – has now completely rejected the view that the
human brain works like a computer. The mainstream view is that we, like computers,
make sense of the world by performing computations on mental representations of it,
but Chemero and others describe another way of understanding intelligent behaviour
– as a direct interaction between organisms and their world.

Learning by doing in innovation

Learning by doing is self-explanatory as a term, but it carries two more specific definitions which are themselves related. As both are relevant to the present study it is worth sharing them. The first comes from economics (Arrow, 1964; Rosenberg, 1982; Von Hippel and Tyre, 1995; Fleck, 1994; Andersen & Lundvall, 1988), where 'learning by doing' is a concept which describes how productivity is achieved through practice, self-perfection and minor innovations. An example is a



172
Weizenbaum, “ELIZA,” pp. 39 and 43

factory that increases output by learning how to use equipment better without adding workers or investing significant amounts of capital. The other addresses the more general process of learning, in particular experiential learning, defined as "learning through reflection on doing".173 A valuable definition of reflection comes from John Dewey's (1933) description in How We Think. Dewey originally defined reflection as the ''active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends'' (p. 9).174 It has

other definitions, or is closely related to other learning concepts, such as 'hands-on', 'activity', 'adventure', 'deeper', 'free choice', 'cooperative' and 'participative' learning. MacKenzie and Wajcman (1985: p.10) describe learning by doing as "feedback from experience of use into both the design and way of operating things." Schon succinctly summarizes what he found in his studies of professionals' reflection:

There is some puzzling or troubling or interesting phenomenon with which the individual is trying to deal. As he tries to make sense of it, he also reflects on the understandings which have been implicit in his action, understandings which he surfaces, criticizes, restructures, and embodies in further action (Schon, 1983, p. 50).

Schon refers to this cycle of ''appreciation, action, and reappreciation'' as a process central to the artistry of practice.

173
Felicia, Patrick (2011). Handbook of Research on Improving Learning and Motivation. p. 1003.
174
McKenna (1999a) describes an exchange between an instructor and a student in which the absence
of specific language to talk about the skills of reflection resulted in a breakdown of communication
and learning. In this case, the instructor kept insisting that the student reflect ‘‘deeper,’’ while the
student struggled to figure out what ‘‘deeper’’ meant. “There is an understandable tendency in the
thinking about reflection to avoid being so specific in describing the process of reflection that it
becomes constrained and systematized. In an effort to argue against a technical rationality view of
practice, this caution is warranted; it is, however, difficult for novices to learn what their instructors
fail to describe.” [J.K. Jay, K.L. Johnson / Teaching and Teacher Education 18 (2002) 73–85] “look
back on events, make judgments about them, and alter their teaching behaviors in light of craft,
research, and ethical knowledge” (p. 70)

The context of software use necessarily involves the values of a user and a task. So it refers to the kinds of reflection that designers can engage in when they develop interfaces, services and artefacts, and the kinds of reflection that users employ when they use software and devices to conduct practices in everyday life. But prototypes, models and tests are simulations – simulations of what would or will happen in real life, real usage, real implementation. They are by nature limited and false, designed and contrived – consisting largely, to paraphrase von Neumann, of "…bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do." "Simulation elements selectively represent objects or situations, and selectively represent user interaction. Simulation elements enable discovery, experimentation, role modeling, practice, and active construction of systems, cyclical, and linear content. Which means they enable a transferability to the real world" (pp. 81-84). Will Thalheimer explains: "The first thing that makes simulations work is context alignment. The performance situation is similar to the learning situation... If you go into that situation context again, it allows you to search memory more effectively."

Klein, Gary (1998). Sources of power: How people make decisions. Cambridge, MA:
MIT Press.
Schrage, Michael (2000). Serious play: How the world's best companies simulate to
innovate. Boston: Harvard Business School Press.
Sterman, John (2000). Business dynamics: Systems thinking and modeling for a
complex world. Boston: Irwin/McGraw-Hill.

Pedagogical elements also matter: "... pedagogical elements are learning objectives, the reasons for building the simulation, and deciding what to simulate" (p. 88). "Pedagogical elements include background materials... scaffolding... diagnostic capabilities... tests and quizzes... coaching..." and more (p. 89), together with a discussion of universal truths and examples (races, chases, etc.) (pp. 92-93).

Play

In Homo Ludens,175 Johan Huizinga argues that play is not only a defining characteristic of the human being but that it is also at the root of all human culture. He claimed that play (in both representation and contest) is the basis of all myth and ritual and therefore behind all the great "forces of civilised life": law, commerce, art, literature, and science (Huizinga, 1950). Roger Caillois, in his 1958 book Man, Play, and Games, provided a number of categorisations of play which have proved highly influential in game studies.176 Caillois' concepts of paidia and ludus are defined as follows:

[Paidia is] an almost indivisible principle, common to diversion, turbulence, free improvisation, and [in which] carefree gaiety is dominant. It manifests a kind of uncontrolled fantasy that can be designated by the term paidia. [45, p.13]

[Ludus concerns] a growing tendency to bind [play] with arbitrary, imperative, and purposely tedious conventions. [45, p.13]

These categories of paidia and ludus have been adopted by many researchers as fundamental principles of play. While Jesper Juul does not use the terms explicitly, he has written on the distinction between games which emphasise emergent play (paidia) versus those which emphasise progression (ludus).177 In this case, Juul has tied the two categories to forms of game, rather than to play as interaction.

Explicit use of Caillois’ terms is also frequent. Gonzalo Frasca is a major proponent
of this understanding of play, writing on a number of occasions to connect it with

175
Huizinga, J. (1950). Homo Ludens: a study of the play element in culture. Boston: The Beacon
Press.
176
CAILLOIS, R. Man, Play, and Games. The Free Press of Glencoe, 1961.
Translated by Meyer Barash.
177
Juul, J. (2002) 'The open and the closed: Games of emergence and games of progression', in F. Mäyrä (ed.), Computer Games and Digital Cultures Conference Proceedings. Tampere University Press, pp. 323–329.

video game play specifically [96, 98, 100].178,179 As we have already discussed, Frasca is concerned with ideology in video games, and uses the concepts of paidia and ludus in order to explain how players are subtly persuaded by game designs [100]. Frasca characterises Caillois' terms as describing "the difference between 'play' and 'game'" [100]. Specifically addressing beliefs about preferable conduct, Frasca briefly considers how paidia and ludus influence players' choices. In a game premised on ludus, he suggests that "by stating a rule that defines a winning scenario, the [designer] is claiming that these goals are preferable to their opposite" [100]. A game premised on paidia might be seen as one of freedom from preferable conduct as defined by the game; Frasca notes, however, the presence of "manipulation rules" which help to define the possibilities in the game, and by which "the designer could convey his ideology by adding or leaving out manipulation rules" [100].

Donald Schon told us that the real social and economic world is populated by reflective practitioners, and this can include even whole organisations; in other words, individuals can reflect on what they are doing or have done, and so can organisations. It becomes a part of their repertoire of scripts as individuals, and part of the organisational culture or mythos. Schon concludes that "the reflection-in-action of managers is distinctive in that they operate in an organizational context and deal with organizational phenomena," which involves a "system of games and norms which both guide and limit the directions of organizational inquiry" (1983, p.265).

These entities often look for heuristics, or otherwise engage in activities on how to proceed from where they are to where they want to go, the teleology of their product, service or task. They want to reduce risk in their endeavours, and it is not unreasonable that they believe they can do this by following recipes, using tried and tested ingredients or components, or following recommendations coming from experts. In the absence of this, they give it their best shot, and in a world of imperfect
178
FRASCA, G. Ludology meets narratology: Similitude and differences between (video) games and narrative. Available online at Ludology.org (http://www.ludology.org/articles/ludology.htm), 1999. Last accessed 1 October 2007. Finnish version originally published in Parnasso #3, Helsinki, 1999.
179
FRASCA, G. Videogames of the oppressed: Videogames as a means for critical thinking and debate. Master's thesis, Georgia Institute of Technology, 2001. Available online at http://www.ludology.org/articles/thesis/, last accessed 1 October 2007.

knowledge, they use their own hunches, past experiences or gut feeling and intuition.
Opportunities and predicaments can also arise which are not of their own making, which may not accord with their plans, or which do not come under their direct control. Such goings-on can appear like a jack-in-the-box, totally unsettling plans and directions and placing new pressures and conditions upon business as usual and the homeostasis of everyday
life. They can be threats or opportunities. It is desirable to develop enhanced
capability to discover and correct errors before they escalate to crisis level, such as a
costly post-manufacturing product recall or even an industrial disaster. It is one
motivation for testing and re-testing of products aimed at the market and for a
succession of prescriptive management processes and philosophies aimed at
structuring innovation and development. Another is simply understanding the
technology better, better realising its potential uses, gaining valuable marketing and
design insight such as how features and functions may be received by consumer-
users, including possible leeway for technical or human errors.

Contradictory aims and objectives and unexpected happenings characterise everyday life, and they most certainly characterise projects within firms, especially innovative and exploratory projects. Consider the character of human-computer interactions as typically involving two participants making moves "for completely different reasons." One participant can be viewed as the designer; the other 'participant' can be the user. One outcome can be viewed with respect to the desired task at hand, and the other with respect to what the device or service does. One concerns the limits of what people can hear, see, touch and manipulate, and the interfaces to system and device use. What results is not so much a discussion of the concepts of anthropomorphism, "the projection of human traits onto the nonhuman," and anthropocentrism, "all nature is constructed for human ends," i.e. the "intelligence" of computers, as an analysis of the losses of meaning humans commonly experience in interaction with computers.

An assumption of game theory is that there is an availability of pre-defined outcomes. Another, in the case of zero-sum games, is that the overall outcome for all players sums to zero at the end of the game. When we think about game theory, one concern is for equilibrium. What will be the equilibrium set of behaviours? Will there be any equilibrium? How can cooperative behaviours evolve? This defines such work, as does an elevated risk of failure or compromise for the entire project, or for particular members. The more 'radical' the innovation, the more chance of inviting the unexpected, the unforeseen, the 'black swan', and the more risk.

“Think of the terrorist attack of September 11, 2001: had the risk been
reasonably conceivable on September 10, it would not have happened. If such
a possibility were deemed worthy of attention, fighter planes would have
circled the sky above the twin towers, airplanes would have had locked
bulletproof doors, and the attack would not have taken place, period.
Something else might have taken place. What? I don’t know. Isn’t it strange
to see an event happening precisely because it was not supposed to happen?
What kind of defense do we have against that? Whatever you come to know
(that New York is an easy terrorist target, for instance) may become
inconsequential if your enemy knows that you know it. It may be odd to
realize that, in such a strategic game, what you know can be truly
inconsequential.” (Taleb 2007, p. x)

What does it mean for a risk to be reasonably conceivable? Can risk assessment identify black swans? And how shall we confront such events, i.e. how shall we manage the risks related to black swans? This is one reason that draws firms or individuals to align themselves with others (e.g. Molina, 2003): to share risk, to share the costs of exploration, and to achieve mutually beneficial ends (i.e. a working system through which they can conduct their business).

Game theory carries with it assumptions. One assumption is that a player can adopt multiple strategies for solving a problem. By working with both changes to technology and knowledge of human factors, one can reduce the tension of slow processing by showing users system activity: for example, a watch or hourglass cursor may be displayed when the system is busy reading a file, or a barber's-pole progress bar turns to reassure the user that the system has not crashed. This may be preferable, and more cost-effective to produce, than an attempt to speed up the processing. Who wouldn't like their computer to start up like a television, as fast as the touch of a button?
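A common, inexpensive version of this trick, sketched here hypothetically rather than as any particular toolkit's API, is to run the slow work in a background thread while an activity indicator keeps moving, so the user can see that the system has not crashed.

```python
import itertools
import sys
import threading
import time

def long_task():
    time.sleep(3)          # stands in for reading a large file, etc.

done = threading.Event()

def spinner():
    # the moving indicator reassures the user while the real work runs elsewhere
    for ch in itertools.cycle("|/-\\"):
        if done.is_set():
            break
        sys.stdout.write(f"\rworking {ch}")
        sys.stdout.flush()
        time.sleep(0.1)
    sys.stdout.write("\rdone.      \n")

worker = threading.Thread(target=long_task)
indicator = threading.Thread(target=spinner)
worker.start()
indicator.start()
worker.join()
done.set()
indicator.join()
```

Nothing about the underlying task gets faster; only the user's experience of waiting changes.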

And a final assumption is that all players in the game are aware of the game rules as well as the outcomes of other players, and that players take rational decisions to increase their profit. The alliances which were formed in this study focussed upon a joint effort to create a technology and media system. The outcomes could not be classed as zero-sum, as clearly some players would conceivably receive much more out of the experience than would others. Some risked more than others. Also, the game's rules were not clearly defined. Attempts to organise and create managerial and strategic structures were continually challenged by a lack of definition and clearly defined outcomes. Even the core functioning of what the system provided went through considerable change, as did other key aspects such as the physical and graphical interfaces. The endeavour was by nature exploratory and experimental, and the degree to which it was 'different' varied between organisations, as some (e.g. the set-top box computer company) were more used to pressurised innovation environments than others (e.g. the cable TV service provider). Even the general understanding of the project and its orientations differed; again this came down to what can be described as company culture, the manner and means through which they typically conducted their business, and the way they viewed the market and their customer base. Abolafia (2010)180 has described the work done by policy-making groups when they make sense of complex and ambiguous environments whilst trying to remain consistent and faithful to the group's institutionalized operating model. He identifies a social process of pattern recognition involving abduction, the comparison of culturally approved models to the current conditions to establish relevant facts and events; plotting, the reordering of those facts and events into a plausible narrative; and selective retention, the collective negotiation of a policy choice that fits the emerging narrative.

What did draw them in, what galvanised and conjoined their action and development
effort was a clear notion that this system was the future of television, and possibly
the future of domestic mass digital communications.

Design by committee

180
Abolafia, M.Y. (2010) 'Narrative Construction as Sensemaking: How a Central Bank Thinks', Organization Studies, 31(3), pp. 349–367.

Situated around a table, an objective or object – a device, an artefact, a book, a football team watching a strategy on a whiteboard, a service, a government, a building, a television programme – are those who fund, design, regulate and produce, so as to bring it to fruition in the first place. They constitute the social and organisational network or constituency which, working in concert, consecutively or concurrently supports and shapes technical and creative development. Eventually there are those who, amongst other activities, consume, subscribe, use, view, follow, listen, watch, advocate or vote. Making these two social groups somehow binary may be misleading, as it is obvious that designers and producers are consumers and users of products and services as well in their everyday life. Meanwhile, consumers and users of products may also include industrial or commercial users, or even designers who buy off-the-shelf or ready-made components from other firms, so-called business-to-business customers who appropriate things like video cards or printed circuit boards, which they incorporate into their product.

Nevertheless, for the sake of argument we can say that on the designer and producer side of the equation the imperative is obviously to make money through profits, salaries, commissions, licensing etc. If it is a politician we speak of, the aim is to get elected by saying the things that he and his advisors and publicists believe voters want to hear or will tolerate, and by having the financial support to produce the dominant media campaign. This does not rule out intrinsic motivation, the joy of doing a good job for its own sake, solving problems as if they were puzzles, the ecstasy or 'ego trip' of winning an election and so forth.

On the consumer-user-voter-reader side are those often looking for something quite different; they want entertainment, information, knowledge, better living standards, lower taxes and so on, and they are often willing to pay or sacrifice for this, or otherwise to show or manifest some sort of allegiance or relationship, from brand loyalty to joining the political party and voting. As Fogg et al. (2002) note, the diffusion of computers has led to new uses for interactive technology, including the use of computers to change people's attitudes and behaviour – in a word, persuasion.

There are markedly different ways in which this can be achieved, provided and obtained: for example, news via television, the internet, radio or newspapers, or playing board games as opposed to video games. But we begin to see, even in the example of news: do people really employ economic rationality in terms of getting news in the most cost-effective manner, or do they get it in the manner to which they are accustomed, or simply in the manner which experientially suits them better? Clearly not the former; pondering the real experience of news, other factors come into play, for it is hardly only news we watch or are exposed to in an evening's television viewing. In broadcasting it comes on as part of the procession of the day, in the same way as I read other miscellaneous articles in the newspaper of varying relevance to me and my life contexts, community and location, or the way I look in the computer records for a book in the library and, on finding its location, find many others which may be more compelling or even more relevant.

Also, styles of news presentation, even presenters, come to feature; curiosity, entertainment, seeking truth through data and reportage, inquisitiveness to see others' misfortune and our fortune: each has some role to play. The intersubjective aspect of game theory, "asking yourself what is the other man going to think I mean to do," is an important aspect of all such equations. It is reflection, for better or for worse, in the same way as poker players second-guess what the other players' cards are and what they will do with them. How do designers, in their reflection and ultimately interpretation of design briefs, component compatibilities and tolerances, and in their reflexive musings on what they want to see and realise in their designs, think of the consumer and user? This is the key question here. Schank and Abelson (1977)181 developed in some detail an idea that was critical to understanding how humans decide what to do and understand what others do. The concept of scripts was intended to account for the human ability to understand more than was being referred to explicitly in a sentence, by explaining the organization of implicit knowledge of the world that one inhabits. Related to scripts is sense-making, which is described as "a set of assumptions and propositions" about the process through which people understand and apply meaning to their everyday experiences (Dervin 1992,
181
Schank R. and R. Abelson. 1977. Scripts, Plans, Goals, and Understanding: An Inquiry Into Human
Knowledge Structures. Hillsdale, New Jersey: Lawrence Erlbaum Associates.

p. 52).

Sense-making is about filling in a gap between the familiar and the unfamiliar, and the strategies one employs in order to accomplish this. The 'ELIZA effect' has come to refer to the tendency of humans to attach associations to terms from prior experience.182

Thus, when we are told that John is eating sushi while watching television, we assume that he may be at home, sitting on a sofa or maybe even lying in bed; we know that he is probably using chopsticks and not a fork; and we can even assume that he is drinking Japanese beer. We assume these things because we know the sushi bar script. If we do not know this script, we cannot make such assumptions and thus might have difficulty understanding various sentences that refer to things we might be assumed to know. How do we come to know such scripts? The answer to this is very simple. We learn them. We learn them by practicing them over and over. We can learn them as children, by being taken to a restaurant many times and gradually learning each step of the restaurant script. We can learn them through expectation failure, by seeing an aspect of a script fail to be true (chopsticks instead of forks, for example) and explaining the difference to ourselves, creating a new Oriental restaurant addendum to that portion of the everyday restaurant script. And we can learn them as adults by, for example, going on our first airplane ride, trying to understand its script and gradually modifying our understanding with each subsequent trip.
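As a rough, hypothetical sketch (the field names are invented for illustration), a script in Schank and Abelson's sense can be thought of as a bundle of default expectations that fills in what a sentence leaves unsaid, and that is revised when an expectation fails.

```python
# A script as a bundle of default expectations, revised on expectation failure.
sushi_script = {
    "setting": "sushi bar, or sushi at home",
    "props": {"utensils": "chopsticks", "drink": "Japanese beer"},
    "scenes": ["order or prepare", "eat", "clear away"],
}

def fill_in(statement, script):
    """Return the default assumptions the script licenses for a bare statement."""
    return {"statement": statement, "setting": script["setting"], **script["props"]}

def expectation_failure(script, prop, observed):
    """Learning: when a default fails (a fork, not chopsticks), store an addendum."""
    revised = dict(script)
    revised["props"] = {**script["props"], prop: observed}
    return revised

print(fill_in("John is eating sushi", sushi_script))
western_variant = expectation_failure(sushi_script, "utensils", "fork")
```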

These examples would serve well the leitmotif of 'another kind of thinking' which stands apart from pure reasoning or, in fact, from any notion that social life and experience can be meaningfully captured and codified purely in numbers, data, inventories,

182
In the feedback on an article about Rogerian therapy, one person wrote: "It has been very frustrating
for me in the past to be in therapy and "pouring my heart out " to some one who just sits there [like a]
blow up doll. In this case you might as well go buy an inflatable therapist if they made one and talk to
it. It would be cheaper and just as effective. With this form of therapy on the other hand you know
there is a real human being at the other end. Someone who is listening and commenting. Someone
who not only acts like but is human. These therapists actually care so as to make you feel like a real
human and not a diagnosis. You are not just another label and what ever insight you bring to the table
is well appreciated. I find this to be non intimidating . When you feel completely at ease it is easier to
talk about your feelings and start to make progress.” [source:
http://www.simplypsychology.org/client-centred-therapy.html] accessed 24/7/2015

databases, scorecards and spreadsheets. 'Architectures of control' is a term most notably used by Stanford law professor Lawrence Lessig to describe the way in which systems (such as the internet) regulate and shape users' behaviour through the embedded 'code' of the system itself, orders of magnitude more powerful than any external legal regulation. People, managers, politicians and bureaucracies still have to channel the surveillance, interpret what comes under its quasi-mechanical gaze, and relate this to pre-existing ideologies – data can only become information, or wisdom, through the process of interpretation. And to interpret, one needs a moral and ethical framework. Computers, by virtue of their very operation, track and log the movement of electrons in their domain. Bits and bytes are accounted for, and system-tracking or system-logging has been realised to lend clues as to which areas may be causing difficulty, or on which particular areas a user chooses to focus:
“System-logged traces of a user's interaction with an application are a
primary source of data for automatic user modelling and for a wide range of
research and evaluation experiments. Such traces are important for these
applications for two reasons. Firstly, and primarily, in many instances this
trace information is the only data available. This is particularly the case in
automatic user modelling, where the user is working freely, observed only by
the system itself. In the case of experiments, other data may be available,
such as video or audio recordings or transcripts of conversation, yet these
records serve to annotate rather than to replace the on-line trace. The on-line
trace remains the most immediate and accurate record of the user's actions …
The second reason why traces are important is the ease with which they can
be collected. Automatic logging can take place unobtrusively in the
background of an application, and can record a wide range of data including
all input, output, display details, state changes and temporal information.
Unlike the other methods mentioned, it does not require the constant attention
of an experimenter and can therefore record `normal' as well as experimental
working patterns. …However these advantages may also create problems in
the use of trace information; in particular it can be difficult to interpret the
traces correctly since the context of an action cannot be observed directly and
must therefore be inferred.” 183

Such a view, translated to today, would refer to 'big data' and 'behavioural analytics'. This would suggest that your own online behaviour, search choices etc. automatically, or rather subconsciously, reveal your predilections and preferences through induction and inference.
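As a minimal, hypothetical illustration of the system-logging the quotation describes (the event names and fields are invented for the example), a client might record a timestamped trace of user actions and later make crude inferences about where attention dwells; as the quotation warns, interpreting such traces still requires context.

```python
import json
import time
from collections import Counter

class InteractionLogger:
    """Unobtrusively record a timestamped trace of user actions."""
    def __init__(self):
        self.trace = []

    def log(self, action, **details):
        # every entry keeps the action name, a timestamp and free-form context
        self.trace.append({"t": time.time(), "action": action, **details})

    def dwell_counts(self):
        """Crude inference: which screens attract the most events?"""
        return Counter(entry.get("screen", "unknown") for entry in self.trace)

    def dump(self, path):
        with open(path, "w") as f:
            json.dump(self.trace, f, indent=2)

logger = InteractionLogger()
logger.log("open", screen="guide")
logger.log("scroll", screen="guide", distance=300)
logger.log("select", screen="player", item="news")
print(logger.dwell_counts())   # e.g. Counter({'guide': 2, 'player': 1})
```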

183
A. Dix, J. Finlay and R. Beale (1992). Analysis of user behaviour as time series.
Proceedings of HCI'92: People and Computers VII, Eds. A. Monk, D. Diaper and M. Harrison.
Cambridge University Press. pp.429-444.

"Consider this dilemma: while our technological abilities to generate and
disseminate potentially useful data have increased many fold in the past few
years, man's physical capacity to register and to process potentially
informative data has probably increased very little, if indeed at all. The sheer
volume of data that crosses the typical executive's desk today should serve to
spotlight the inadequacies of the education and development of our
acquisition strategies and practices. But no gain in ability could offset the
widening gap between the exponentially-increasing quantity of data available
for consumption and man's very limited capacity for acquiring and processing
useful information.” (Thayer, 1968, p.202) 184

Langdon Winner (1996), in his article about the electronic office, claims that low-
status, primarily clerical workers are highly regimented and their work is tightly
monitored and controlled through computerized control systems:
"As they enter an electronic office or factory, they become the objects of top-
down managerial control, required to take orders, complete finite tasks, and
perform according to a set of standard productivity measures. Facing them is
a structure that incorporates the authoritarianism of the industrial workplace
and augments its power in ingenious ways. No longer are the Taylorist time-
and-motion measurements limited by an awkward stopwatch carried from
place to place by a wandering manager. Now workers' motions can be
ubiquitously monitored in units calculable to the near microsecond." (Winner,
1996, p.83).

Such surveillance was in place in the IT company presented in the case study later;
suffice it to say here that command and control extends into wider society and
everyday life, not unlike the visions realised in the film Minority Report (2002).
The actor Tom Cruise, who was by all accounts deeply involved in Fox Studios’
development of the film, discussed this in an interview in the June 16,
2002 Boston Sunday Globe, explaining the film’s attempt at future
social critique:
"Look where we are right now," Cruise says. "Cameras in the streets. On the
Internet. You know 'Hey, this person went to this site, this site, this site.' They
know what you like, where you go, they know all that stuff. But it's like
anything. There are so many benefits for people. On the other hand, it can be
abused. You've got to have a watchdog watch the watchdog watch the
watchdog." Then Cruise makes a leap, linking the issues in the film to the
issues of the day. The idea of arresting people before they commit crimes
may have seemed outlandish a few years ago when Minority Report was
being developed. But now, faced with a constant stream of terrorist threats,
security alerts, and investigative breakdowns of epic proportions, we find
ourselves perched on the threshold of a Minority Report society. A place
where suspects are held because they may be connected to terror plots. ...”

184
Thayer, L. (1968) Communication and Communication Systems in Organization, Management, and
Interpersonal Relations. Irwin, p. 202.

To aid in envisioning this future, the director Steven Spielberg assembled a think
tank of futurists, M.I.T. scientists such as John Underkoffler, urban planners,
architects, inventors and writers (such as Generation X author Douglas Coupland):

“The gradual loss of privacy was a near unanimous prediction. "The reason is
not so people can spy on you," explains [screenwriter Scott] Frank," but so
they can sell to you. In the not too distant future, it is plausible that by
scanning your eyes, your whereabouts will be tracked. They will keep track
of what you buy, so they can keep on selling to you … George Orwell's
prophecy really comes true, not in the twentieth century but in the twenty-
first," the director explains. "Big brother is watching us now and what little
privacy we have will completely evaporate in twenty or thirty years, because
technology will be able to see through walls, through rooftops, into the very
privacy of our personal lives, into the sanctuary of our families.”"185

Just as there are those who predict that the Internet will liberate relationships and
engender community, there are those who view online relationships as shallow,
impersonal, and often hostile, what Kling (1996a) describes as the "comparably
dark" dystopianism which views technology as a vehicle to exacerbate human
suffering.

Karl Popper's metaphor of clocks and clouds describes the two ends of the spectrum
of determinism and predictability in social science: clouds represent the disorderly
and irregular, the chaotic and emergent, and clocks represent the predictable and rational,
the controllable and cyclical. Clocks are manmade and very determinate systems –
we can dismantle them, study them, understand how they work, reengineer and put
them back together again. Clouds on the other hand are very indeterminate. We
cannot really understand them by taking them apart, we have to understand and study
the ‘system’ as a whole if we want to glean any valuable information. Between
clocks and clouds, there is a whole scale of things that are more or less
determinate/indeterminate.186 In a sense this also relates to the modes in which we
‘understand’ managerially or policy-relevant ‘data’ (clocks, work, man-made, linear
systems, rationality) as opposed to entertaining games, literature and movies (clouds,
play, nature, organic systems, intuition and feel).

185
http://www.cinemareview.com/production.asp?prodid=1750

Jerome Bruner tells us that storytelling or constructing narratives lends us the ability
to communicate beyond that of logic or inductive arguments, and that “world making”
is the principal function of mind, whether in the sciences or the arts (Bruner, 1986).
It also affects attitudes in management and
perceptions in business. Managers involved in developing new applications and
hardware have to be driven by some sort of rationalisation regarding the use of their
product and, even by extension, its users. These rationalisations, however, may be
more clouds than clocks. To have an identity as a person, says the philosopher Marya
Schechtman, is "to have a narrative self-conception... to experience the events in
one's life as interpreted through one's sense of one's own life story". A list of other
notables, including Alasdair MacIntyre, Charles Taylor, Paul Ricoeur, Daniel
Dennett and many others, supports the further claim that we create or invent the self
specifically by "writing" and "storying" it. The existentialist philosopher Jean Paul
Sartre remarked that “A man is always a teller of stories. He lives surrounded by his
own stories and those of other people. He sees everything that happens to him in
terms of stories, and he tries to live his life as if he were recounting it.” 187 We can
project our own story and narratives, replete with our biases, preferences and
prejudices, on to others unwittingly or semi-consciously. The problem only arises
when the logics therein do not graft or map fully and people fail to ‘see’ what the
designer saw, or even have a poor or reduced experience of use. While the human
elements are focussed upon, the humanistic values seem somewhat missing. To
highlight this, Anthony Dunne in Hertzian Tales (1999) shows how design is
critical for examining culture and ethics. He is frustrated that designers do not do
more with developing electronic products and wishes that products were more
socially beneficial, rather than focussed upon purely business or technical benefits.
186
source: http://www.the-rathouse.com/2011/Clouds-and-Clocks.html#sthash.joaujjhr.dpuf
187
Sartre 1964, cited in Bruner 1987, p. 5.
Sartre, Jean-Paul 1964, The Words, Braziller, New York. Bruner, J. 1987, ‘Life as Narrative’, Social
Research, 54, pp. 1–17

Dunne talks about how electronic objects should be ‘poetic’, ‘surprise’ people, and
encourage ‘skepticism’. Even today, social networking hardly does this.

In the mid-1990s Wes Sharrock and Bob Anderson produced an important paper
which was aimed not at an instrumental view of design, but rather at how users enter
into design reflections and interpretations. They spoke most evocatively of ‘The user
as a scenic feature of the design space’ (Sharrock and Anderson, 1994). They put
forward that:

“… differently schooled participants will see the object of design differently
according to their special interests. Each will work out the design task
according to their design task, relying on different kinds of models, theories,
tools, constraints. Each will work within a different 'object world': a world of
technical specialization, with its own dialect, system of symbols, metaphors
and models, instruments and craft sensitivities.” (p.7)188

In particular they drew attention to how: “The user is introduced into design through
the use of typificatory structures.” Throughout the creative and innovation
process, in the battle to make things join and fit together, to get things to work and
operate, engineers, designers and their managers are haunted by ‘the user’ and how he
or she will react to the characteristics, features and functions they develop and add to the
repertoire of what something does. Engineers, designers and their managers must
employ this thinking, as they know they are designing for ‘others’, even ‘others’ who
may be quite unlike themselves in taste, in interests, and in tolerance of unwieldy
functioning. In The Power of Interest for Motivation and Engagement (2015),
K. Ann Renninger and Suzanne Hidi draw on research demonstrating that people’s
interest in something depends on the quality of their experiences with it, and that
people at all ages can be supported to develop new interests, and that it is a benefit to
them to do so. As they explain, “interest has repeatedly been found to positively
influence attention, goal setting, and learning strategies. The development of interest
is related to the reward circuitry of the brain; once interest is supported to develop,
the search behaviors that are associated with gathering and understanding new
information become rewarding.”189

188
Sharrock, W. and Anderson, B. (1994) ‘The user as a scenic feature of the design space’,
Design Studies, Vol 15, No 1, January 1994, pp. 5-18

Sharrock and Anderson quote Donald Schon in his
assertion of the rules of the ‘game’ of design:
“We believe that the rules employed in design reasoning are derived from
types. As rules of law are derived from juridical precedents, so design rules
are derived from types, and may be subjected to test and criticism by
reference to them. Moreover, a designer's ability to apply a rule correctly
depends on familiarity with an underlying type, by reference to which the
designer judges whether the rule 'fits the case' and fills the inevitable gap
between the relatively abstract rule and the concrete context of its application
as a type. There is a two-way interaction between design types and design
worlds. On the one hand, elements of a design world may be assembled to
produce an artefact that comes to function - either in the practice of an
individual designer or a larger design culture - as a type ... On the other hand,
the direction of causality may be reversed. A vernacular building type ... may
'loosen up' to provide the furniture of the design world.”190

Types play a large role in the lifeworld of a designer and engineer. Of course they
feature strongly in the design of software or hardware. Designers or engineers
building ICT electronic devices make selections of components, i.e. resistors or
capacitors of certain specifications (indicated by different colour coding or writing
on the casing). Designing or programming is a matter of judiciously selecting,
sometimes trying, one component over another in order to see which fits better given
a certain desired outcome. But experience has grouped particular generic functions
which can become a building block in an electronic circuit. In software terms, in the
class-based object-oriented programming paradigm, an "object" refers to a particular
instance of a class, and can combine variables, functions and
data structures. It is these structures, treated more like devices, that are incorporated
into a program, rather than single lines of code being written from scratch, which would be
prohibitively time-consuming and monotonous. An object can be used as a building block in software
development in the same way that a microprocessor, an entire circuit in itself,
is incorporated into a device. The microprocessor, the processing heart of all ICTs, was
developed as a single integrated circuit chip containing millions of very small
components including transistors, resistors, and diodes.
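
As a concrete, if hypothetical, illustration of the point above, the short Python sketch below shows an object acting as a reusable 'building block': the class bundles a variable (state) with a function (behaviour), and instances are picked and combined much like components from a parts tray. The class name and the values used are invented purely for illustration.

class Resistor:
    """A toy software 'component': data and behaviour packaged as one unit."""

    def __init__(self, ohms):
        self.ohms = ohms  # a variable (state) carried by every instance

    def voltage_drop(self, amps):
        # a function (behaviour) travelling with its data: Ohm's law, V = I * R
        return amps * self.ohms

# Instances are reused and combined rather than re-written line by line.
divider = [Resistor(220), Resistor(470)]
print(sum(r.ohms for r in divider))              # total resistance in series
print([r.voltage_drop(0.01) for r in divider])   # drop across each at 10 mA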

189
The Power of Interest for Motivation and Engagement Hardcover – 15 Dec 2015
by K Ann Renninger (Author), Suzanne Hidi (Author) London: Routledge
190
Schon, D (1988) 'Designing: rules, types and worlds' Design Studies Vol 9 No 3 pp. 181-90 quoted
in Sharrock and Anderson (ibid, p.8).

Clearly the typificatory structures or imagination employed by designers for their
technical work is very different indeed to that employed when they resurrect ‘the
user’. But the ability to balance the proper and adequate functioning of the device at
hand, specified by the design brief, with the experience of use by an imagined
novice user, is a craft skill in the design process. The next level
would be of course to test these assumptions, thus the usability test. Jakob Nielsen
perhaps defines the orthodox usability principles that dominated and guided studies
at this time. These were learnability, efficiency, memorability, error prevention, and
satisfaction191. Of these five principles, only one straightforwardly applies to video
games: satisfaction. Video games are frequently difficult to learn (complex control
systems), demand inefficient solutions to problems (crossing vast territories
repeatedly), challenge the player’s memory (including explicit tests of memory), and
push players into errors intentionally (mistimed jumps, death, and so on). Further, all
of these “unusable” aspects of video games are in the name of fun.

Decision making in design and innovation

In the explanation given above, it is clear how the previous and proven functions of
component elements such as microprocessors and objects in object-oriented
programming shorten development times for new uses, and provide a reliable,
bug-free basis which also lessens the risk in projects. However, they also clearly
reduce the advent of radically new innovations which would require, say, new
programming languages or radical modification to the basic specifications and
functions of the clusters of components. As such, microprocessors become ‘the
received’, or existing, paradigm through which work is committed, and they
must try to remain as generic as possible, dealing for instance with different issues of
control, circuit power, memory and so on.

This situation makes for the path-dependencies and socioeconomic paradigms which
perpetuate the status quo, not necessarily abating innovative activity, which would be
a pessimistic attitude, but rather, more optimistically, providing foundation,
enablement and short-cuts.

191
NIELSEN, J., AND MACK, R. L., Eds. Usability Inspection Methods. John Wiley & Sons, Inc.,
1994.

The creation of such devices ultimately channels what comes
forth while also lending dependency, substance and direction. In an early paper on
this subject David (1975) found ample economic reason for path-dependency in
technological development. Dosi (1982) put this most forcibly: "The history of
technology is contextual to the history of the industrial structures associated with that
technology." (Dosi, 1982: p.147) Hughes (1983) speaks of the accumulation of
knowledge, funding, educational resources and so forth that propel a system
technology towards institutionalisation, apparently under its own 'momentum'. This
context or constituency of social, economic and knowledge elements was expanded
upon by Rycroft and Kash (1999: p. 162) when they suggested there are at least five
major sources of path-dependence namely, culture, organisational learning,
technological compatibility, standards-setting processes, and industry strategy and
public policy. Nathan Rosenberg (1994) put forward that the frameworks which are
developed by major innovations, such as microprocessors and new programming
languages, constitute the initiation of path-dependent activities which may extend
over decades, and by which later developments cannot be understood except as part
of an historical sequence:
". . . there is always a huge overhang of technological inheritance which
exercises a powerful influence upon present and future technological
possibilities. Much technological progress at any given time, therefore, has to
be understood as the attempt to extend and further exploit certain trajectories
of improvement that are made possible by the existing stock of technological
knowledge. There are continuities of potential improvements that are
generally well understood by engineers and product designers. Expert
knowledge of the working of the vacuum tube did not provide an adequate
basis for a "discontinuous leap" to the transistor. However, once the transistor
was invented, it created a set of opportunities for further improvement by
pursuing a trajectory of miniaturization of components (including integrated
circuitry) which has occupied the attention of technical personnel for nearly
half a century." (p.16)

Again, this does not suggest the inevitability of certain innovation events and
processes, but rather that certain orientations are easier than others in terms of cost
and development and so place certain delimitations on the universe of choice.
Rosenberg also argues that technical knowledge is cumulative, and this gives rise to
a certain degree of ‘soft determinism' (i.e. Pool, 1983; Grint and Woolgar, 1997)

concerning the probable way in which a technology may develop.192

Following a similar line to both Hughes and Rosenberg are a number of others who
identify with the complex of forces that constrain and drive innovation, predicating
and biasing development towards certain directions. While maintaining specific
meanings and placing different emphases on technology social or economic factors,
they all refer to the way in which innovation activity is constrained. Dosi (1982) with
technical paradigms; Nelson and Winter (1977) with technological regimes; Sahal
(1981) with technological guide posts and Utterback (1979,1994) with dominant
design; Georgiou et al. (1986) with technological corridors - each are suggesting that
the multiplicity of choices at the onset of a development closes according to similar
forces. Put simply the suggestion is as Bucciarelli (1994: p.198) sees it, and which is
a pretext to the contexts of design and use, that: "No design begins with an absolutely
clean sheet of paper. Last year's model, a competitor's product, the availability of
new tooling, new materials . . . are situated within historical, political and cultural
contexts."

One explanation of why trajectories and path dependencies exist can be found in the
idea of absorptive capacity (Cohen and Levinthal, 1990). A firm's absorptive
capacity is directly related to the knowledge the firm possesses and its experience.
It is through absorptive capacity that a firm can understand and recognise the value
of a new technology. Thus, the existing technologies in a firm give that firm
knowledge and experience which is then used in understanding and evaluating new
technologies. A new technology that lies too far outside the firm's technological
paradigm will be difficult to understand because the firm will not have the necessary
knowledge. A similar frame of understanding can be levelled at consumer-users.
People invest time into establishing and refining behavioural routines (set allocations
of time) through the process of learning and negotiation with the institutional
frameworks that they exist in. As Shackle (1963) has it: "Each of us builds the
unique structure of his or her personal existence out of countless stereotyped patterns
of action." (p.18)

192
Heilbroner (1972/1994) recognises that technology itself is the result of social activity, and that the
nature of technological advance is responsive to social influences. However, he goes on to argue that
under certain conditions technology can even become autonomous and itself determine advance - such
as when the forces of high capitalism have been unleashed. In such periods technology is allowed to
determine progress.

Radically new products, perhaps too far removed from obvious or
familiar function, will be attractive to none but the keenest of early adopters. To
most people they will appear of little value, even of 'no use'.193

Rosenberg (1994: p.5) asserts that for firms "acquiring new information is costly,"
and this is true whether it concerns data regarding new technology, the development or
recruitment of new expertise, or learning about consumers and markets. New
knowledge may be even more costly. Sometimes it is easier to justify investment in
software expertise or new plant than in market or consumer research. Path-
dependencies, technological trajectories and the like, while both positive and negative with respect
to innovation (i.e. they can constrain as well as create the conditions for opportunity
and creativity), certainly do create distance between 'cultures of production' and
'cultures of use'. Their influence forms a considerable part of the 'technical thinking'
(Aryaya, 1995) which defines the problem-solving process of technology
development, and what Levitt referred to in his brilliant 1960 article Marketing
Myopia:
“Consumers are unpredictable, varied, fickle, stupid, shortsighted, stubborn,
and generally bothersome. This is not what the engineer-managers say, but
deep down in their consciousness it is what they believe. And this accounts
for their concentrating on what they know and what they can control, namely
product research, engineering, and production. The emphasis on production
becomes particularly attractive when the product can be made at declining
unit costs.” (p.41)

But again, it is important to stress that such myopia, using tried and tested solutions, bits
and pieces, mash-ups of what worked before, does act to constrain the universe of
possibilities of what may be available. These re-uses can short-cut development time, they
can save money, and they can ensure for consumers a succession of devices which
[sometimes] offer some logical form of improvement over previous models, versions
and so on. Thus we have successive versions of Windows, successively more powerful Intel
chips, successively more powerful video cards, and successively more graphically
powerful [and realistic] games.

193
Yves Punie is one of only a few researchers who have confronted the notion of non-use in a direct
manner. In his paper 'Imagining "non-uses": Rejections of ICTs in Flemish households' (1997) he
found that when people were asked to respond to the open-ended question why they do not own or use
particular ICTs, their main reply was that they had ‘no need’. Other factors such as ‘financial
constraints’ were represented only minimally, both in relation to the costs of the services and in
relation to their own scarce money resources.

Are interfaces, physical and graphical, going through similar reiterations, similar
successive generations of improvement? The answer is clearly no. This is an
important distinction between the design of functions and the design of use and
usages. It is something of a commentary regarding how users have to adapt to
technology and its path-dependencies, and how they do, how they can. Technology,
even those devices boasting ‘smartness’ or ‘intelligence’, hardly accommodates
people in their entirety; it only partially works for them in all their wondrous
unpredictability, individuality, and hopes, aspirations and fears. Designers and
engineers cannot simply take components off a shelf and take for granted that they will be
used in accordance with their ideas, designs if you want, regarding function. This
does not mean to say that people do not buy Bluetooth-standard QWERTY keyboards
to use with their iPads so that they can better word-process. Designers and engineers
can today use certain heuristics and ‘tips’ regarding how to craft a ‘good’ website
interface, and their choice and selection of components, colours, layout and so on may
complement the bulleted points. But it remains that they cannot design usability
outright; it remains an experience within the minds and lives of users. The focus of
design practice should be the study of people and how they live. This, however, is
not to adopt the essentially Taylorist model of ‘user-centricity’ (Shove et al 2007).
The discourse of ‘user-centered’ design has had the tendency to take the form of a
reductivist anthropology, which renders the imagined user primarily as a consumer,
or as a form of reactive mechanism constituted by their relationship to ergonomic
stimuli, which can be studied through a form of applied psychology (Almquist &
Lupton 2010; Jordan 2005)194.

Scripts
We can learn scripts as adults by, for example, going on our first airplane ride,
trying to understand its script and gradually modifying our understanding with each
subsequent trip.
194
Almquist, J. & Lupton, J. (2010) ‘Affording Meaning: Design-Oriented Research from the
Humanities and Social Sciences’, Design Issues, Vol 26, Number 1, pp. 3-14

It is this last aspect of the learning process that is most important for education.
When we go on an airplane trip for the first time, or indeed, when we do anything for
the first time, we are highly dependent upon finding a reminding, that is, finding
some prior experience that will help us understand the current situation. Reminding
is the process by which case based reasoning takes place. When we attempt to
understand anything, we do so by attempting to find something in our memories that
looks sufficiently like it so as to be helpful in processing. The reminding process
allows us to learn by causing us to constantly compare new experiences to old ones,
enabling us to make generalizations from the conjunction of the two experiences.

Now, one of two things happens during this comparison process. Either we realize
that the new experience is significantly different from the one that we have compared
it to, or we realize that it is really very much like it. (I will ignore gray, in between,
cases here.) When a new experience is found to be different from our prior closest
memory, we must create a new case for it. We can use our prior knowledge of trains
to help us out on our first airplane ride, but we soon realize that while the comparison
may have been helpful for initial processing, airplanes are cases of their own. We can
index airplanes in terms of trains, but eventually we will treat them as a new thing
entirely.

We may not know to do this initially, of course. How can we know on one airplane
ride not to treat it as a specialization of train travel? But on our tenth airplane ride,
we will cease to need that comparison. Instead, in trying to compare each airplane
ride to each other, we will have created an airplane script that predicts what airplane
rides are like in general, including information that states that one should not expect
much of a meal. This is, of course, the other aspect of the comparison process.
Finding a new experience to be a lot like an old experience allows us to build the
script.

So, we either use new cases as new material to add to our library of cases or we use
new cases to help build up our detailed script knowledge. We can, of course, decide

that our new case is of no interest whatever because it is exactly what we have
experienced many times before. In that instance, no learning occurs at all. We shall
consider further the significance of new case acquisition within the learning by doing
context later on. Now, we need to discuss further the significance of scripts in
learning by doing.
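
The reminding-and-comparison process described above can also be sketched computationally. The following is a minimal, hypothetical Python illustration in the spirit of Schank's case-based reasoning: a new experience is compared against stored cases; if it closely resembles an existing case the two are generalised towards a 'script', otherwise it is stored as a new case of its own. The feature-set representation and the similarity threshold are invented for illustration only.

def similarity(a, b):
    # Crude overlap measure between two experiences, each a set of features.
    return len(a & b) / len(a | b)

class Memory:
    """A toy case library: remind, then either generalise (script) or add a new case."""

    def __init__(self, threshold=0.5):
        self.cases = []           # each case is a set of features
        self.threshold = threshold

    def remind(self, experience):
        # Find the stored case most like the new experience, if any.
        if not self.cases:
            return None
        return max(self.cases, key=lambda c: similarity(c, experience))

    def learn(self, experience):
        closest = self.remind(experience)
        if closest is not None and similarity(closest, experience) >= self.threshold:
            # Much like an old experience: generalise towards a script by
            # keeping only the features the two experiences share.
            self.cases.remove(closest)
            self.cases.append(closest & experience)
        else:
            # Significantly different: store it as a new case of its own.
            self.cases.append(set(experience))

memory = Memory()
memory.learn({"buy ticket", "board", "seat", "meal", "long journey"})      # a train ride
memory.learn({"buy ticket", "board", "seat", "seatbelt", "no real meal"})  # first flight: stored as a new case
memory.learn({"buy ticket", "board", "seat", "seatbelt", "turbulence"})    # later flight: merges into a script
print(memory.cases)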

Changing the device, its use or its context?

Social scientists and particularly scholars in science and technology studies (STS),
and other fields, have attempted to explain more fully the origins, contexts,
contingencies or subtleties of social and material culture when things are developed
and built. Firms commit to actions and employ methods to guide and inform strategy,
design, marketing and managerial decisions.

A spider conducts operations that resemble those of a weaver, and a bee puts
to shame many an architect in the construction of her cells. But what
distinguishes the worst architect from the best of bees is this, that the
architect raises his structure in imagination before he erects it in reality. At
the end of every labour-process, we get a result that already existed in the
imagination of the labourer at its commencement.195

195
Fischer, E. (1996) How to Read Karl Marx (Monthly Review Press, 1996), p.52.

These actions and decisions rest upon epistemic, cultural and ethical foundations,
themselves grounded in methods and theories such as usability engineering, or in
management approaches and ‘fads’ such as total quality management, quality
function deployment, business process re-engineering and so forth.

In my reading of the STS work I am particularly drawn to how their stories impinge
upon ideation and communication, design and implementation. These are the stories I
find most compelling; some of them are concerned with how people in everyday life
apprehend technology, and how they negotiate the value and use of things by developing
behaviours and values and meanings for things. This would include the
aforementioned typologies, formally or informally used and applied in more ad hoc
manners as designers and system integrators go about their business. Those people
who are not involved directly with design and production, authorship, composing,
censorship, editing, crafting, making, even leading, are often given, for the purpose of
discussion and discourse, the titles of viewers, listeners, audiences, users, consumers,
subscribers, subjects, citizens and so forth. We should consider each as a loaded
term, as each presents particular roles or affordances, aspects of agency, passivity or
activity, just as devices are designed and seen to do, or are experienced. There is an
instrumental dimension which clarifies as it precipitates to the crowd or mass level.
As the sample size gets larger, it smooths out the burrs, variations and
idiosyncrasies of the individual players: from irrational individual action based
upon idiosyncrasies and individuality, to the herd mentality, the groupthink, the mass
production and mass society of big data. The rational decision-maker of the economist is
the classic straw man who behaves in rational, reified manners which belie his or her
humanity, individual circumstances, serendipity, or capricious choices.
While they may not be a pawn as such, they are objectified and incapable of true
decision-making, feelings, spirituality or anything else which defines the wider
human condition. Or the interaction choices are crafted so as to make them that way, or at
the very least to make them statistically appear that way.

That we exist, that we consciously and purposefully act upon, interact with, and
otherwise live amongst a plenitude of technologies and other consciously designed
artefacts, materials and spaces, has affected and informed human sociology, as well
as ontology and epistemology. This is surely the kernel of the agency-structure
model of Anthony Giddens (1979). Models of reality, what we know, how we learn,
how we make sense of things, and what we consider important and valuable to know and
do, have, at least since the time of human settlement, been largely shaped by ourselves
and moreover by our interpretation of the words and actions, deeds and artefacts of
other people in the conduct of everyday life and from within its physical,
symbolic and conceptual structures. It typically entails particular self-reflective
recognition of the place of the self, the individual and the wider social and physical,

then comes the abstract, the expression of the imaginary, symbolic and virtual, the
man-made, the artificial or synthetic (and thus planned and designed, the constrained
and kept, the plastic), and the natural (unplanned and found, the wilderness, nature).

“Human beings today [...] are surrounded by huge institutions we can never
penetrate: the City [London's Wall Street], the banking system, political and
advertising conglomerates, vast entertainment enterprises. They've made
themselves user friendly, but they define the tastes to which we conform.
They're rather subtle, subservient tyrants, but no less sinister for that.” 196

The field of semiotics dictates that each action, every perception, constitutes some
act of interaction between the environment of people, objects, nature and symbols
(i.e. Eco, 1976; Fiske and Hartley, 1978). According to the child developmentalist
Piaget such social learning goes through successive phases as the child grows to
adulthood. Development after birth is dependent upon the reception and mimicking
of structures and sounds which are repeated and learned, repeated and learned.
Newborns come equipped with powerful learning mechanisms that allow them to
change rapidly so they can interact increasingly effectively with their world, even if
that world is unlike the one their distant, or even more recent ancestors faced. That
these sounds and symbols have meanings is the very foundation upon which
structures make sense, become to some degree indelible in our neural pathways, and
find meaning in our experience. 197

“An experience is always what it is because of a transaction taking place between an
individual and his environment” (Dewey, 1938, p.43). [Note: this is a different view
of experience than that based upon past experience.] The process of identifying
physical objects and environments, things, words and contexts is such a fundamental
part of our experience that we seldom think about how we do it. It can be
subconscious and subliminal, as well as obvious and stated. We use our senses, of
course: we look at, feel, pick up, shake and listen to, smell, and taste objects until we
have a reference for them - then we give them a label which has been socially agreed
upon, assigned and accepted. The whole process relies on sophisticated work by our
brains and bodies, and those who explore and tinker with computer vision or
artificial intelligence will relate that ‘teaching’ a computer to recognise physical
objects is no small feat, let alone getting it to realise physical objects, to think of them
as something relevant for, and relevant to, its future and its teleology.

196
J.G. Ballard "Kafka in the Present Day" [London] Sunday Times (c.1993) republished in A User's
Guide to the Millennium (1996)
197
https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer

But humans treat the social, the material, and other categories such as ‘the economic’,
‘knowledge’ and ‘know-how’, the ‘symbolic’ or ‘cognitive’, as abstract, natural or
artificial. Many of us do not live only ‘in the now’ but also in ‘what will happen next?’ As
living beings we have used time and energy to make things exist, both tangibly and
abstractly. So this is also a commentary regarding the corpus and complexity of
many of the material things and thoughts and ideas, experiences, which populate
everyday life. Even though they may be regarded as mundane and ordinary, even
subliminal, marginal and arbitrary, they are in their use, for their purpose or ‘fit’,
anything but. The basic model of learning which has wide acceptance is Kolb’s
experiential learning model. This is a circular model, and similar circularities have arisen in fields
such as Activity Theory (Kaptelinin and Nardi 2006; Hashim and Jones 2007),
Structuration Theory (Giddens 1984; Turner 1986) and Actor-Network Theory
(Latour 1986; Law 1986; Callon 1999).

Getting to grips with materiality

We look at and use gadgets on a daily basis but do we wonder how they really work,
or why and how they came to be there in the first place? After completely disassembling
and laying out almost 400 components built from roughly 100 different materials,
Thomas Thwaites came to encounter the enormity of an endeavour to recreate a
mundane and common domestic technology, one often seen in modern western
homes – an electric toaster. His story is a classic tale which reiterates that just
because something is obvious, this does not make it knowable.
“…157 parts, but these parts are made of sub-parts, which are themselves
made of sub-sub-parts. Does the variable resistor that controls the toasting
time count as a single part? But it’s made of eight sub-parts, so perhaps it
should count as eight? Does a capacitor count as one part or eight?” (Thwaites,
2011, p. 18).198

198
Thwaites, T. (2011) The toaster project: or a heroic attempt to build a simple electric appliance
from scratch. Princeton Architectural Press, p. 18.

Bewildered by the rising complexity of the task ahead, Thwaites decides to take a
more modest approach to the project. He would now recreate “the bare minimum
from which I think I can make a toaster that retains the essence of toasterness. These
are: steel, mica, copper, plastic, and nickel.” Taking a similar reductionist position as
saying that we are all derived from hydrogen atoms and star dust, we begin to realise
that even familiar, relatively cheap electrical devices, those which populate our
everyday space, are actually the product of quite highly developed legacies of
knowledge, sophisticated techniques and considerable inventories. One wonders how
many budding Chinese industrial entrepreneurs have stood in Thwaites’ shoes before
working out and committing resources to recreate the toaster at a better price.

Even after deciding to recreate only 5 of the nearly 100 materials involved, Thwaites
still faced an insurmountable challenge. While he had easily modified his idea and
strategy regarding what he had to do, that is, how to recreate the toaster, he still had
no idea where to begin. Where would he obtain the materials, or even where was best
to start looking? He experienced a disconnect between his motivations and intentions
and his ability to implement them; he lacked information and ‘know-where’.
His initial, simple question had evolved into an even further exposition of how
implausible it is to recreate the mundane, let alone do it by yourself. He nevertheless
doggedly followed his mission, which saw him getting tips on where to go, and
travelling the entire UK visiting abandoned mines and actually digging up raw
materials. At one point, he clumsily attempted to smelt iron ore in his home
microwave. Through sheer determination and the desire to create a book, he finally
managed to create a Frankenstein monster which very vaguely represented the original
toaster. The beast operated for only a very brief moment before the mains electricity
running through unsheathed copper wires melted and wrecked the facsimile. He
realised: “The point at which it stopped being possible for us to make the things that
surround us is long past.” This is the stuff of those who would muse on time travel,
and wonder what would happen if we took modern knowledge back in time with us.
Would we be able to articulate our ideas, and build demonstrable modern technology
from scratch? Would we be like Thwaites? What would it be like to recreate the
internet? Even if we built the toaster, what would we use for power?

James Burke used a similar rhetorical sketch where the power is cut to open up
his epic series on the evolution of technology, Connections. Burke wished to jar us
from modern states of comfort and complacency when he opened his series with a
dramatic recreation of the infamous power black-out that struck New York City in 1965. His
use of this event was not only to show that much more than toasters would fail to
operate, but that all the multitudinous electric devices and systems that power an entire
city would be rendered ineffective or useless. A breakdown of this magnitude is
significant as a social event. A breakdown of all that we take for granted at this scale
opens the conditions for social chaos and confusion. However, Burke’s skit was
simply a prelude to a much larger concern. If electricity did not come back on, what
would we do, where would we go? A city very quickly loses its allure without
power, and so the captivating programme steps us through the rational technical
options regarding how we would have to repair to the countryside and tackle growing
our own food again, beginning with us finding or making a plough, and basically
trying to re-establish what we have either forgotten, or ‘moved on’ from, as a society
or culture. This would be necessary for survival and sustenance in a post-electric
world, when we ran out of all those low-hanging fruits, the packed and tinned goods
which can be foraged or found. He successfully illustrates that modernism is
something of a veneer, even paraphrasing Bruno Latour’s retort that ‘we have
never been modern’, or the Situationists when they wrote the slogan ‘Sous les
pavés, la plage!’ – under the pavements, the beach!

For some time, one would imagine that any steps taken by people would likely
represent the initial and vague attempts at trying to recreate the homeostasis of the
everyday we all know, the manner to which we have become accustomed, that which
we have come to expect as the goings-on of everyday existence. Schank’s (1977)
notion of scripts shows that people develop routinised behaviour which dominates
the performance of actors in these scripts, and which also accounts for our ability to
understand events that take place within the context of these scripts. Moreover,
Schank (1982)199 later argued that scripts, as independent structures in memory
encompassing similar aspects, could not by themselves account for the fact that
learning can take place across such structures. This is the conservative streak in
human nature. And such proposals of a return to the land and so forth are the
scaremongering aspect of global warming and ecological crisis stories, the various
cinematic depictions of post-nuclear holocausts or the aftermath of raging global
pandemics, asteroid strikes, the dreams or nightmares of survivalists in their
underground bunkers counting their seeds. The point these dystopias impress upon us
is that we do not have to move back so far in time to an age where the organic, the
hand-built, and the human scale constituted the everyday, the norm that we encountered.

199
Schank, R. 1982. Dynamic Memory: A Theory of Learning in Computers and People. Cambridge,
England: Cambridge University Press.

Standard of living, quality of life

This brings us to a key question, again one which is prolific in discussion of progress
and technology, and of course to economic development. It concerns other
amorphous values, which are commonly referred to in discussions as wide-ranging as
education and health, but also includes employment, environment, and a raft of
cultural considerations. The term quality of life is commonly used to evaluate the
general well-being of individuals and societies. It is a qualitative measure in that it is
not easily measured and many of the factors affecting it are tied in with the United
Nations Universal human rights such as the right to freedom, the right to marry and
freedom from discrimination. None of these things are economic. Standard
indicators of the quality of life include not only wealth and employment, but also the
built environment, physical and mental health, education, recreation and leisure time,
and social belonging. Conversely, standard of living is based primarily on income
and what that level of income will allow a person to buy in the way of necessities and
luxury goods. Standard of living refers to the level of wealth, comfort, material
goods and necessities available to a certain group of people in a certain geographic
area. We must not forget that the standard of living will vary greatly from country to
country and what one level of wealth will buy in one country may not necessarily
buy in another. For example, you can buy far more with your money, and live
comfortably for longer in India than the same amount of money would allow in the

UK. In short, what people consider a necessity does not just depend on what is
available – it can depend on what is considered or viewed as normal. This fits in with
the findings of ‘happiness economists’, who have noted that people’s happiness often
depends less on how much money they have than on how much they have compared
to others. It is easy to be satisfied with a small house when all your friends live in
apartments, but if they all live in large houses, a large house comes to look like the
norm – or even a necessity. Many economists agree that the quality of life for most
people, for instance in Britain, only began to improve markedly during
industrialisation and the concomitant rise of the urban areas. Both salaries and
availability of goods constitute the rise in living standards, and new norms arose such
as living in regularised terraced and eventually suburban housing.

Abraham Maslow (1954) suggested that the basic biological need for food and water,
once satisfied, gives way to ‘higher’ needs including social needs. 200 First there are
needs to belong to a family, group, community, or nation, and secondly, there are
needs to rise up and be admired within these social groups. Maslow’s idea is that
as one level of the hierarchy is satisfied the next comes into focus and causes
tensions and concern, and then the next, until one reaches what he terms self-
actualisation. Once the social needs are satisfied higher personal and intrinsic needs
then come to the fore, culminating in self-actualisation – a state where one is
everything one can be and one knows it, and can even let other people into the secret.
Maslow, a humanistic psychologist, put forward the idea that most humans are not
simply blindly reacting to situations, as was popularised by the behavioural
perspective, but continually aspiring to accomplish something greater; that is, they are
teleological.

The economic historian Gregory Clark and others have offered a description of life
before the industrial era which is very basic indeed, certainly so if you compare with

200
Maslow (1954) attempted to synthesize a large body of research related to human motivation. In an
ranked order of priority they included the needs to address: 1) Physiological: hunger, thirst, bodily
comforts, etc. 2) Safety/security: out of danger. 3) Belongingess and Love: affiliate with others, be
accepted. 4) Esteem: to achieve, be competent, gain approval and recognition. 5) Cognitive: to know,
to understand, and explore. 6) Aesthetic: symmetry, order, and beauty. 7) Self-actualisation: to find
self-fulfilment and realise one's potential. 8) Transcendence: to help others find self-fulfilment and
realise their potential.

average living conditions in developed economies today (Clark, 2014, 2010, 2007) 201.
Most notably Clark tells us that standards of living have been consistently basic in
terms of comforts for a very long time. If you consider that the average person's life
in the Stone Age was no worse than that of the average 17th century pre-industrial age
Briton then a very different vision of history comes to the fore than populates our
mental images of the 17th century. It may be typified by rural idylls such as those
captured in paintings of the time, depicted in novels or in modern movies. 202 Indeed,
to modern global metropolitan citizens, typically educated to a higher level and
earning more than their provincial counterparts, a life where one was largely
forbidden to move location, where one and one’s family would live on subsistence
wages, and where one would have little communication outside of their everyday
experiences whilst being surrounded by persons suffering the same fate would be a
very foreign and restricting existence indeed. In 17th and 18th century Scotland,
coal miners and their families were still treated as serfs, bound by law to the colliery
in which they worked and to the service of its owner. 203 Anybody that the owner or
his men found wandering about could be apprehended and put to work by law. 204 The
historian Rosalind Mitchison (2000) lends us a riveting account of what awaited someone of this
time who was poor, aged, orphaned or ill. She details how the old Scottish system of
‘poor relief’ fell under legislation originating in 1574 which dictated who
would receive relief and who would not. This legislation remained in place until reforms in
1845, and it was not until the National Assistance Act of 1948 that the last vestiges

201
Clark, G. (2014) “The Industrial Revolution” in Handbook of Economic Growth, Volume 2 (eds.
Philippe Aghion and Steven Durlauf), The Consumer Revolution: Turning Point in Human History, or
Statistical Artifact? (2010)
202
But given their more varied nutrition and lesser workload, Clark notes that the lives of hunter
gatherers has been considered superior to both see; Clark, G. (2009) A Farewell to Alms: A Brief
Economic History of the World Princeton University Press
203
A 19th century history of Arbroath details how the fishermen were serfs to the Abbot of Aberbrothock,
providing regular fish for the monks. After the reformation the lands of Ethie were passed onto a lay
proprietor and the fishermen continued to be bound to the owner of the estate. There was unrest in the
village by the late 17th century when some fishermen burnt the houses down. In 1705 some decided to
move to Arbroath with its better harbour facilities and market. Arbroath Town Council offered
incentives to attract fishers to settle there. But the Lord of North Esk, who then owned Auchmithie,
did not take kindly to his fishers moving. He took his case to court and it was ruled that the fisher folk
were still serfs and not free to move as they pleased. They were forced to return to Auchmithie.
204
This bondage was set into law by an Act of Parliament in 1606, which ordained that "no person
should fee, hire or conduce and salters, colliers or coal bearers without a written authority from the
master whom they had last served … the said estates of this present parliament give power and
commission to all masters and owners of coal pits and pans, to apprehend all vagabonds and sturdy
beggars to be put to labour.

of the Poor Law disappeared. The decision was dependent upon two groups of men
who usually made decisions regarding the poor in the parishes: the kirk session, and
the local landowners, primarily the lairds, who often held joint roles as both elders
within the church and, as heritors, men of property who could be taxed to provide
for the poor.

The church wanted to provide some form of Christian charity for the poor, tied, of
course, to godly discipline and moral reform of their lives. Later laws were passed
restricting who could move where. For instance, the Synod of Angus and Mearns
passed an act in 1741 restricting the poor to their own parishes. Those not belonging
to the parish were to be sent back to their place of birth and might be put in the
stocks or otherwise punished. Each bona fide local who was poor was supplied with
a badge which allowed them to travel the parish, but not beyond. In effect, these
arrangements gave extremely inadequate relief to the poor and Mitchison cites a
particular instance when a parish stripped a five-year-old orphan of relief, telling her
to take the roads to beg or work. If one did manage to leave the place in which one
was born, it is also of note that about half of the white immigrants to the American
colonies in the 17th and 18th centuries were indentured labour – that is, they would
have to work like slaves for a number of years to pay their passage and keep. During
the late 17th and early 18th centuries, children from England and France were
kidnapped and sold into indentured labour in the Caribbean for a minimum of five
years, but usually their contracts were bought and sold repeatedly and some labourers
never obtained their freedom.205 Even today, we do not have to travel far to witness
where life still constitutes an everyday struggle: to developing economies or
troubled war zones, which remain plagued by issues of food and water supply,
electricity and shelter, or by the scourge of human trafficking and even slavery.

205
Christopher Tomlins, "Reconsidering Indentured Servitude: European Migration and the Early
American Labor Force, 1600–1775," Labor History (2001) 42#1 pp 5–43

Which innovations are prominent and which are most relevant?

When it comes down to which innovations are prominent or important, or to
consideration of those which have had the most impact on human affairs, there is
some confusion. Those which are newsworthy, those with the ‘larger voices’ are
those which are assumed to be most important. David Edgerton is one commentator
who has forthrightly challenged this idea and offers sobering thoughts on the subject
of ‘high-tech’. He offers a contest between the rickshaw and the jumbo jet. The
jumbo jet, or the car, does not eradicate the need for rickshaws. They are all
significant, they each have their place, just as radio was not replaced by the
television, the television by the internet, any more than cinema was completely
replaced by video cassettes, and now by downloading and streaming. The experience
that they offer, and what they can actually accomplish in terms of moving over
distance of course significantly varies. I can remember having the time and desire to
go through the streets of Phnom Penh at the ‘charming’ human pace and comfort,
even vantage, of the rickshaw, as opposed to a motorbike or car taxi which offer quite
different views and paces of the city.

Edgerton notes: “History books are derived largely from propaganda
about technology. So in the 40s everyone was excited about supersonic flight and
atomic power, and in today's history books we continue to think of that era being
dominated by those technologies. It wasn't. One might more correctly think of the
40s as a time of tanks, aeroplanes, cars, coal and wheat and pig farming. We inhabit
a world where what I call 'the futurism of the past' falsely conditions our conception
of the past."206 So historical perceptions, views, images of technology then are similar
to the ‘idyllic’ view we may have of 17th century rural life. The point being that
‘scenic’ contexts and views are not at all only used in design, but contribute to our
overall worldview which conditions our life and perceptions.

The Cambridge trial of interactive television

The equivalent of supersonic flight or atomic power in the mid-1990s was heralded
in a significant technology and marketing trial of digital networks and multimedia
technology that took place in Cambridge, UK. It came against a background where
television was firmly established as the Trojan horse of commerce, information,
news and entertainment within the home.

206
https://www.theguardian.com/technology/2006/aug/01/news.g2

Television, while not being a toaster, is for many, many people emblematic of the
everyday (i.e. Silverstone, 1997). It is the technology norm and provides the norms,
as we encounter its messages daily. More than a toaster, falling under the loose
category of media and communications technology, television is endowed with an
almost mystical presence. It has enormous power and reach as it invades the modern
and modernising world, and with its inception lost and eclipsed in the pre-WWII
period, it reaches its apotheosis in the post-war boom of the 1950s, especially in the
U.S. and U.K., before spreading across the world. It is endowed with additional powers to
reinforce and create the meaning of, and meanings within, everyday life. Its ubiquity
and ordinariness in homes and lives belies its power to guide, shape and export rules,
codes, customs, models. Like so many taken for granted phenomena, it only when it
comes under the scope of the researcher or designer that it is defamiliarised, its more
objective, insidious even unintended, functions, purposes and consequences jump to
the fore. (Bell, Genevieve et al. 2005) 207 Internationally established artists like Nam
June Paik and Bill Viola also use sound and TV imagery to construct a total
environment which showcases and celebrates its place within our lives. On the
forefront such musings bring to bear some of those of earlier media theorists or
visionaries, most notably Marshal McLuhan, who amongst many more musings
viewed that the electronic and televisual extended the human nervous system to far
reaches way beyond the body and sensory apparatus. “Man in the electronic age has
no environment except the globe and no possible occupation except information
gathering.” With respect to studying the medium, what it is and what it does he was
pessimistic “Since it [television]has affected the totality of our lives,...it would be
quite unrealistic to attempt systematic or visual presentation of such influence”
(McLuhan, 1964, p. 317). Instead, what seemed a more feasible way of studying
these effects was suggested—“to present TV as a complex gestalt of data gathered
almost at random” (McLuhan, ibid).

Slightly less histrionic, Roger Silverstone viewed that everyday life was synonymous
207 Bell, Genevieve et al. “Making by making strange: Defamiliarization and the design of domestic
technologies.” ACM Trans. Comput.-Hum. Interact. 12 (2005): 149-173.

with television: "Television is everyday life. To study one is to study the other."
(1989: p.77) We can certainly imagine life, and especially before the advent of catch-
up television and video recorders, being lived around the temporality of television
programme favourites. “Sorry I won’t be able to meet tonight, I simply must watch
Coronation Street and their preparations for the next big wedding…” Silverstone
eloquently suggests to us that through the study of this particular technology, not
only do we learn of the human condition beyond our homes and families, but make
sense of our homes and families through its instruction. And not only this, but to
some extent we learn of ourselves and build and reinforce our own value systems.
Broadcast media became a clock. People ordered their social lives by the schedules:
certain programmes tailored for specific audiences came on at particular times, so that
one could synchronise when, for instance, mother would prepare the family dinner
while the children watched dedicated children’s programmes. And it was not just the
day or week; seasons too ran and were lent character and meaning by the
television. Television was then also a calendar. Regional news programmes and
features accent programming of a more localised nature, reflecting life at home rather
than some homogenised world of sameness. Stephen Heath viewed that media, and
in particular television, form a "seamless equivalence with social life" (1990:
p.267). Chat and comparisons regarding programme contents and narratives spill over
into lived experiences manifesting in discussions between viewers, what is termed
‘fandom’, an important extension of the viewing experience and to some extent
providing a social glue, a cue for social interaction and sub-cultural value building, a
participatory culture every bit as relevant as teenage music fans wanting to dress in
the garb of their heroes and hang out together listening and sharing their decoding of
lyrics (i.e. see Duffett, 2013).208 People came to have the television on even in the
background as it lent a friendly atmosphere to an otherwise silent, lonely home. The
recent spate of reality television series, which feature ‘real people’ as opposed to actors
trapped in a house or on an island, is a strong example of how viewers are
perpetually asked to separate ‘good’ from ‘bad’ characters and personalities. They
can differentiate, make decisions and judge between persons before being presented
with the opportunity to vote on ‘who they like best’. It opens the prospect of
208 Duffett, M. (2013) Understanding Fandom: An Introduction to the Study of Media Fan Culture.
New York: Bloomsbury Academic.

Machiavellianism gone mad on the part of participants of such shows, who of course
reflexively know beforehand what goes on when people watch the show. Television
is a machine for social learning.

Silverstone, and the institution of reality television shows and others such as those
which feature curated YouTube videos, echo the French social theorist Jean
Baudrillard, who in an earlier thesis went even further when he posited television as a
two-way process of dissolution: "... of TV into life, ... of life into TV" (1983:
p.55). The modern world, the modern mind, is a juxtaposition of mediated experience
drawn from the internet and television, mixed with lived social reality, on and off
screen. Unless one never possessed a television and was never exposed to its messages
and maxims, it would be difficult if not impossible to pick out which values were
drawn from its repertoires and curricula, and which were honed by direct interaction
with others in everyday life and education. How much
would we have known of history, or geography, of exotic plants, distant wars, black
holes, the evolution of trombones, the Basotho tribe of Southern Africa, virtual
reality, democracy, questions regarding the purity of our water, the value of the
pound against the dollar, or the views of the Prime Minister without it? We must be
informed in order to make good and proper decisions; if it is not the television, then
it is the newspaper or book. If it is neither of these, then it is hearsay or imagination.

But whether or not such claims are mere views or hold up in rigorous social
research, whether or not they can be ‘proven’, does not detract from the many social
surveys which illustrate the sheer amount of time spent by the average
person viewing television each day, even now that internet use and
access has displaced some of this. In terms of pure exposure it is self-evident that
television’s possible and actual impact on what we think about, and perhaps more
importantly how we think about it, is considerable. So any aim to make television
interactive by incorporating digital technology and networking it to a system, adding
functions and choices, any change to the modes and periodicities of its operation,
would conceivably alter the nature of everyday life and activity, or at the very least
the television viewing/using experience, which makes up a significant portion of it.

As a design project, as a design problem, the primacy of television, its ‘double
articulation’ as Silverstone and Raymond Williams put it, poses its own unique
set of imperatives for the social researcher. This relates to the notion of television’s
existence as a device, a box, a configuration of electronic components, as a
technology, and its existence as a medium, a conduit of information, entertainment,
values, beliefs, opinions, styles and so on (Williams, 1974; Silverstone, 1994). Consider
the following, lifted from the published work detailing a design project
which looked the phenomenon of television square in the eye with a view to
innovating upon its features, functions, and role and place in everyday life:
"More than any other product, television follows, interprets, and modifies the
relationship that the individual maintains with the spaces and rhythms of life.
At the same time, television possesses the magical capacity to become a
mirror not only of contemporary everyday life but also of the profound
transformation it is undergoing. It is at once object and 'meta-object'. While
television features as an object in our everyday existence, that same existence
is reflected back at us in the subject-matter of the medium. To try to
understand the role, which this dual nature of television plays in our lives, we
need to explore the interaction between technological and social
developments and their joint impact on the individual." (Morace, 1995: p.13)

When we consider more deeply what television offers, or come to that any
broadcasting medium, we begin to appreciate at least the melding of ‘content’ to
‘medium’. Both appear critical to the success of the medium, the device. The object
of the receiver, or box, though pivotal, is confined, restricted, focussed: it is to
convey as faithfully and as well as possible that which is broadcast. The object of that
which is broadcast, however, is much more morphic, less defined, more open and
fluid – it is to inform, entertain and otherwise amuse, to pose, compel, showcase,
evoke, and all other performance metrics. What entertains one person, as a glib
example say a five year old child, may hardly entertain a highly educated adult who
boasts refined taste in high art. They may prefer an opera or a gardening feature.
What is performed or delivered is down to production values, editorial control,
artistic choice and so forth. So there is a disparate range of very sophisticated
protocols and mechanisms of editing and selecting which have shaped what comes to be
broadcast. Of course the opposite is also true. Viewers’ videos are also curated
and collated so as to showcase everyday life and its sometimes bizarre occurrences.

There are even shows which curate and collect material from YouTube for
broadcast, perhaps for those too lazy to browse or who lack the ability to curate or collate
themselves.

We can witness, as in other ‘standards wars’ scenarios, that available software –
and I use the term in its loosest sense to refer to any type of content material:
games, recorded music and video – is what defines success. Betamax, which famously
had superior technical quality, gave way to VHS, which was cited as ‘winning’ the
standards war by offering a wider range of programme material (some say also
pornography and cheaper video cameras). In this sense, then, from the uptake or
user perspective there is an argument that the possible experiential prospects trump
technical quality. In a sense this is like the tendency in human perceptual psychology
to ‘fill in the blanks’ or ‘fill in the spaces’ – low resolution games can still pose
problems and be fun to play, and in fact be preferable in certain cases to high
resolution.

Making television interactive

This event and the publicity arising from the advent of the interactive television
project – the buzz, if you will, of the press releases and reports – gave rise to interest from
a plethora of diverse media and cross-sectoral industry players regarding what
would happen if television were made interactive by digital means. Each
had a series of quite channelled visions and predictions. They were, amongst others,
technology developers, many of which were related to or were already doing
business with the system originators, cable and telephone network operators, media,
advertising, banking, and retail companies. Each reflected upon their own particular
business problems and philosophies while facing and attempting to establish the
potentials of such a new system. Not only would the technology serve or threaten
existing business models or operations – what they did, how they did it, even
where they did it – but what exactly were the benefits to them and, by extension, the
benefits which could be packaged and passed on to their customers, funders and
clients?

Most critical was the question: How would ‘viewers’ respond to this new form of
television experience, an experience where they would essentially become to some
extent active participating ‘users’ of television, rather than passive ‘viewers’? We
must remember that people had already become very accustomed to interacting with
television, having grown with its presence in the home since they were infants. Most
people now had experience of interacting with television via add-on devices such as
video games consoles, and video recorders and players. Some control over what they
viewed was afforded by these technologies, and part of the ‘mystery’, if you will, of
television production and the televisual experience was being revealed by the
diffusion of hand-held video cameras.
1993 should be marked not only as the landmark year in which the first popular web
browser was released, but also as the year of Doom, considered one of the pioneering first-
person shooter games, which introduced to IBM-compatible computers features such as 3D
graphics and spatiality, networked multiplayer gameplay, and support
for player-created modifications via the Doom WAD format.

All of these technologies also fed an industry attitude which was
brewing at the time that big changes were about to happen, and clearly those firms
who had their finger on the pulse of such change would have the early advantage in
the market.

Learning and the sand-box in the 'service nursery'

The firms wondered what they would need to know and be able to do to use the
technology and system, and how they would make their products and services virtual.
How much of a graphical and functional, technical or programming problem would
this be? Just what would be possible to do now and in the years to follow? Even if the
internet came to dominate, as it now does with vastly increased bandwidth,
compression and related technologies, it would surely boast similar services. How
might these organisations interrelate and link to all others who used the system?
What slack was there to engage in collaborative and mutually beneficial ways with
users and partners? The technology and marketing trial was aimed at lending insight
into these questions, or even producing others in their wake. They felt compelled to get
involved. The overall aim for participants was moving from "... learning about rules and systems to
learning about exploiting rules and systems" (Aldrich, 2005, p.30).209 What could be
construed as ‘rules and systems’ would include learning something of path-
dependencies in the technology so as to make adequate predictions of what was to
come.

Led by the original technology developer, these organisations came to prototype a
media delivery system which included hardware, software and interfaces, and which
would feature the types of new services and content that would provide an end-to-
end system for interactive television (i-Tv). In some sense it is relatively easy now
to imagine what needed to be done, but at the time, as in the opening skit
of Thwaite building his toaster, there were no analogies, no blueprints, no
guidebooks of any kind; the rules of the new territory had to be defined from scratch. It would
have to be built as it went along, through negotiating and building ideas, and as a
group effort.

Issues, complications and opportunities arising from the approach and
implementation of the trial unfolded over an eventful and highly pressurised three-
year period. Monitoring this project over its entire duration provided not only a rare
research opportunity but tremendous privileged insight, not only into the modern
innovation and development process itself, contextualised within a rapidly emerging
and evolving industry sector, but also into the wider issue
of how designers come under a wide range of shaping influences beyond the choice
and application of components.210 And especially so in the early stages of innovation,

209 Aldrich, C. (2005). Learning by doing: A comprehensive guide to simulations, computer games,
and pedagogy in e-learning and other educational experiences. San Francisco, CA: Pfeiffer.
210 It is not often that research allows such a complete snapshot of a project from ‘cradle to grave’.
More often research is based upon snippets, even desperately grasped visits, composited together to
suggest a penetrating whole. This has sometimes been the criticism of the case study against other
forms of social investigation such as the large-scale survey or the longitudinal study. As shall be
discussed later in a chapter devoted to methodology, media ethnography was often criticised for lacking
the ‘staying power’ of the traditional ethnographies of anthropologists in their trips to ethnic
populations, where they may stay and study the everyday life of indigenous inhabitants for months or
even years to make their commentaries.

where they and their partners were grasping for meaning and direction. Issues arising
from this dilemma challenged the research approach and implementation itself, as
did the politics of the situation, and drove it from a relatively straightforward and
safe attempt to explore the usability – or ease of use – of the interactive television
system, to a much wider exploration and consideration of how ease of use as practice
and as experience situates within the exigencies of system building and prototyping,
with all the uncertainties and risks therein. As such it was an attempt to understand
a deeper or wider meaning of usability, one which includes the sociology of design
and the experiential aspects of media technology. It also serves as
a litany of risk, inter-firm politics and posturing, and of how they impinge not only
upon research into firms and innovation under concerns of confidentiality, but also
upon strategic management and direction as opportunities wax and wane, emerge and
recede.

Various senior managers and designers were interviewed within the company
designing and producing the i-Tv technologies and interfaces, as were several of the
partner companies responsible for content and services. In particular the focus was
upon a technology and marketing trial of the entire technology and services system.
Included was the study of 11 trial participant households in an attempt to explore not
only design and production, but also consumption or participation and use. The study
concludes with an overview suggesting the rich interconnection and interdependent
nature of trials, which are configurations of technology, user-consumer research,
knowledge production, design, media, designers and organisational management.
Finally the study uses CU in relation to Alfonso Molina's notion of Sociotechnical
Constituencies to illustrate how the wider and more disparate social, cultural and
organisational elements of trials both rely upon and impinge upon the implementation and
interpretation of user and consumer research, and thus upon the operational perceptions of the
user and the use process that guide technology and service development.

The 1990s: the rise of the digital

Only now, a posteriori, are we fully cognisant of the fact that the early to mid-1990s

represented a crucial time in the development of digital networks. And not only the
technology, but the creative development of online content and services, new
business models, and new strategic and design thinking. It was only then that the
internet was beginning to diffuse into private homes in any real scope and scale, and
there to be used by members of the general public and across varying age groups. It
was only then that commercial companies were being allowed to exploit the internet.

Commercial firms, investors and other key players such as government regulators
began to recognise the vast potential that such a project entailed for their
operations, their businesses and remits, but they lacked concrete experience and
understanding of just what the technology could do, and more precisely what the
technology could do for them. They needed to get involved to understand what was
going on. Expectations were fashioned along the lines of what had transpired in the past,
in the institution of other networked utilities such as telephony and broadcasting. This
was enshrined in their historical accounts (such as those offered by Asa Briggs in his
excellent histories of the inception of the BBC). These stories told us that it was
not the developers of the technology per se who drove the success of mass media
products (i.e. Baird, or Marconi-EMI), but rather it was the service providers such as
the BBC through the shaping and development of programming content, services and
production technology, which would draw and intrigue consumers and subscribers.
In that sense they would develop uses and purposes, the motivations for using the
device, the receiver or box in the first place. This is what had happened at the onset
of radio and television.211 What is striking regarding the growth and diffusion of the
other major ICT, the telephone network, was the contrast in terms of its uses and
content from broadcasting. Early attempts to use it and sell it as a broadcast system
(i.e. distributing ‘programme’ material such as music, news etc.), quickly gave way
to a more focussed use as a system typified by 100% user created content (ucc) i.e.
subscribers talking to each other via the system which connects them typically on a
one-to-one basis discussing subjects of their own interest. Clearly the printing press,
the phonograph and the recording of speech and music, and film, also played a role
211 Although Shaun Moores has shown that the technology in itself also had attractions for enthusiast
radio hams and builders. This is similar to the homebrew computer club phenomenon from which Apple
originated, captured in Leslie Haddon’s unpublished thesis, where he looked at hobbyists and their
importance to early personal computing.

in terms of laying down the principles of ‘canned’ as opposed to ‘live’ performance,
and of course a business model, which prevails, of producing material which
could be copied and stored until it sold.

The new digital systems would be a kind of mix of all these modes. Digitisation and
storage carried the prospect of very large media repositories of material that could be
accessed, and of communication between users themselves and with the firms and organisations
providing content and services, in obvious ways (including orders, surveys and feedback
forms) and less obvious ones (system-logging of usage data – the opportunity to
monitor participants’ use of the system through online logging of button
presses and menu selections)212. Other quite novel faculties peculiar to the new
online digital world would also feature, including playing online games with
unknown system users, something only then being realised in the networked PC world.
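To make the flavour of such system-logging concrete, the following is a minimal sketch, assuming a hypothetical set-top box that reports each remote-control press as a timestamped event; the event names, fields and household identifier are illustrative, not those of the actual trial system.

    import json
    import time

    # Hypothetical trial-style usage logging: each remote-control press is recorded
    # as a timestamped event so that menu paths, dwell times and services selected
    # can be reconstructed from the log later.
    def log_event(log_file, household_id, button, screen):
        event = {
            "ts": time.time(),          # when the button was pressed
            "household": household_id,  # anonymised trial household identifier
            "button": button,           # e.g. "up", "select", "play"
            "screen": screen,           # menu or service on screen at the time
        }
        log_file.write(json.dumps(event) + "\n")

    # Example: a short interaction session appended to a local log file.
    if __name__ == "__main__":
        with open("usage_log.jsonl", "a") as f:
            log_event(f, "HH-07", "select", "video-on-demand-menu")
            log_event(f, "HH-07", "play", "film-detail-page")

From records of this kind an analyst could, in principle, reconstruct which services a household visited and in what order, which is precisely the "less obvious" form of feedback described above.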

There was also a reasonable expectation that the content and services would drive
usage of the system, and new kinds of revenue models would also have to emerge,
including paying for connectivity to the system in much the same way that homes
paid for electricity, water and sewerage. This was already happening with dial-up
connections. Subscribers could also pay for content or bytes downloaded, which had
also fuelled the business model right back to the view data services such as Prestel in
the UK in the late 1970s early 1980s [which actually turned out not to invoke much
user interest after one had tried it out a few times – i.e. unintelligible blocky
pixelated weather maps etc.]. There was also the added dimension of online payment
and online credit card processing. Furthermore, there were also visions of how this
new digital world would impact all manners of practice in the existing commercial
and physical world, as it now has. This virtualisation of cash and the removal of need
to travel to locations to shop included imagined modification of supermarkets from
places that people travel to get their groceries, to depots from which goods are
delivered to house from. Of course, this and other visions, still unfold today with
some more realised than others (i.e. Amazon’s cavernous warehouse distribution

212 “…design is not and cannot be data-driven. While 'the way the world is' enters design decision-
making at many crucial points and many different guises, its inclusion is only rarely in the form of
enumeration of particulars” (Sharrock and Anderson 1994: p.8).

centres, and online shopping with delivery). The economic consequences, then – the
methods of subscribing to and paying for services and content – were also significant.

As an example, the research gave rise to the development of a general research
framework – Contextual Usability (CU) – whose central aim was to draw attention
away from the idea of a crystallised, finished product and to map the complex
dimensions of the human use process and its influences as socio-cultural,
organisational and social constructions. It also focussed on the phenomenal and
experiential aspects of the system and its use, and compares these to the design and
strategic visions. Such a view has since gathered much more momentum in studies of
human-computer interaction but was not widely understood or practised at the time.

Usability – an experience and its testing as a practice

At its core, the dominant research paradigm in understanding the usability of devices
and interfaces during this period was largely derived from lab-based experimental
studies where users were figured as ‘test-subjects’. Usability testing as an industrial
practice, an emerging part of the new accelerated product development process, was
only beginning to gain wider acceptance in technology firms, being mostly
spearheaded by the outstandingly innovative firms in Silicon Valley, including
the likes of Xerox PARC and Apple. Even within these innovative companies
usability was championed by particular individuals. One of the most notable of these
would be Donald Norman. Early on, Norman drew distinct attention to the financial
and political ramifications which would oppose the development of usability as a
practice and an essential part of the product development process. In a production
climate which persists to this day, onus was being increasingly placed upon fast
development of new products from design to manufacture. Usability tests, while
gaining traction as a guarantee and safeguard regarding product quality and
functionality (i.e. preventing expensive product recalls), were criticised by others
in the firm as a further loop, even a hindrance, in the development chain. They were seen
as burdening efforts to become more ‘lean’, ‘agile’ and ‘responsive’. Thus a tension was set
between design, design testing and manufacture which still persists
today. Norman is also notable for his contribution to the usability of everyday
encounters with the designed and built environment. His book The Psychology of
Everyday Things (1988) is widely regarded as a classic in the phenomenology of
usability, and has Norman traipsing round the phenomenal world uncovering
differences between designs which poorly anticipate and accommodate their
contexts of use, and those that do better. It is important in the sense that he took a
much wider remit on the world of usability and the occurrences of use than simply
focusing upon workplace tasks, mouse use and graphical interfaces (as if this is all
we encounter on a daily basis – even though perhaps this is becoming truer
today where computer addiction abounds!).

Internet-enabled personal computers with multimedia capabilities were
restricted in the early to mid-1990s, not only in terms of the content which was
available and growing, but in terms of how much data could be communicated through
the dial-up modem set-up. Given the bandwidth limitations of the dial-up connection,
it was believed that interactive services delivered to homes via cable networks would
be a more satisfying technical option for users. They would enable bandwidth capable of
delivering video on demand – basically digitised televisual material, news
features, movies and programmes – at the press of a button. This would be a much
more desirable and commercial option for households and families, not least
because it would ‘feel’, ‘look’ and moreover ‘act’ like the product of the familiar
television and video recorder set-up, rather than the product of a computer. It is fair to
say, then, that they were innovating to make a functional digital facsimile of what
already existed in the analogue electronic world, and functionalities and features
would develop from there.
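A rough piece of arithmetic, using illustrative figures rather than numbers from the trial documents, shows why the dial-up route was unattractive for video on demand: MPEG-1 "VHS quality" video requires on the order of 1.5 Mbit/s, while a typical mid-1990s modem delivered around 28.8 kbit/s.

    # Illustrative arithmetic (assumed figures, not from the trial) comparing
    # dial-up modem bandwidth with the bitrate needed for streamed video.
    DIAL_UP_BPS = 28_800           # typical mid-1990s V.34 modem, bits per second
    MPEG1_VIDEO_BPS = 1_500_000    # approximate MPEG-1 "VHS quality" video bitrate

    shortfall = MPEG1_VIDEO_BPS / DIAL_UP_BPS
    print(f"dial-up delivers roughly 1/{shortfall:.0f} of the bandwidth needed")  # ~1/52

On such figures the modem falls short by a factor of around fifty, which is why the broadband cable network appeared the only plausible carrier for a television-like experience at the time.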

There were serious restrictions, however, regarding the cable-based system. It meant
that coverage would only extend to the ‘cable islands’ – the urban areas where cable
was laid. Another restriction would lie in just what was made available online. The
aim was to create a carefully chosen and limited array of content and service
offerings which would constitute a ‘walled garden’ compared with the internet,
whose open and unrestricted content, fuelled and contributed to potentially by any
user, was beginning to be viewed as a moral and ethical threat. Internet subscribers
were already creating sites which featured hard-core pornography and other
undesirable material such as bomb-making instructions. At the other extreme much other material
was of an overly dry, academic, ‘geeky’ nature, simply not of any interest to a
general public hungry for the likes of celebrity gossip. The ‘walled garden’ would be
aimed at the provision of a raft of suitable programme options primarily aimed at
family use, and which would moderate access to content deemed of a morally
dubious nature. As such it continued the broadcasting tradition of programming
(such as that espoused by the BBC under its early guidance from the likes of John Reith)
considered a positive reflection of decent citizenship and domestic harmony.

However, the need for ‘popular’ programming encroached into the world of media
buying, copyright and licensing. It came to pass that, primarily due to copyright
regulation and the relatively low numbers of trial participants, there was little interest
expressed from film studios, television programme producers and their distributors in
lending material to the system. Those broadcasters who boasted massive archives, such
as the BBC, were not participating, happier to wait in the wings and see how the
digital and networked world would unfold rather than commit to a given solution.
Was this a reaction, again, derived from their own early beginnings, when they invested
in both the Baird and Marconi-EMI approaches to analogue television broadcasting,
only to find the electronic Marconi-EMI system the superior alternative?

Missing from the trial was the presence, and most importantly the offerings or
holdings, of the larger media companies and conglomerates, including the
international newspaper and printing conglomerate whose interaction with the
protagonist technology company in the study originally sparked off the project. Their
incentive to participate and get involved was limited, as the system would only serve
as a delivery system, distributing their products, i.e. a film, largely whole. There
was no need to explore and understand authoring or producing material for, and
through, the system. Also at this time there was reticence and even antagonism
towards the digital medium among many of these companies and their partners and
clients (i.e. video rental shops) for fear of the easy copying and distribution of
digitised material. This of course prefigured their nightmare vision of the file-sharing
services which came to their zenith with Napster in 2001.

Much focus, and indeed the original aim of the present study, was to consider the
particular problems that lay in interfacing with what was essentially a networked
computer now refigured as a television. First and foremost, the orthodox
computer interface configuration of graphical user interface (GUI),
mouse and keyboard would not suffice for this melding of computer with
television. A new means of operating and new tenets for interface design had to
accommodate the mode of use. Televisions were operated by remote controls, and
viewed at some distance from the screen. This would affect details such as font sizes
on pages, how buttons and switches were laid out on remote controls, and how
they corresponded to on-screen data. This required some thinking through and lacked
any form of official guideline.
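As a rough illustration of the kind of calculation involved (not a guideline used in the trial), the smallest comfortable character height can be estimated from the visual angle the text subtends at the viewer's eye; the 20-arcminute threshold used below is a commonly quoted rule of thumb and is assumed here purely for illustration.

    import math

    def min_char_height_mm(viewing_distance_mm, min_arc_minutes=20.0):
        """Estimate the smallest comfortable character height for text viewed
        from a given distance, using the visual angle it subtends at the eye.
        The 20-arcminute default is an illustrative rule of thumb, not a spec."""
        angle_rad = math.radians(min_arc_minutes / 60.0)
        return 2 * viewing_distance_mm * math.tan(angle_rad / 2)

    # A viewer sitting 3 m from the screen needs characters roughly 17-18 mm tall,
    # far larger than desktop text read at arm's length (~0.6 m -> ~3.5 mm).
    print(round(min_char_height_mm(3000), 1))   # ~17.5
    print(round(min_char_height_mm(600), 1))    # ~3.5

The point of the arithmetic is simply that text legible on a desktop monitor at arm's length becomes illegible across a living room, so pages, menus and remote-control layouts had to be rethought rather than carried over from the PC.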

Usability testing as a practice in the early to mid-1990s focussed mainly upon the
experience of users in workplaces, such as offices and workshops, and their
multifarious individual and group experiences of using. This is because most
computer use and applications were confined to offices and industry. So it focused
mainly upon computer use at desks and software shaped by work tasks
such as preparing and producing letters (i.e. Microsoft Word), spreadsheets
(Microsoft Excel) and so forth. Those in industry who understood usability in this
way held that user experience design could be reduced to heuristics informed by a
plurality of epistemologies (which could be witnessed in the topics covered at yearly
events and conferences such as CHI). But what was increasingly common in much of
this research was the manner in which it assumed a fixity in human-
interface analyses. That is, many studies saw users, or at least the technology they
were using, as dependent or independent variables or constants. The notions of
‘human’ and ‘computer’ or ‘machine’ were seen as binaries, or somehow distinct, in a
similar manner as ‘worker’ and ‘task’ had been conceptualised in studies of work-
related practices, with a legacy going back as far as scientific management and the
work of Frederick Taylor. They would study the time taken by representative users to

complete an operation or task, such as how long a group of secretaries took to format
a letter with a different font or something similar.
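A minimal sketch of the kind of measure such studies reported, assuming hypothetical per-participant timings, is given below; it simply summarises task-completion times and the success rate, the classic laboratory usability metrics.

    from statistics import mean, stdev

    # Hypothetical observations for one task: (participant, seconds taken, completed?)
    observations = [
        ("P01", 48.2, True),
        ("P02", 61.7, True),
        ("P03", 75.0, False),   # gave up before finishing
        ("P04", 52.4, True),
    ]

    times_completed = [t for _, t, ok in observations if ok]
    success_rate = len(times_completed) / len(observations)

    # Classic lab-style summary: how long the task took, and how often it was finished.
    print(f"completion rate: {success_rate:.0%}")
    print(f"mean time (completed): {mean(times_completed):.1f}s "
          f"(sd {stdev(times_completed):.1f}s)")

Such summaries treat the user and the task as fixed quantities to be timed, which is exactly the conception of usability that the contextual approach developed in this study sought to widen.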

This underlying ethos tended towards optimisation and increased productivity by
changing first the task itself, then the environment in which the task was done,
which was akin to the earlier work on scientific management. By the 1940s work
optimisation began recognising and employing means to influence the mood and
feelings of the workforce, and today most evolved thinkers and practitioners on the
subject of ‘interaction’ feel that there are better ways to present information and to
allow workers (to carry the analogy) or users to explore, adapt, modify and
otherwise live and work with it. This progression in the ontology of the workplace
and of workforce optimisation, of how to ‘optimise’ and ‘improve’ performance and
production, now awkwardly extends into the home and the domestic and personal use
of digital devices, in an effort to understand and accommodate, manage and
extend, optimise and consolidate a wide range of everyday activities which lab-
based studies would render artificial and alien. But using a device for entertainment
is not using a device to draft a letter.

This progression in the ontology of human-machine interaction, as will be the
undergirding argument presented here, is elusive, as each new level throws up its
own challenges and creates its own conditions which problematize human-
technology relations. To paraphrase the artist Jackson Pollock: "Technic is the
result of a need – new needs demand new technics – total control – denial of the
accident – states of order – organic intensity – energy and motion made visible –
memories arrested in space – human needs and motives – acceptance." In The
Engineering of Experience (2003, 2005) Phoebe Sengers goes further and questions
whether the analytic approach inherent in engineering can be applied to the design of enjoyable
products at all. She calls for a holistic approach to design. She rejects the task-based
approach to design that can be traced back to Taylor and the deskilling of work and
proposes instead a more culturally inspired multidisciplinary approach. She also
presents examples of products that fulfil the characteristics she is calling for.

It is in a sense self-evident that new conditions and constraints (i.e. economic, new
legal and regulatory frameworks, new fads and fashions), new opportunities
(technical and economic), and new experiences will always lay the foundation or serve as
the departure point for a new set of needs and requirements, which in turn go on to
create ever new imperatives to innovate, and so on ad infinitum.

As the system develops new markets are constantly being carved out of our needs
and wishes. For example, consider the multimillion pound industries which have
developed around commodities which are said to make us look thin or young, our
desire to play games, to experience nature or enjoy art. The very fact that we have
the 'leisure industry' and the 'entertainment industry' points to the fact that the
separation of work from leisure has left a void in our free hours: 'Thus filling time
away from the job also becomes dependent upon the market, which develops to an
enormous degree those passive amusements, entertainments, and spectacles that suit
the restricted circumstances of the city and are offered as substitutes for life itself'.

The retreat into the privatised world of the individual and the family is a pronounced
feature of life in the 1990s. Adopting particular lifestyles seems to offer the only real
chance of personal fulfilment. Hence the increasing fascination for TV programmes
and magazines about fashion, cooking, holidays and gardening and the boom in the
Do-It-Yourself market. The family and the home have become leisure activities in
and of themselves; they have also become subject to the priorities of the market. All
the commodities which could increase our free time simply reinforce the family as a
unit of consumption not an emotional haven: 'As the advances of modern household
and service industries lighten the family labour, they increase the futility of family
life; as they remove the burden of personal relations, they strip away its affections; as
they create an intricate social life, they rob it of every vestige of community and
leave in its place the cash nexus'.84

In addition, Meszaros describes how the retreat into private life simply increases the
power of capitalism over us: 'The cult of privacy and of individual autonomy thus

fulfils the dual function of objectively protecting the established order against
challenge by the rabble, and subjectively providing a spurious fulfilment in an
escapist withdrawal to the isolated and powerless individual who is mystified by the
mechanisms of capitalist society which manipulate him'.85

Meszaros also makes the point that alienation has deprived us of our ability to have
genuinely human relationships. We are forced to seek compensation for the loss of
our humanity in the limited area of our privatised personal lives, yet this merely
reinforces our alienation from each other: 'To seek the remedy in autonomy is to be
on the wrong track. Our troubles are not due to a lack of autonomy but, on the
contrary, to a social structure - a mode of production - that forces on men a cult of it,
isolating them from each other'.86

It also remains palpable that characteristics of the use process and experience, such as
usability, usefulness and the opportunities and motivations to use, are emergent and
dynamic; they change over time depending upon the circumstances and situations of
use. This can perhaps best be realised by the fact that the usability needs of a novice
user deciding on buying a new digital camera by playing with it in a shop will
eventually give way to the more exacting requirements of the adept expert user who
spends their entire weekend taking photographs. It is only tied by a set of
physiological and cognitive limits, human factors, motivation, or more generally the
‘human condition’, and so awakens another project of science and technology which
has been pursued through time: the augmentation and extension of the senses. This
brings us to augmented forms of reality, virtual worlds, and scientific imaging,
including MRI, the Hubble telescope and so on, which in the McLuhan sense ‘extend human
nervous systems’. So in contextual ideas of usability the focus is on the particular
instances and situations of use, on expertise, exposure and practice, on subjective
and intersubjective experiences of use and design, and on the ordering and configuration of
usage patterns and the building of value and usefulness regarding a product or
service.

CU also calls for a refiguring of the idea of usability and the practice of usability
studies. It asks that we do not treat exploring what is usable as some sort of stand-
alone project, and instead take a more systemic and overarching approach to analysis.
Usability as an experience is always situated, as is the user, and it depends, then, on a range of
individual and personal factors such as legacy exposure to new technologies,
previous experiences, including emotional experiences, of using similar or related
devices, the value and meaning of the device and the task at hand, and so forth.

Usability testing as a practice within the firm generally, and within the particular
exigencies of a given project and technology, is also clearly situated in some way
or manner, and is, in other words, particular. This needs to be accounted for, and
this is where a complexity is then found. To suggest that there is one given way, a fool-
proof set of research heuristics, a set ‘way of doing’, will often result in weak
arguments, or fail to be proven in any satisfactory manner. And so early in the study
presented here the objectives were enlarged to become concerned with wider
systemic effects upon the perceptions and relevance of usability placed against other
key use dimensions such as usefulness [i.e. how certain programs and applications
become useful], usage [i.e. how patterns of use shape and are shaped], and the
instances of use itself [i.e. the particular circumstances and occurrences of using].

Composite traces blur and fuzz the memories of specific episodes, but render
salient the overlapping, prototypical features of a set of exemplars: Each time
an event occurs in a different context (time, place, and so on) a new trace is
formed, but soon there are so many different contexts that none can
individually be retrieved. What is common among the several exemplars is
the knowledge, which we call abstract, but by default, by the massive
interference attached to any individual context. (Crowder 1993: 156)

Such event knowledge structures have been conceptualized as frames (Minsky,
1975), schemas (e.g., Rumelhart & Ortony, 1977), and scripts (Schank & Abelson,
1977). Theories about event knowledge structures generally have as their premise
that people use the personal and/or vicarious experience of an event or events to
build theories about what has happened and/or what will happen if a similar event
were to occur in the future. Abelson (1976) suggested that one type of knowledge
structure, a script, would move, as similar events were encountered, through three

levels of generalization: from episodic, to categorical, and then to hypothetical.

Influenced by science and technology studies, it began to appeal for a consideration
of the social and symbolic influences shaping the experiences and behaviours both
of designers and producers, and of consumers and users [or in this case trialists]. It also
considered the bigger picture which concerned the likes of the economists
Marx and Schumpeter and many others – the rise of new forms of industry, media,
advertising and retailing, which have also given rise to such experiences and behaviours through
creating contexts, features and functions, as well as drawing out characteristics and
suggesting attributes. As such it seeks to define usability’s place as an intrinsic
experiential dimension in the processes of domestication, of integrating technologies
and new technology practices into real homes, lives and lifestyles. As even the UK
Department of Trade and Industry has emphasised: ‘people in their social context
rather than task-centric users should be considered a fundamental source of
innovation’ (Wakeford 2004).

Human-computer interaction – another binary?

Binaries, dualisms and comparatives such as artificial-natural, private-public,
individual-society, human-technology or human-computer are taken for granted
in the manner in which we approach and analyse the world. But they have been forcibly
challenged by some writers in science and technology studies, perhaps most
vehemently by John Law and others of the actor-network school. This influential group
of thinkers infer how the social and the material intertwine and co-shape each other, often
in complex imbroglios, and whilst doing so, at different scopes and scales, beyond
any proper recognition of boundaries, they produce outcomes (see also Castree
and MacMillan). Included in this critique is Bruno Latour’s insistence that we do not
privilege the human actor over the inanimate object in our analysis of what is going on
when devices, scenarios and people do what they do. In a sense it does not matter
that a computer is not a human, and vice-versa, or that we consider the manner in
which they differ; what should be of concern is the interaction and what derives

from it, in terms of product, or influence, or change, or output. Do these ends or
outcomes concord with the aspirations of those who aim to steer projects towards
‘success’? Do these outcomes concord with the aspirations of those who aim to
make their lives easier, safer, more comfortable, entertained, controlled, disciplined
or otherwise enriched – or do they not?

The seminal work Autopoiesis and Cognition by Maturana and Varela was originally
aimed at understanding some of the enigmas of colour vision; the two
researchers developed a definition of living systems demonstrated to be a necessary,
and for the biological world sufficient, definition of life (interestingly S. A. Kauffman
in Reinventing the Sacred starts from first principles and arrives at a similar
definition for a minimal autonomous agent). Simply, an autopoietic system
structurally embodies a web of linked interactions capable of sustaining itself within
a boundary of its own creation which is thermodynamically open (to food and
energy), but operationally closed. That last point means that the organism is only
disturbed by its environment, but the resulting actions are determined by its internal
structure alone; the equivalent is true for “the environment” from the outward
perspective. This is a very complex area (for deeper understanding see The Tree of
Knowledge by Maturana and Varela, Mind in Life by Evan Thompson, or A
Systems View of Life by F. Capra and P. Luisi) but a couple of key take-home
messages are important here. Firstly, the process of life at all scales of organization is
fundamentally a process of knowing and being known... life is a cognitive process.
Secondly, the biosphere is a fractal tapestry of intertwined and interacting nested
networks of networks of autopoietic systems over many orders of magnitude. To
navigate such a tangled web, all organisms, through recurrent interactions and mutual
structural coupling (systemic memory arising from contingency-based history),
develop simplifying heuristics so that meaning (with respect to the organism’s
internal autopoiesis) can be obtained in real time. The idea is that autonomous
subjectivity (feelings, emotions, desires, intentions), through recurrent interaction,
leads to instincts. As humans evolved increasingly sophisticated patterning abilities,
leading to symbolic languages and the birth of the metaphor, our meaning heuristics
could be directly passed on to younger generations, honed collectively by social
groups, and themselves become subject to selective forces (at a higher scale of
organization). From this perspective, the author traces the evolution of major cultural
metaphors and resulting cosmologies that have shaped human history since the
agricultural revolution.

Actor-network theory may have had a profound influence on many STS studies but
does not dominate as an approach. Other studies have been less radical or semiotic in
approach, perhaps best described as more traditional in their sociological and
economic analysis. Molina (2003) is one author who has also addressed the issue of
how intangible, soft, social elements and more tangible, hard devices and
components give rise to an entire constituency of interrelating events, negotiations,
alliances, disputes, correspondences and makings over the course of a project, and of
whether the process of innovation is ‘technology-driven’ or ‘problem and/or opportunity-
driven’. In the former, constituency building is fundamentally dominated by the pursuit of
success for the technology in the belief that it will indeed satisfy an explicit or latent
demand. In the latter, constituency building is fundamentally dominated by the identification
of a problem and/or opportunity that can be tackled by an existing or new
technology. The difference between a constituency and an actor-network is rather like
the difference between chess and von Neumann’s ‘real games’.

Some of these other approaches have been very broad and grand, quite happy to
delineate between terms, for instance where ‘science’ and ‘technology’ are used in
very generic, abstract or policy-level manners. If we banned the words ‘science’
and ‘technology’, then I wonder what we would speak of in their place? These studies
may address issues such as the impact of national systems of innovation and the
like, and how national governments succeed or fail at creating support mechanisms and
policies for innovative activities and knowledge creation. At the more meso-level,
some studies focus upon particular institutions and firms, many of them focussing
upon what and how they produce material and knowledge artefacts in their own
interest. Others still explore the very intimate relations between individual human
beings or families living in homes, their agency in fostering, creating and maintaining
natural and human-made environments, and detail what they devise and make. This
last category can be much particularised in its focus, showing how particular
technologies situate within particular environments and within particular social
groupings and demographics. It is also in part the influence on the present study and
its focus on the televisual and on making television interactive for consumer-users in
their private homes.

Various conceptions of the human condition

All these studies share the idea of human beings as possessing the agency to control
or influence fate and environment through devices and tools. In effect they consider
human beings as homo faber, the kind of human that Henri Bergson referred to in his
Creative Evolution (1907). He defined intelligence there as the "faculty to create
artificial objects, in particular tools to make tools, and to indefinitely variate its
makings." Under post-enlightenment ideologies, i.e. Marxist or capitalist, the very
purpose of humankind is founded upon what he or she can make or produce, their
contribution to society [Karl Marx, Kenneth Oakley 1949, Max Frisch 1957, Hannah
Arendt]. This is a time when maker culture is resurging, inviting people to empower
themselves through making and tinkering. Although we have opened the present
work with a story which challenges our ability to make all that surrounds us, this does
not abate a growing movement bent on mass attempts to ‘hack’ everyday technology
and make either something new or something different with it. In a sense this view
of intelligence is now rife as ‘open source’ software and hardware diffuses into the
market, and there is a marked shift to participatory practices taken up and mimicked
by various enterprises and organisations across disparate industry sectors and even
public institutions and government. Elizabeth Sanders notes that, in participatory
practice, ‘the roles of the designer and the researcher blur and the user becomes a
critical component of the process’. This has the potential to render the ‘deliverables’
of design ‘more meaningful to the people who will ultimately benefit from them’
(Sanders 2005: p.27). By releasing code and compilers, Linux and beta-releases and
source code, we open the prospect of not only a few in-house developers working
out solutions, but potentially millions of problem-solvers, many of whom will

perhaps not contribute much to the solving, but by sheer brute force of statistics and
populations, some will make a difference. Realising the diffusion of millions of
Arduinos, Raspberry Pis, Intel Galileos and sensors opens the chance of something
cropping up that wouldn’t be found in a lab, no matter what the collective IQ or
smartness of the staff. It is a different model to what was referenced by Kevin Kelly
when he spoke of the distributed nature of ‘jellybeans’ – microchips in everything. This
was a more incipient view of distributed communication and processing power,
where small dedicated chips were placed in objects by manufacturers. The new
model is the diffusion of processors and sensing technology so that users, hobbyists,
children in schools and so on can consciously develop or toy with the ‘internet of
things’ (IoT).

Homo faber is often considered in contradistinction to homo adorans – the pre-
enlightenment ‘worshipping man’, whose entire destiny and agency came under the
jurisdiction of God in all he encountered, experienced and thought. Free will was not
granted for the purposes of bettering one’s lot per se, or expressing one’s
individuality or taste, but for showing that, under their own volition, people
devoted and developed themselves in accordance with religious teachings and given
interpretations of scripture. The ultimate purpose of humankind was then to worship
God. That ‘something beyond’ may translate to today’s internet-driven hive or
crowdsourcing mentality, and tied to this idea of ‘man the maker’ is homo ludens –
‘man the player’ – or another flavour, homo investigans, ‘man the investigator’. Barrett
(2010) puts forward that: “Humans have always poured enormous effort into solving
puzzles and playing.”213 Currently emerging estimates for the numbers of people engaging in digital
game playing are nothing less than phenomenal, much of this driven by games being
made easily available to people [it seems like every other internet pop-up is an online
game], and whose playability is enhanced by digital devices and network
communications. In 2011, games researcher Jane McGonigal made an astonishing
claim: “5 million gamers in the U.S. are spending more than 40 hours a week playing
games — the same as a full time job!” 214 An article in the Guardian newspaper

213 Barrett, Deirdre, Supernormal Stimuli: How Primal Urges Overran Their Evolutionary Purpose.
W. W. Norton & Company.
214 http://www.huffingtonpost.com/jane-mcgonigal/video-games_b_823208.html

reiterated this when it reported that “as a planet we spend 3bn hours a week
playing online games”.215 Regardless of the veracity of these claims, it would be
difficult indeed to ignore that an immense amount of time and human consciousness is
focussed upon playing, which is now largely facilitated by digital technology and
communications. Among the other facilities opened by the profusion, capacity and
immediacy of digital communications are new forms and intensities of
social interaction. Along with social networking and messaging, SMS, email and online
dating services lies the ability to play against unknown, unseen ‘others’ online, on an off-the-
cuff, ad hoc and casual basis, in a totally unique manner which was not conceivable
previously. Again, the crowdsourcing potential – the fact that millions of players
could become involved in solving global crises – is something that Jane McGonigal toyed
with in one of her projects, World Without Oil, where insights were generated. It
goes well beyond the kind of democratic experience afforded by the traditional
political voting system to truly channel and focus the real concerns and ingenuity of
millions of people.

‘Serious play’ has come to the fore, acknowledging that curiosity, our ability to ‘see
what happens’ if we do this or go there, is learning’s built-in exploration bonus. 216
Humans evolved to depart from the safe and known, leave the received or accepted
way, try things out, get distracted, take risks and generally look like we are
wasting time. However, at least since Thomas Kuhn, whom some consider the
patron saint of STS, we have come to understand that science and scientific thinking,
as a social and cultural activity, only progresses through those who may be seen as
wasting time today but are contributing to the learning algorithms which will
become useful tomorrow. What we learn by chance today will come to fruition
when background or foreground shifts. When we consider that business, as well as
science and technology development, always entails risk, then we begin to understand
the place of play and curiosity as essentials in the formula for success.

215
https://www.theguardian.com/technology/2014/jan/25/online-gamers-solving-sciences-biggest-problems
216
‘Serious play’ was the original title of a book by Michael Schrage and referred to the kind of
tinkering done in R&D of new products. It has come to be used in different contexts, such as
referring to the gamification of learning and business.

However, other human motivations and drives have inspired innovation and change
beyond our ability and fascination with tinkering and making things. We cannot, for
instance, ignore the presence of homo aestheticus, ‘aesthetic man’ (Derrick Jensen
2006, or Dissanayake (1992), who uses the term to suggest that the emergence of
art was central to the formation of the human species). How can we ignore how the
things we live with, consume and use look, and how they bring beauty to our lives? This
takes us back to the distinction of ‘form and function’ in the design of things.217 Another
human distinction which is relevant to the study and work here is homo domesticus,
‘domestic man’ – the human conditioned by the built environment; the urban
dweller, dependent on civilization and detached from the natural world [Oscar
Carvajal 2005].

Finally, last but not least, there are of course two more categories: homo economicus,
‘economic man’, humans cast as rational and self-interested agents, and homo socius,
‘social man’, man as a social being – inherent to humans as long as they have not
lived entirely in isolation (Peter Berger & Thomas Luckmann, The Social
Construction of Reality, 1966).218
All these variations of humankind overlap and interweave, and
are as real today as they were 2000 years ago. They are perennial. STS covers many if
not most of these variations of the human condition when it explores what happens
when firms and entrepreneurial individuals commit to exploring, making or
developing things, and how other people – citizens, regulators, committees,
consumers, users and participants – act and do things with and around them. This is what
makes the field eclectic, diverse and rich. For instance, through the classic STS work
217
Please note that I am not getting drawn into the debate regarding whether form follows function
etc. http://www.e-architect.co.uk/modern-architects
218
One other significant variant here is homo investigans, ‘investigating man’ – human curiosity and the
capability to learn by deduction (Werner Luck 1976). Our curiosity has us doing utterly unproductive
things like reading news about people we will never meet, learning topics we will never have use for,
or exploring places we will never come back to. The roots of our peculiar curiosity can be linked to a
trait of the human species called neoteny (Robert DePaolo 2007). Our lifelong curiosity and playfulness
is a behavioural characteristic of neoteny. Neoteny is a short-cut taken by evolution – a route that
brings about a whole bundle of changes in one go, rather than selecting for them one by one.
Evolution, by making us a more juvenile species, has made us weaker than our primate cousins, but it
has also given us our child's curiosity, our capacity to learn and our deep sense of attachment to each
other. Our extended childhood means we can absorb so much more from our environment, including
our shared culture. Even in adulthood we can pick up new ways of doing things and new ways of
thinking, allowing us to adapt to new circumstances. Even the best learning algorithms fall down if
they are not encouraged to explore a little. Without a little something to distract them from what they
should be doing, these algorithms get stuck in a rut, relying on the same responses time and time
again.

offered by the likes of Trevor Pinch and Wiebe Bijker, or foundational compendiums
such as the Social Shaping of Technology, or through the actor-network theory of
Bruno Latour, John Law and Michel Callon, the reader comes to view scientific
endeavour and technology development as a much broader set of imperatives than
simply a gathering or a connecting together of apparatus, theories and know-how.
We are guided through a much enlarged scientific and technological world, beyond
the materiality of test tubes, strange machines, dials, levers, scopes, screens, nuts and bolts,
gears, engines, interfaces, couplings, valves and CPUs put together in configurations
that make money, save lives, win grants or draw prestige. It includes
communications, the use of obscure and technical language, semiotics, symbolism
and meaning-making and meaning-creating. STS shows how some things work for
some of us, some of the time, while for others they simply come to exist as part of
the scenic environment, a partially conscious or unconscious background to lives or
experiences at work, in transit or at play. For others there may be very active or
very conscious resistance towards consuming or using things, and for others still a
complete failure to be able to access things at all: a lack of
availability removes facilities, products and services from the map of experience, or
there is a lack of material resources and capital to subscribe or purchase, or a lack
of knowledge, user support or training.

It is beyond the scope of the present work to reconnoitre all the conceivable origins
and alternative models and theories of consciousness in relation to life, technology
and everyday life; however, suffice it to say at this point that, following Polanyi, some
knowledge is explicit and able to be explicated, and some knowledge is tacit and
implicit, and this most definitely relates to the presence, the sense-making and the use of
technology. People are conscious of particular or certain artefacts, devices, services,
etc. Or they may be partially aware of them, or they may have no awareness of them at
all. I hold that this is as true at the individual level as it is at meso, group,
organisational and institutional levels, as well as at macro levels in terms of entire
cultures or societies. Consider that a Rolls Royce car forms part of the environment
of things for many of us, but there are blocks and barriers, not least of them of an
economic nature or even an aspirational nature (I would prefer a Lamborghini),
which prevent us from accessing or using it. How we use things, when, where and
why we use them, how often we use them, under which conditions and according to
which motivations, all of these and more characterise the use of devices, artefacts,
decorations, tools, services that we live with on a daily basis.

An angle grinder, while making a probable contribution to the building you live or
work in, may not be a familiar device, while at the same time serving as a
familiar sight as you idly gaze at workmen engaged in construction on a building site
on the way to work. You may have an idea of what it is used for while never even
having the cause or inclination to handle such a device. Many technologies, many
tools, are left in the hands of ‘experts’ and, again, they form only a scenic part of
everyday life. Even domestic devices like toasters or toastie makers sit idle in
kitchens or kitchen cabinets until the inclination or idea arises to use them, even
though they required sophisticated technologies, material processes and expertise to
make them. Unlike the manner in which we discuss ‘technology’ in STS, in everyday
life people hardly address devices, tools and artefacts as generic phenomena; rather
they think, as Heidegger or Gibson illustrated to us, of the affordances or purposes of
things, or the activity they allow. As Pea (1991, p.51) notes: “‘Affordance’ refers to
the perceived and actual properties of a thing, primarily those functional properties
that determine just how the thing could possibly be used. Less technically, a
doorknob is for turning, a wagon handle is for pulling.” While I stress again that I will
not be discussing consciousness at length, I will be returning to this idea later in the
discussions offered regarding the use of technology. Suffice it to say here that
foundational in this discussion is the analogy of the hammer used by the philosopher
Heidegger when he spoke of ‘breakdown’ and how we only become aware,
conscious, of the technology when it fails to perform its designated or, rather,
assumed or presumed function. It is interesting to acknowledge that Heidegger used
a familiar example of a simple technology, the hammer, one which is likely to be
known by most people, regardless of where they live.

[Figure: a two-axis mapping of consciousness. The vertical axis runs from the conscious (perception/experience, short term) to the unconscious (memory, intuition); the horizontal axis runs from collective consciousness (shared meanings and values – received, picked-up or learned; socio-cultural) to individual consciousness (personal experiences; individual and cognitive), with the collective unconscious and individual unconscious beneath, linked through perception and action and through intersubjectivity.]
Participation, dialogue, interaction, and open source

Democracy as compared with other ways of life is the sole way of living which believes
wholeheartedly in the process of experience as end and as means; as that which is capable of
generating the science which is the sole dependable authority for the direction of further
experience and which releases emotions, needs and desires so as to call into being the things
that have not existed in the past. For every way of life that fails in its democracy limits the
contacts, the exchanges, the communications, the interactions by which experience is
steadied while it is also enlarged and enriched. The task of this release and enrichment is one
that has to be carried on day by day. Since it is one that can have no end till experience itself
comes to an end, the task of democracy is forever that of creation of a freer and more humane
experience in which all share and to which all contribute. Creative Democracy – The Task
Before Us By John Dewey Taken from: http://www.beloit.edu/~pbk/dewey.html

Terms such as ‘democracy’, ‘participation’ and ‘dialogue’ have superseded
‘interaction’ and even ‘open source’, and now come to dominate the discussion that
surrounds the so-called socially engaged practices of the late 1990s and the 2000s,
across a wide range of fields from art to the social and material sciences. All of
these terms also have significance in terms of just how conscious people are
regarding purposes when thinking of purchasing or otherwise acquiring technology,
and with respect to the ideas governing their use.

These terms surfaced most noticeably in the fields of interactive media and
computing design, and with particular reference to the employment of social science
research and theories to inform the design of digital interactive media and
communication systems, interfaces and devices; within the social sciences the
particular focus has been on interpretative and reflexive approaches such as ethnography.
According to such rhetoric, the software developer, producer, artist or scientist is an
engineer not only of devices and their functioning – i.e. their operability and
interoperability – but of situations and happenings; put simply, they are setting up
contexts in which passive spectators and users become active participants and co-
creators within the worlds or spaces or data that the system and the devices enable.
An engineer in this frame is not simply a dabbler who connects pieces of machinery
together to produce a result, but a heterogeneous talent boasting considerable
financial acumen, adept and deft at promotion and publicity, and with a talent for
extrapolating how things are done in the real world and how this can be better
managed, replicated, or even replaced by making it digital. They are as adept with
mathematical formulae and CAD drawing as they are with people skills and pitching
new ideas. The social scientist working within this mindset shifts their focus from
some imaginary, aloof perspective on society, perhaps the proverbial ivory tower
view, to considering their place in the social, communicative and symbolic processes
unfolding in a given research and activity situation. Much of the work of the likes of
actor-network theory is to do with the ‘ruthless application of semiotics’ or the
‘dissolution of binary opposites’; included here must be the relationship of
the researcher to that which is being researched.

“Truth and falsehood. Large and small. Agency and structure. Human and
non-human. Before and after. Knowledge and power. Context and content.
Materiality and sociality. Activity and passivity…all of these divides have
been rubbished in work undertaken in the name of actor-network theory”
(Law 1999, p.3)

In a sense it is a quantum view of the relationship, and largely owes its origins to
Thomas Kuhn, although I recognise other legacies which are also grounded in
systems theory, and in the structuration theory of Anthony Giddens (1979).

Increasingly, these actors are shifting scale from rethinking artefacts (whether
buildings, posters, policies, televisions or mobile devices, even toasters) towards what
may be considered whole-systems thinking. In the art world, for instance, one of the
resonant themes of the 1990s was relational aesthetics. Bourriaud argues that art of
the 1990s takes as its theoretical horizon “the realm of human interactions and its
social context, rather than the assertion of an independent and private symbolic
space” (RA, p. 14). In other words, relational art works seek to establish
intersubjective encounters (be these literal or potential) in which meaning is
elaborated collectively (RA, p. 18) rather than in the privatized space of individual
consumption. Rather than a discrete, portable, autonomous work of art that
transcends its context, relational art is entirely beholden to the contingencies of its
environment and audience. The beginnings of internet culture, instant
communication, and the instantaneous gain and loss of celebrity and fortune, gave
rise to a new sense of meaning in creative work and creative products. These actors are also
becoming more poignantly aware of meaning with respect to objects, services,
consumption and usage219. Semiotics and semantics also play a role in this, as a
product or service may be chosen not only on its functional basis or on how it looks,
smells and feels – the so-called traditional form and function distinctions – but also
upon its perceived impact on the environment and how one may dispose of it
(ecological issues), and upon its impact on the lives of others (by whom and where it was made, and
under which conditions – fair trade, globalisation and localisation,
environmentally friendly, renewable, etc.). This changes the scope and scale of
what designers must have in mind as they approach and make decisions. Arguments
219
Krippendorff, K. 2006. The Semantic Turn: A New Foundation for Design. Boca Raton, FL: CRC
Press.

and rationalisation may include a much wider set of imperatives, exigencies,
allowances and constraints than in the past. These approaches are also marked by
what the Japanese term gemba, which loosely translated means ‘in the place’. In
the parlance of theorists it comes under various guises, but is mostly referred to as
the ‘situational’ or ‘embodiment’ perspective. The ethnomethodology of Schutz and Garfinkel, or the
practice of actor-networks, stresses the ‘situation’, ‘embodiment’ and the
communications which take place in situ, in the moment, while in the act of doing or
performing, while participating, while consciously engaged, ‘in the flow’, naturalistic
and so forth.

In many respects it is what the documentary director dreams of: when the subjects or
participants under study lose all consciousness of the camera, or of the fact that someone is
looking, or that they are under the Lacanian ‘gaze’, and they behave as they would if
they were simply acting out their thoughts and feelings au naturel. Just how much of
a contaminant, an actant, or a mitigating presence the ethnographer is has been
of interest to reflexive social science. The dissolution, or at least serious interrogation,
of the value of grand narratives in society and education, especially with respect to
their claims of authenticity or representativeness of ‘what’s going on’, features in
much of the writing of this new turn in thinking about the human condition. Most
importantly it has drawn us, constructively, to focus upon how the immaterial,
relational, communicative and symbolic aspects of material things, devices and
objects come to shape everyday life and its experiences. Most definitely, imperatives,
exigencies, allowances and constraints have always shaped design decisions and
subsequent design products. A widely cited prescription from the Finnish architect
Eliel Saarinen has it that the designer should: “Always design a thing by considering
it in its next larger context - a chair in a room, a room in a house, a house in an
environment, an environment in a city plan.” 220
The scope and scale of research projects into any subject remain as important as they
were in the carefully crafted experimental study. This has concerned some people,
including myself, when considering approaches such as actor-network theory and other
kinds of interpretative social science. If we equate ‘seamless webs of significance’
with the idea that we

220
Eliel Saarinen (The Maturing Modern, 1956).

begin with interactive television and end with a butterfly beating its wings in a
Kansas field, then we are perhaps spreading our own wings far too wide and run the
risk of insignificance in our reports. This was perhaps the danger of post-modernism
and relativism in their tendencies towards becoming regressive. We need categories
and focus; it was the object of taxonomies to sort the world out (Leigh Star) and
make it containable, neat and in its place. The real world and its complexities need
to be smoothed. And this has been a driving force of humans since the first observations
of the stars and the correlation of man-made structures – from standing stones to
pyramids to well-kept and immaculately manicured gardens which ‘repel’ nature
through a range of carefully thought-through and researched means. It is the goal of
statistics to reveal the hidden codes which provide insight into the immutable laws of
anything from extra-terrestrial communication to the building blocks of the human
mind and personality. Order and imperatives change according to scope and scale.

If we shift scale from 10^1 to 10^5 we shift from a given artefact to its place and
identity in the greater scheme of environmental thinking, serviced and made ever
more conscious by the rise of networks and globalization. It is the same in terms of
time and temporality. What is imperative in the next 5 minutes may not be in any way
relevant viewed against the next million years. An interesting experiment called the
Long Now has been designed with the intention of drawing attention to the short-
termism which characterises much of modern life. Proponents of Big History
have likewise attempted to bring to cognition and awareness the epic and relative
dimensionality of time and of human and geographical history. Both the Long Now and
Big History are attempts at a corrective to a kind of modern view that we have, in
a relatively short time, worked out how everything operates or works. This couples with
threats posed by business and investment realities which aim at making a ‘quick
buck’ or producing a ‘fast turnaround’ at the expense of developing quality, long-lasting
products which can be maintained. With this in mind it is impossible to design things
in isolation from the larger systems that they live within — whether those are
systems of resource extraction, manufacture, distribution, consumption, or waste.

Since our technical systems are wholly mixed up with our natural systems, that

creates additional levels of complexity. In order to design within these confounding
contexts, we need to be able to scale up and scale down as we design: to consider
both the granularity of the things we are designing as well as the ever larger contexts
within which they exist. For instance, a designer considering home entertainment
may start at the level of 10^1 and consider only the discrete television set. But a
television set on its own is pretty much a redundant box. As we zoom out in scale, to
consider all household activities, we realise that it is essential to think not just of the
television, but of its relationship to other, potentially competing modes of pastime,
communication and entertainment – board games, music players, mobile phones and
devices, computers – that may determine the speed and feasibility of how people
amuse and educate themselves. It may also mean considering how the television can serve as a
hub for controlling or monitoring other devices. Zoom out to 10^3, to the street or
neighbourhood, and then we understand the dynamics of the neighbourhood, and the
impact that local facilities have on liveability, public health, or retail viability. Do
people in this middle-class neighbourhood value book-reading over television use;
can a sharing system benefit the use of lawnmowers; can warnings be passed through the
neighbourhood watch, and so forth? At 10^4 the metropolitan area comes into view, and so do
issues of local versus mass media, local things for sale, local fetes and other activities
and events, schools and so on. At 10^5 we confront national policies that subsidize and
regulate connectivity and the suburban sprawl that often results, frequently at the
expense of urban density, as well as telecoms and broadcasting licensing and regulation. At
10^6 we encounter unexpected connections between television use and a national
obesity epidemic. At 10^7 we realize that our per-capita ownership rate of devices,
all powered by fossil fuels, is unsustainable as a global lifestyle export. The point
here is not to get mesmerized - or worse, paralyzed - by the vertiginous ascent
through layers of compounding complexity and potentially limitless connections and
linkages. This amounts to what John Law (2003) would describe as the mess of
research. Instead, being aware of these layers instructs us that before we design or
create policies, we must try to take account of all of these layers (and there are many
more) if we are going to intervene meaningfully into the seemingly simple and local
issue of home entertainment. An artist, designer, even a scientist or a comedian who can do this will
stand a better chance of making at least more relevant observations, if not better

products or clever and witty jokes.
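
To make the ‘zooming’ exercise above a little more concrete, the sketch below encodes the same walkthrough as a simple lookup that a design review could step through deliberately, scale by scale. It is purely illustrative: the scale labels and concerns are paraphrased from the paragraph above, while the structure, function and names are my own and entirely hypothetical.

# Illustrative sketch: the 'powers of ten' walkthrough above as a lookup table.
# The labels paraphrase the text; the names and structure are hypothetical.
DESIGN_SCALES = {
    1: "the discrete artefact: the television set itself",
    3: "street / neighbourhood: competing pastimes, shared facilities, liveability",
    4: "metropolitan area: local versus mass media, local events and schools",
    5: "national policy: broadcasting and telecoms licensing, regulation, subsidy",
    6: "unexpected national-scale effects: e.g. television use and public health",
    7: "global scale: per-capita device ownership and energy use as a lifestyle export",
}

def zoom(from_exp, to_exp):
    """Yield the design concerns met while zooming from 10^from_exp to 10^to_exp."""
    step = 1 if to_exp >= from_exp else -1
    for exp in range(from_exp, to_exp + step, step):
        if exp in DESIGN_SCALES:
            yield exp, DESIGN_SCALES[exp]

for exp, concern in zoom(1, 7):
    print(f"10^{exp}: {concern}")

The point of writing it down this way is simply that the table forces the designer to visit every scale in turn, rather than stopping at whichever level feels most comfortable.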

The challenge, then, is to deploy the necessary conceptual analytics to figure out at
what scope and scale it is optimal to intervene or study, or what theoretical apparatus
you will provide to frame the phenomena under focus, in order to produce
‘convincing’ results. But it is also here that we run the danger of veering back to
whence social science and its imperial view of ‘the social’ have come: veering back
into the imperious view that somehow, like a soothsayer or prophet, social scientists
have a privileged view, one that nobody else has or could have [who has not been
trained], on the ‘truth or reality of what is going on’; that their theories and
prescriptions need to be adhered to if the individual or firm stands any chance of
‘getting it right’; that by reading the latest Harvard University Press release or that
Harvard Business Review article they will become better managers or designers
or make more profits; that by applying this or that new management fad, or following
these rules of thumb in usability studies, they are guaranteed to turn their business
around and be more effective and efficient.

In the STS treatment of technology, devices, tools, artefacts and materials are lent
markedly different accounts regarding the inspiration, operation, motivations and
exigencies of their design and manufacture, certainly compared to the commercial
producer’s accounts, or those of consultancies and others who offer prescriptions
regarding how to do things, or what to look out for, in design, management or business.
Also, in the employment of methods, there may be completely different approaches to
a given phenomenon, be it social, cognitive or material, when it is a
matter of academic study as opposed to being a task or an object in industrial
practice. Industrial practitioners may for instance prefer analytical ‘tools’ and
‘techniques’, prescriptive methods [such as the house of quality, Gantt charts,
balanced scorecards etc.], as opposed to qualitative and quantitative research methods
such as ethnographic studies, ethnomethodology, grounded theory, and the raft of
statistical methods for dealing with the qualities of something. The terms ‘tools’ and
‘techniques’ are fitting analogies and correlates in the industrial effort to the term
‘methods’. They, like hammers and lathes, or software components, are things that

can be bought off the shelf and ‘plugged into’ processes and activities. They are as
persuasive and settling as pie charts and quantitative statistical data, which provide
easy-to-digest, implementable, unarguable, unambiguous information through
which to make decisions. This is not to say that data does not present valuable insight at
the meso and macro levels, which are difficult to observe at the individual level; this
of course goes back to the very origins of sociology through the work of Durkheim in
his analysis of suicide. But dwelling, unreflexively, upon the big picture, or the big
data, emphasises the god’s-eye view, and opens the prospect of category errors and
mistaken or erroneous models and views of what is happening or what’s going on at
that particular scope and scale. Simulations are "ideal for developing an
understanding of big ideas and concepts - those things for which experience alone
can deepen understanding." In this sense they are useful in a similar way to data
and statistics.

Adam Smith in The Wealth of Nations tells us that: “Consumption is the sole end and
purpose of all production; and the interest of the producer ought to be attended to
only so far as it may be necessary for promoting that of the consumer.” 221
But as
producers, designers and scientists live on the same planet, within the same societies
and cultures, and share very similar traits to consumers, users and citizens, products
and scientific breakthroughs derive from the same cultural norms and conventions.
There is room for drawing correspondences: consumers, users, designers, producers,
managers, investors and academic researchers should find some common ground on which to
explore ‘convincing’ arguments addressed to genuine, actual and real issues, problems and
experiences. The relationship is not symmetrical in terms of focus and dwelling.
Consumer-users may give no more than a passing thought to who
made the object on the shop shelf, why they made it, what problems they faced,
who funded it and so forth. However, the consumer-user is a focus for the producer
and designer, even if reified and often idealised. Just how much the
producer and designer are consumed with their product, its sheen, its brilliance, its
seductiveness, its allure, its mystery, and so forth, and how much this is empirically
sensed by the consumer-user, is the realm of post-manufacturing market research, and
of course of feedback and the predilections of buyers, distributors and wholesalers, and
ultimately sales figures, clicks and ‘likes’.

STS is interested in how and why these phenomena come to take their place in
our material and social world, and how they come to shape and characterise new
social, mental and communicative worlds of enterprise, endeavour and use. The STS
accounts typically rest upon contrasting with, or contradicting, the often optimistic,
condensed or myopic press releases and the enthusiastic histories, sometimes to the
point of quirkiness. They compare with the grandiloquence of those histories of
scientists, inventors and their science, and the flamboyant accounts of technology
‘breakthroughs’ and ‘advances’. Nevertheless even they obey a general tenet, a hidden
rule or convention typical of all science and technology
221
Smith, Adam. 1776. An Inquiry into the Nature and Causes of the Wealth of Nations. London:
Methuen & Co (Book IV, chapter 8, 49)

writing, in that it must surprise, that it must stand out, that it must inspire awe and
marvel, and that it must compel, contradict or in some way be challenging or
controversial. Otherwise we could ask, what is the point? Just what is the reveal;
what is the relevance to the wider economy, or to practitioners, or to general public
interests, who do not just analyse but have to commit time, money and effort to make
things, connect things and configure things? Why should I buy this book or
download this journal article?

Certainly, in inventive and scholarly work there are acts of intrinsic motivation:
people tinkering, D.I.Y. cultures, hobbies and pastimes and so forth. Science was
once the preserve of those gentlemen in wood-panelled rooms, men of letters, who
identified themselves with pondering on life, nature and the meaning of things. They
may deny the sense of economic purpose, and may be considered more as consumers
than out-and-out producers. What is the point of the STS accounts to other scholars
and other disciplines? God forbid that the accounts are mundane, become overly
common-sense and ordinary, offering little to no analysis or tips for best practice, or
little shock value, or background/foreground shift.

Even those who stand in contradistinction to what may be considered “the
innovation-centric account” of technology, such as Mike Michael and David
Edgerton, surprise us by drawing out the difference between technological
significance and technological novelty. “…the term ‘mundane technologies’
connotes those technologies whose novelty has worn off; these are technologies
which are now fully integrated into, and unremarkable part of, everyday life. To
study mundane technologies are thus to explore how they mediate and reflect
everyday life, how they serve in the reproduction of local technosocial configurations
… So, if the exotic is about change, transformation, production, becoming, the
mundane is about homeostasis, reproduction and being.” 222 Edgerton says that we are
wrong to associate technology so strongly with invention, and that we should think
of it, rather, as evolving through use. A “history of technology-in-use” yields “a
radically different picture of technology, and indeed of invention and innovation.” It

222
(Mike Michael; Between the mundane and the exotic, p. 131-132: Sage 2003)

certainly casts a very different historical cast of characters: instead of a world
increasingly built of bits and bytes, Saturn V rockets, self-driving cars and the Higgs
boson, we have more condoms, corrugated iron and Kalashnikovs.

“If you go to certain countries in the so-called undeveloped world and ask
them if they have any technology they will ruefully answer 'no'. By which
they mean they haven't invented anything important. I would maintain that's a
very odd definition of technology because it's neither a definition that looks at
what matters, technologically, nor does it focus on historically significant
inventions.”223

The point here is that adopters of ‘old and proven’ technology in developing
countries, where the emphasis is upon maintenance and repair rather than invention,
also have a deeper understanding of the technology. This comes through its
employment in essential processes of everyday life, and through other issues such as
value-cost benefits and tolerances, beyond the service light flickering on demand on
a two-year-old vehicle’s dashboard indicating, nay, dutifully reporting to the driver
that it is time for new filters, an oil change and suchlike.

Stephen Doheny-Farina showed us that the technical and commercial processes of
turning technologies into products are, in significant ways, all communication
processes (Doheny-Farina, 1992). Technology transfer – moving knowledge and
expertise from one organisational entity to another – is but a convoluted process of
social gatherings to chat about what development work can be done, can’t be done,
won’t be done, and should be done [by someone else? – how many European Union
funded technology and research projects are like this?]. But these communications
are always studded and embellished, and especially so in the case of new technology,
which nearly always carries distinctive meanings and connotations. This is especially
so when garnering support from senior management for a project, doing the rounds
with Silicon Valley venture capitalists and other potential funding agencies, and
when releasing quarterly performance reports to the public or showing off at a trade
show and so forth. Just as the way we dress or adorn our homes gives some
indication not only of our tastes and distinctions, but also of us, our personality and who

223
https://www.theguardian.com/technology/2006/aug/01/news.g2

we are, so it is with new technology.
“A so-called new technology is the object of fascination, hyperbole and
concern. It is obviously a field onto which a broad array of hopes and fears is
projected and envisioned as a potential solution to, or possible problem for,
the world at large. Technological development is one of the primary sites
through which we can chart the desires and concerns of a given context and
the preoccupations of particular moments in history. The meanings that are
attributed to new technologies are some of the most important evidence we
can find of the visions, both optimistic and anxious, through which modern
societies cohere.” (Sturken and Thomas, p.1)

M.I.T. psychologist Sherry Turkle’s mid-1980s ethnography (for instance in Turkle
1984) was path-breaking, as it concerned itself with how ‘the micro’ and other
devices such as games machines were fashioning discourses even in the very young.
She came to term them ‘evocative technologies’ (a point that she took up later in
Evocative Objects: Things We Think With, 2008) or the ‘subjective computer’, noting how
computer metaphors (e.g. the mind as a computer program) disseminate from
technical cultures to wider audiences, including young children [who were later to
become the millennials who had grown up digital]224.

“…experiences with computers become reference points for thinking and
talking about other things. Computers provoke debate about education,
society, politics, and, most central to the theme of this book, about human
nature. In this, the computer is a “metaphysical machine.” (Turkle,
1984/2005, p.21)

Turkle, who has gone on to give us several other deeply insightful and original
works regarding the shaping of the human condition in relation to technology,
including Life on the Screen (1995) and Alone Together (2011), raises a central question
which is of lasting concern to STS scholars, especially those who advocate ‘citizen
science’ and its derivatives.

“How do ideas born within the technical communities around computation
find their way out to the culture beyond? This is the province of the sociology
of knowledge. Ideas that begin their life in the world of science can move out;
they are popularized and simplified, often only half understood, but they can
have a profound effect on how people think.” (Turkle, ibid, p.26)

224
Don Tapscott

Turkle’s work is important as she attempted to prise us away from a direct focus on
technology and what we do with it, or to it, to consider more profoundly, and at a meta-
level, what it actually does to us. While she may not have been the first to draw
attention to the mutual and co-shaping aspect of technology and the human condition,
she certainly showed convincingly how meaning-making and communication,
technical rhetoric, is conditioned by technology use and conceptualisation.

While more nuanced, convincing and commonsensical than the great-man accounts
of Edison, Bell, Logie Baird and Steve Jobs and ‘how they changed the world’, the
work of the likes of Sherry Turkle, and the narratives provided by STS, also sharply
contrast with the short, to-the-point, optimistic reports offered by the publicity
departments of technology companies and their effervescent press releases. Most
newspapers have drastically cut newsroom staff in recent years, meaning fewer
reporters and assignment editors remain to unearth story ideas or sort through press
releases.225 What this means in practice is that well-written press releases are
subsequently transmitted more or less verbatim to the wider public via condensed
media reports, without attempts to savour and digest them, and so it is welcome to read
alternative accounts, not necessarily more or less optimistic, or neutral, but certainly
alternative and in depth. But what is missing here is the sense that research is writing
and writing is research, i.e., that ‘… all our research activities are rhetorical in
nature’ (Doheny-Farina, 1993: p. 254). So this piece may be considered my
contribution to the corpus of the illustrious, fabulous and fantastic tales of technology
and enterprise.

This arrangement also begs another deeply troubling question for social research in
business and design. In a sector which has been concerned to the point of obsession
with commercial secrecy and intellectual property (IP), how many
management, business and innovation researchers come to be fobbed off in their
sometimes hard-won research visits, becoming almost passive recipients of the
organisational voice, the carefully vetted company spokesperson and the ‘party line’?
The researcher, as a dutiful social scientist, approaches wishing to seek out at least
225
Mallary Jean Tenore July 25, 2012 http://www.poynter.org/2012/6-ways-journalists-can-use-press-
releases-effectively/181207/

something of the ‘dirt’, ‘the truth’ or ‘reality’ of hidden ‘inner’ processes, the
underpinning stories of the successes, obstacles and failures of the enterprise or
bureaucratic effort, and leaves the building with the sanitised, neat, progressive story,
enshrined in the ‘grey literature’ that was handed over alongside the promotional or
motivational speech. Both communications, verbal and written, inevitably obey the
canon of the happy ending or the Whig history, sprinkled with ample hints of
professional poise, control and demeanour.

“Parts of the world are caught in our ethnographies, our histories and our
statistics. But other parts are not, or if they are then this is because they have
been distorted into clarity. This is the problem I try to tackle. If much of the
world is vague, diffuse or unspecific, slippery, emotional, ephemeral, elusive
or indistinct, changes like a kaleidoscope, or doesn’t really have much of a
pattern at all, then where does this leave social science? … Perhaps we will
need to rethink how far whatever it is that we know travels and whether it
still makes sense in other locations, and if so how. This would be knowing
as situated inquiry. Almost certainly we will need to think hard about our
relations with whatever it is we know, and ask how far the process of
knowing it also brings it into being. And as a theme that runs through
everything, we should certainly be asking ourselves whether ‘knowing’ is the
metaphor that we need. Whether, or when. Perhaps the academy needs to
think of other metaphors for its activities – or imagine other activities …
Many now think that ethnography needs to work differently if it is to
understand a networked or fluid world. The sense that knowledge is
contexted and limited has become widespread.” (Law, 2004, p.2-3)226

Steve Fuller is another scholar who alludes forthrightly to this arrangement. He sees
it in the culture of science reporting, in “the control that scientists continue to exert
over how their history is told. Past diversions and failures remain largely hidden,
resulting in an airbrushed picture of "progress" otherwise absent from human
affairs.” Just how many of the narratives given to researchers are cosmetic,
manufactured and manicured, arriving in the absence of any other data or accounts, to
be written up as verbatim and factual? To what degree they in fact deviate from
the objective version of affairs is anyone’s guess. Reflexivity is not noted much in
business and management research, but reports of fudging and manipulation
of data and statistics - fabrication, falsification, “cooking” of data, etc. (Fanelli,
2009) - in both the social and the harder sciences have recently undermined confidence

226
Law, J. (2004) After Method: Mess in Social Science Research (London: Routledge).

in any claims of precision, disinterest, or objectivity in science.227 In technology
development, given the importance of drawing funding and support – which is in every
way just as precarious as, if not more so than, funding for scientific endeavour – this is
understandable. Again, it is of course difficult if not impossible to ascertain, and it is
what gives science a bad name. Much of the time such studies, which strive to be as
cutting-edge, contemporary, fresh, vibrant and new as the technology itself, do not
benefit from the hindsight and the critical, sceptical or disinterested perspective of
historiography, nor from the luxury of considering the influence of competing social
groups. I have also had to sign non-disclosure agreements in order to get access to
technology firms. This censorship has had a great impact on the ability to compete in
an academic environment whose value resides in publications. Speaking in the
theoretical, general and abstract is not welcome when what people and editors want
are objective, factual accounts of what is going on. Where one is bound by these
agreements, the latitude for publishing detail is restricted, forcing the researcher to
revert to generalisations and ‘theory-building’. It is largely left to the reader, critic
and peers to decide or arbitrate degrees of relevance, authenticity and
representativeness.

This is a serious question which will not be fully addressed or answered in any
satisfactory way here, although it opens a line of inquiry which is of great importance
to research practice, as it persists in haunting ‘in the here and now’, or what I have
previously referred to as ‘what’s going on’, organisational and innovation research. In
a sense it bolsters the advocacy of richer, contextual forms of research, as their detail
and scope, their internal integrity, would be much more difficult to fabricate from
pure ingenuity and imagination.

It will however surface implicitly and repeatedly in the discussion offered in terms of
how technology has come to be portrayed and represented within and to different
social groups. In the case study presented here one of the defining conditions of the
227
In 2009, Daniele Fanelli of the University of Edinburgh carried out a meta-analysis that pooled the
results of 21 surveys of researchers who were asked whether they or their colleagues had fabricated or
falsified research. Fanelli D (2009) How Many Scientists Fabricate and Falsify Research? A
Systematic Review and Meta-Analysis of Survey Data. PLoS ONE 4(5): e5738.
doi:10.1371/journal.pone.0005738 see also
https://www.theguardian.com/science/2012/sep/13/scientific-research-fraud-bad-practice

time suggested that digital communications could roll out to the public in a number
of quite different directions and scenarios. The first issue was which kind of
communications infrastructure would bring the ‘information superhighway’ to the
home. Many of these were viable, and none was as far-fetched as some of the
contraptions of those wonderful men in their flying machines in the early part of the
century. A much more diffuse and mobile internet was hardly envisaged; rather the
focus and concentration was upon the home, viewed as the centre of entertainment
practice and of personal [and increasingly work-related – i.e. through home working]
communications.

An industry report of the time (1996)228 detailed some 135 different trials of
interactive television which were then taking place. They used vastly differing
technologies of varying sophistication to achieve the goal of making television an
interactive experience. They varied in terms of technical infrastructure, the interface,
and the services offered, but it is hardly presumptuous to say that the
motivations which drove these trials included the promise of accessing the mass
market potential of television as what I have already described as a ‘Trojan horse’.
Interactive television can take the idea of lifestyles, products and services into the home
and, by making the television interactive, can permit users to order and buy what they see. Each and
every one of those trials held firmly in its sights the golden goose of a more global
roll-out of its particular solution. If they could crack the ‘killer application’ – the
approach which would be adopted by the large media and communications
corporations – then they would win big.

“Everything is constructed in a specific historical context and there can be no escape
from history,” John Law (2004, p.5) tells us. Suffice it to say here, with respect to
writing and weaving stories of science and technology, that there is the broaching of the
gap between pasts, presents and futures. I use the plurals intentionally here in the
vein of Steve Fuller (cf. Fuller 2006) when he puts forward that STS rests on three
main disciplines: history, philosophy, and the sociology of science. “History

228
“Interactive Services-realistic Expectations, An Analysis of Video on Demand, Video Dialtone,
The Internet, On-Line Services and their Applications” Dittberner Associates, Inc. Bethesda, MD.
1996

supplies the raw material that is initially understood in terms of sociological
categories, which philosophers then ‘justify’ in the relatively limited sense of
offering a recurrent rationale that the historical agents could accept as their own.”
(Fuller, 2011; p.1)229

“Social Science writing is sociohistorically constructed. The writing uses
rhetorical and literary devices to create and sustain values and to convey
cognitive content. To ignore the ubiquitousness and power of these practices
is perverse and short sighted – not to mention unscientific.” (Richardson,
1997: 46)

Elsewhere Fuller notes the play on time and the historical record in Latour’s (1999)
explanation of Louis Pasteur.

“the very idea that social construction might extend backward as well as
forward – that we might ‘change the past’ -- easily offends epistemological
sensibilities, as Bruno Latour learned when he took symmetrical changes in
time to be a consequence of the claim that microbes did not exist before
Pasteur. In effect, Latour wanted to argue that over time Pasteur not only
cleared the way for today’s understanding of the nature of disease, but also
successfully backdated the historical record, as Winston Smith was employed
to do in 1984’s Ministry of Truth, to make it so that microbes have always
existed (Latour 1999: chap. 5).” 230

Rosalind Williams has also alluded to this when she addresses the fallacy of presentism
in analysing the past.

Every historian is warned against the fallacy of “presentism,” the
inappropriate projection of present concerns and assumptions onto earlier
times. Every historian also knows that presentism is inevitable. The only
platform we have from which to view the past is the present: there is no
perspective that lies outside of time. In this sense, the whole enterprise of the
history of technology has been one grand exercise in presentism. It has raised
to a level of consciousness and dignity the creation and maintenance of the
built world, and the production of useful systems and objects, from the
perspective of a century in which these activities have become dominant as
never before in history. (Williams, 2000)
229
Fuller, S. (2011). Why does history matter to the Science Studies disciplines? A case for giving the
past back its future. Journal of the Philosophy of History, 5(3), pp. 562-585.
http://wrap.warwick.ac.uk/46529/1/WRAP_Fuller_JOURNAL%20FOR%20THE%20PHILOSOPHY
%20OF%20HISTORY%20-%20revised.pdf
230
Fuller, ibid, p.2

One of the defining differences for me, in terms of the distinction between advance in
science and advance in technology, is that the latter is consciously teleological in nature. That is,
most, if not all, technology is built with future uses and usages in mind. It is less
‘discovery’ and much more ‘design and planning’. Analogously, it is more a
package holiday, with guided excursions, even pre-defined meal times, than a
wander through a foreign town without a map or guidebook, where one may or may
not speak the local language. As in scientific endeavour, at the onset of the project
there are no absolutes; rather, there is a continuum between the two extremes.
Carrying the analogy, much depends upon the ‘foreignness’ of the place being
visited, and the inclinations, dispositions, motivations and inspirations of the
traveller. It depends much upon their previous experiences and what they learned or
gained from them. Surely Edison was driven on, and guided, as was Steve Jobs, by
previous successes and failures, and these conditioned their aversion to risk, amongst
other things.

The domestic and the personal

Firms also use knowledge of actual and potential markets. This has mainly to do with
socio-cognitive issues of who buys what, why, where, when and for what
purpose. The focus may be upon one’s own existing customer base, or that of a
competitor or partner, or an entirely new target market or segment [such as
computer interfaces for various assistive technologies and handicaps, or for different
age groups, or even cultures and parts of the world, including the developing world].
Again, this can inform or provide hints and clues about what to add, develop or adapt in
existing products (incremental innovation), or provide hints and clues with respect to
new markets for existing products and services, or even help to seek out potential markets
for new products and services. This can also include making things cheaper or more
disposable, rather than adding or embellishing through new looks or new functions and
features.

Finally, there is also knowledge which can be generated to do with organisation.
Work, jobs and roles may jostle in order to make alterations in and around existing or
entirely new business strategies and processes (which loosely came under the rubric
of business process re-engineering at one time). This can include reorganising
routines and work to cater for the incorporation of new materials, plant, components
or machines (Orlikowski). This kind of research can be based within the firm but is often
the subject of external consultancy exercises and/or academic research, and the aim
is to optimise productivity, efficiency and effectiveness to cater for new
conditions, a new regulatory regime, socio-economic changes or new technology. This has
a legacy which goes back to at least the institution of ‘scientific management’.

In short, these projects of knowledge generation within the firm produce what we may call
(I follow consumer researcher Morris Holbrook here) ‘managerially relevant’
information – i.e. knowledge for the purpose of making decisions regarding
improving the product or service, its subsequent marketing, publicity and advertising,
and its design, manufacturing or production. What is interesting to note here is that
in many ways this all resonates with a slightly wider, lengthy, ongoing debate in
science and in science management regarding ‘basic’ as opposed to ‘applied’
science. This ethos can even be extrapolated to the debates regarding the
wider role of the 21st-century university and the ostensible promotion of STEM
subjects over the humanities and social sciences.231

If it is the case that STS can at least frame the discussion of the ‘impacts’ that a policy, or a
discovery, or a new innovation will have on the way we do things, or on how we
conduct our affairs in everyday life, then it seems important that this is offered to
practitioners: those who will actually make discoveries, apply for research grants and
develop new technology, and those who will consume, use and live with the
outcomes. Perhaps it should not be pure acts of social observation, itself an act of

231
John Maeda: “we have taken a leadership role in the movement to turn STEM (Science, Technology,
Engineering, and Math) into STEAM in the United States by adding the “Arts,” broadly defined.
STEAM advocates for the federal government to integrate art and design with its growing emphasis
on STEM education and research. By doing so, we will develop the creativity needed to drive our
innovation economy forward and keep America competitive throughout this century” (p.7). The Art of
Critical Making (2013), edited by Rosanne Somerson and Mara L. Hermano, foreword by John
Maeda, John Wiley & Sons, Inc., Hoboken, New Jersey, pp. 5-7.

‘basic research’, but much more a participative discussion and dialogue which would
have science and technology researchers working very closely, pedagogically, with
engineers, scientists and managers in training. In fact it is my contention that this
should extend all the way through education, starting at primary school level, where
the process of reification of the outcomes and processes of science and technology
begins. If we want to groom ‘citizen scientists’, this should begin at primary school.
Typically, as it is taught right now in many UK schools, science
and technology education focuses upon lessons which “allow children and teenagers
to be creative while also developing new skills and grasping an understanding of how
things work.”232 The focus is upon function, the ‘how’, rather than upon purpose or reason,
the ‘why’. The focus is upon the workings and components and how they fit together
and appear ‘miraculous’, rather than offering any explanation of what led to the need
for the device, or of the knowledge or socio-economic processes which built it. This
proposition constitutes a much larger and wider issue than can be covered here
and will be taken up elsewhere; however, suffice it to say that many of the rigidities in
thinking about science and technology [and no doubt other subject domains] conceivably
begin with current schooling, pedagogy and curricula – what is presented as science
and technology, and how it is presented.

This is certainly not a recent, nor a unique, view. Kerr (1966) saw these new courses
as teaching science as a process of inquiry and having the potential for furthering a
feature of the UK’s Association for Science Education (ASE)’s 1961 policy
statement. It held then that: “science should be recognised – and taught – as a major
human activity which explores the realm of human experience”. I consider this an
enlightened view, yet is this indicative of what is delivered at school or even
university level? Rather, the presentation of science and the work of scientists, and of
technology and the work of managers, designers and engineers, has been to inculcate the expectation that
they produce ‘reliable’, ‘irrefutable’ knowledge and create wholesome devices which
represent the apotheosis of technical advance, that there are no alternatives, and that
the scientist, engineer or company communicates its discoveries to society and
gets its products and services into the high street shops or other outlets and

232
http://www.cest.org.uk/importance-of-technology-in-schools/

destinations. Simply because they are there, in the headline media, on billboards, in
magazines or in high street stores, they are legitimised, solicited and taken as rightful.

As such it seems fitting that in academic studies of management and organisation,
and in wider social studies relevant to engineering and design, there should be a
distinctive place for STS alongside the received and accepted histories of the subjects
and their wider societal, cultural, economic and socio-cognitive influences, as well as
their core curricula (i.e. the more ‘skills-based’ subjects and modules - marketing
focus, strategic management, cash-flow analysis, CAD design, mechanical
principles, material science, the use of diagnostic equipment, statistics and so forth).

“Engineering schools all want faculty members who do more than just build
things, who display theoretical insight into design problems. However, these
schools also understand the need to connect academic engineering with the
actual practice of building things. The resulting compromises lead to
grumbling from both sides of the divide. Engineers from the science-based
tradition worry that practice-oriented colleagues lack scholarly depth. Hands-
on engineers complain that their supposedly scholarly colleagues use fancy
equations to dress up unexciting content, and lack any feel for inventive
design.” 233

Disciplines by their natures, as Thomas Kuhn suggests, are discourses, speaking of
particular ways of seeing and doing science at particular times; they seek out those rules,
tenets, and ways of thinking and operating which allow ‘normal science’ to be
maintained and conducted. “The limits of my language mean the limits of my
world.” Ludwig Wittgenstein, Tractatus (1921)

As Steven Pinker has pointed out, an inescapable fact about language is that it changes
over time, adapted by millions of writers and speakers to their needs (Pinker,
2015)234. As an interdisciplinary area of study, STS has revealed itself as
extraordinarily diverse and innovative in its approaches, and “its findings and debates
have repercussions for almost every understanding of the modern world.”
233
Williams, R. (2000) “All That is Solid Melts into Air”: Historians of Technology in the Information
Revolution Technology and Culture, Vol 41, Number 4 (October 2000): 641-68.
234
Pinker, S. (2015) The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century
Harmondsworth: Penguin

(Sismondo, 2010, p.vii)235 As a relatively new and still emerging, regenerative
discipline, it continues to innovate its approaches and epistemologies reflexively
within itself. Part of this innovative strand must surely be the ability to
interface with both the public and practitioners – engineers, scientists, designers, and
their managers and investors. This is perhaps the real benefit of working in the field
of the social studies of technology: we are not dealing with something which is
purely ephemeral, like an idea, or with something which is difficult to empirically see
or witness, like ‘the social’; we are dealing with the bridge between the socio-
cognitive and the material, between software and hardware.

Wynne et al. (2007) note that, certainly within European discourse, science and
innovation are often seen by the public at least (and certainly by school children)
as inevitable. Policies tend to focus on speeding up innovation and removing the
barriers to its progress. Little thought is given to the overall direction or orientation
of innovation, apart from the assumption that this is the preserve of a caring or entrepreneurially
biased government.236 Government is supposed to be proactive at the macro level,
moving towards being more responsive and empathetic at the meso and micro levels
of society. Etzkowitz and Leydesdorff (2000: p.111) use the notion of the triple helix
of the nation state, academia and industry to explain innovation, the development of
new technology and knowledge transfer.237

One imperative in the theorisation of science and innovation has been to go beyond
simplistic linear models of innovation and diffusion to understand just what takes
place in particularised situations and circumstances. The linear model views
knowledge and know-how gleaned through research as moving
uninterrupted from basic science, typically developed and conducted in universities,
to evermore ‘further and advanced stages’ of development - design and engineering
235
Sismondo, S. (2010) An Introduction to Science and Technology Studies, 2nd edn.
236
Wynne, B. et al. (2007) Taking European Knowledge Society Seriously: Report of the Expert Group on
Science and Governance to the Science, Economy and Society Directorate, Directorate-General for
Research, European Commission, at
www.proeco.si/upload/Taking%20european%20knowledge%20society%20seriously.pdf (accessed 16 Dec 2008).
237
Etzkowitz, H. and Leydesdorff, L. (2000) The dynamics of innovation: from National Systems
and ‘Mode 2’ to a Triple Helix of university–industry–government relations, Research Policy, vol.
29, pp. 109–123.

to manufacturing to marketing to sales and into factories, workshops, and homes. In
design research ‘linear rationality’ was critiqued by Hickling (1982) who asked if we
could go ‘beyond a linear iterative process’ in design. Hickling considered the
similarities of linear models over various fields including management, science,
medicine, planning and design. In his real-case examples he showed the complex
iterative and cyclic processes which are involved, and which deny neat tick-boxing when
it comes to how individual developments progressed.

Linear models, whilst retaining some rhetorical value in policy and funding circles,
have been largely disputed and discredited as one moves downstream to the
individual case, making related discourses which speak of the determinism of science
and technology in human affairs messy, confused and complex. The idea is cast in
the well-known inscription from an art deco statue at the 1933 Chicago Century of
Progress exposition – “Science Finds – Industry Applies – Man Conforms”. We are
perhaps more generally enlightened now: the idea of a ‘technical
fix’ for social problems such as poverty alleviation, or even for problems of
education and entertainment, is often challenged by competing social and economic
interests or lack of knowledge.
“Science is driven by uncertainty. It is an unending quest to explore and
explain what was previously unknown. But science is not very good at
sharing its uncertainties. Scientists justifiably regard uncertainty as a
technical matter, to be rationalised and tackled. Scientists are wary of
washing their dirty laundry in public, so the public image of science is often
far more certain. From the outside, science is seen to deal in facts. The public
acknowledgement of uncertainty is the first step towards collective
experimentation. More and more institutions that use science now know that,
if they want to make good decisions and keep the public onside, they must
look for what they don’t know. Collective experimentation invites wider
discussion of bits of science that are often kept behind closed doors, which
makes it messy. Society becomes the laboratory.”

Both industrial research and university-based science practice are marked by
uncertainty of process and outcome, but what is defining is the degree of
toleration of uncertainty. It is far greater in the academy than in industry, where
corporate headquarters must commit large resources to projects with outcomes that
cannot be known in advance. There must be a return on investment (ROI), or at least
the prospect of some return.

Indeed the very existence of the linear model of innovation diffusion has been
disputed (again, this is another area in which David Edgerton has become embroiled).
However, it is clear that, at least polemically, the model has served as a justification for
initiating new funding of basic research. This justification is enshrined in a promise
that basic research would, in some sense, result in future technological advance. This
has been criticised as a self-serving rationalization designed to garner public (or, in the
case of industrial research labs, corporate) funding for work that had no necessary
economic or ‘managerially-relevant’ benefit.

Benoît Godin and Joseph Lane have noted that at the turn of the century there was
hardly any mention of research within the firm, and that indeed a split between basic
research and applied research took shape: “research is an academic’s category,
development is an industrial category” (Godin and Lane, 2011, p.5).238 Indeed the
first book on the industrial organisation of research was The Organisation of
Industrial Scientific Research by C.E. Kenneth Mees, manager and director of the
laboratories at Eastman Kodak from 1912 to 1955. The book was only published in
1920 and revised in 1950. The book served as an example in Steven Shapin’s
argument that science, including pure science, had firmly found a home for itself
outside of the academic world by the twentieth century. He further asserts that it is largely an
assumption that academic and industrial goals and values were in conflict. He bases
this observation on the rhetoric around this, with its claims of virtue and worries over
purity in academic research and the lack of it in industrial research. Shapin, to an
extent, follows Mees in his suggestion that ‘real science is industrial science’ – but
only if it incorporates the traditional strengths of “pure” science in collaborative
effort. The main scientific personae drawn out by Shapin are familiar: the academic
scientist, who focuses on fundamental research; the industrial scientist, more
oriented toward putting knowledge into use; and the more recent entrepreneurial
scientist, creating intellectual property for sale into a commercial development

238
Godin, B., Lane, J. and SUNY (2011) Research or development? A short history of research and
development as categories. Online paper.
https://sphhp.buffalo.edu/content/dam/sphhp/cat/kt4tt/pdf/research-or-development.pdf

process, including faculty-based start-up companies.

Ever since the system of universities developed in medieval Europe, the two main
roles of universities were widely recognized to be “To codify knowledge” and “To
teach the codified knowledge.” Universities over the years faithfully performed those
two roles. Gibbons et al. (1994) note that after the Second World War, a new form of
knowledge production began emerging that was context-driven, problem-focused and
interdisciplinary. Society, particularly industry and the government, came to expect
universities to play a new role by creating knowledge that gave birth to economic
value and transferring it to industry. Of course universities had long been involved in
knowledge production and arguably the modern research university owes as much to
the Berlin model as it does to the industrial revolution, and the products of industrial
activity becoming more knowledge intensive. Gibbons et al. distinguish between
‘Mode 1’ and ‘Mode 2’ in the production of scientific knowledge. ‘Mode 1’ research
can be considered as disciplinarily organized, while ‘Mode 2’ research is mainly
legitimated and organized with reference to contexts of application. Limoges (1996:
pp.14-15) wrote that: "We now speak of 'context-driven' research, meaning 'research
carried out in a context of application, arising from the very work of problem solving
and not governed by the paradigms of traditional disciplines of knowledge."239

This new role is frequently called “the commercialization of knowledge.” The latter
was endorsed in the United States by legislation [the Bayh-Dole Act, U.S. Law 96-
517, Patent and Trademark Act Amendments of 1980] which added a new role to
universities.240 In the UK this could be witnessed in the rise of industrial liaison wings
and commercialisation departments.

After its creation in 1922, the BBC formed a Research Department in April 1930. As
Edgerton detailed, Vannevar Bush’s published report to the President, Science: The
Endless Frontier (1945), is often cited as an important expression of the linear model

239
Limoges, Camille (1996). L’université à la croisée des chemins : une mission à affirmer, une
gestion à réformer. Quebec: Actes du colloque ACFAS.CSE.CST, Gouvernement du Québec
Ministère de l'Éducation.
240
Yasuda, Satoko (2011) Exploring a Conceptual Framework for Academic Entrepreneurship :
Beyond Pasteur’s Quadrant International review of business, March, 11: pp. 25-40

on account of its advocacy for federal funding for basic, university-based research on
the basis of its importance for further technological progress.

As with the ‘black boxing’ of ‘technology’, the problem of the linear model is one of
abbreviation; the use of such terms often masks or smooths the complexity of what is
an often rich and convoluted journey or phenomenon. The linear model, smoothed of
its burrs and contortions, leads to a specious view of an unproblematic process of
invention and innovation. It is a comfortable and convenient way of mitigating or
masking complexity and context. It is akin to the generalisations we make when we say that, as
persons, we are born, we go to school, we get a job, get married, have children, retire
and die. There are many other common rhetorical devices we are exposed to on a
daily basis that are received as a given; they could include ‘jobs’, ‘the market’,
‘quality of life’, ‘consumer and user needs’. Raymond Williams in Culture and
Society highlighted the reflexive or circular relationship in modern times between
key terms of social analysis (such as culture, society, industry, and art) and the social
changes they were supposed to analyse. Williams shows that the meaning of such
words was to some degree generated by those changes: the words acquire meaning
from the very social changes they are intended to analyse. In addition, key concepts
tended to cluster, so that their meanings shifted in response to alterations in the
others. Even ‘innovation’ and ‘creativity’ start to veer towards joining this list. It is
only by exploring cases in depth that we can learn the salient issues which
characterise these terms as well as something of the nuances and minutiae of the
conditions and exigencies which form the context of decision making. This can
furnish practising designers, managers and strategists with the insights and perspectives
which will guide them in their decision-making. Compared to the early part of the last
century there has been a significant movement from industry and government
indifference towards basic research to its practices being increasingly industrial in location,
corporate in organisation, and product- and profit-minded in motivation.

Technology and everyday life

Nothing in the world stands by itself. Every object is a link in an endless chain and is
thus connected with all the other links. Technology is an inevitable part of the
environment in which we live. It serves as a very real background (in terms of all
others in the social system using technology to manufacture and make things,
content, develop ideas, create buildings, control traffic, provide e-commerce etc. –
i.e. activity which we are not directly involved in), and sometimes foreground (in our
direct use of media devices and content which informs us regarding the world) upon
which we base our conceptions of the world. Just as socio-economics, cultural mores
and conventional wisdom define the status of countries and the behaviour and interests
of those living within them, so does technology. It can be
explicitly used or unconsciously present, as in the use of a hammer to bang in nails,
or the unseen use of power stations when we drill in screws. It operates and exists across different
scales and scopes in terms of use.

(e.g. Clifford and Marcus, 1986; Van Maanen, 1988; Denzin, 1997) about
representation and interpretation: Qualitative methods illuminate both the
ordinary within the world of fabulous people and events and also the fabulous
elements of ordinary mundane lives. How we represent the truths we generate
remains an open question. The interpretive turn in the social sciences,
education, and allied health fields inspired a wide variety of creative forms of
representation of qualitative findings, including narratives, poetry, personal
essays, performances and mixed genre/ multi media texts as alternatives to
the hegemony of traditional social scientific research reporting strategies that
pervaded the academy (e.g. Denzin, 1997). (Ellingson, 2009: p.1)241

With respect to technology and innovation, discussion of the ‘impact’ of technology
on every sphere of life, which suggests that ‘it’ somehow has a life, will and momentum
of its own, has eclipsed consideration of other determinations, i.e. the cost of
components, robustness of components, interoperability of components, socio-
cultural perceptions of the technology, the organisation of R&D, production, and the
input of publicity and marketing, to name but a few. All should be considered to exist
241
Denzin, Norman 1997 Interpretive Ethnography: Ethnographic practices for the 21st Century,
Thousand Oaks: Sage Publications
Van Maanen, John 1988 Tales of the Field: On writing ethnography, Chicago: University of Chicago
Press
Clifford, James and George Marcus 1986 Writing Culture: The poetics and politics of ethnography,
Berkeley: University of California Press
Ellingson, Laura 2009 Engaging Crystallization in Qualitative Research: An Introduction, Thousand
Oaks: Sage Publications

in a kind of product ecosystem which has loops, subprojects, and new commercial
and functional relationships with partners, clients and others such as regulators and
standards committees etc. And problems can and do occur at any level in the social
and technical project that is innovation as it develops and unfolds. These can range in
influence from minor hiccups to catastrophic instances (such as a change in a regulatory regime,
or a competitor gaining a licence for a much cheaper but key component).

The study here revisits the key questions arising in innovation by examining a major
innovative technology project in depth. The project demanded the birth and continual
reorganisation of a firm and its product, and the courageous and adventurous spirit
required when opening up the prospect of extending the way in which people use a
familiar, pervasive domestic device – the television.

As such it is a major story of organisation as much as perception building, of digital
entrepreneurship as much as of bringing a new emerging ethos of organisational
adaptation to work with motivations and drives to protect the firm’s commercial
interests and intellectual property. In particular it considers the responsiveness of
firms towards opportunity and adaptation of its structures and product. It also
considers the key notion of open source, and more porous and boundless means of
working in partnerships with other firms and organisations and developing
knowledge. This latter point is a key defining characteristic of the digital age, where
firms, not only in computing and the internet, are engaging in more open, mutual,
transparent relationships with each other and with end-consumer users. The
technology in many ways acts as an inspiration and metaphor for human organisation
[i.e. open-source software and open-source business relationships, and interactive
feedback loops, and system-logging making for the demand of more ‘on-demand’
and targeted service provision]. And it considers the development of the technology
itself, which was an amalgam of digital transport devices, and media content and
services all of which had to be developed and understood, not only from a technical
or service position, but most importantly from the position of those who would use and,
ultimately, pay for it.

The protagonists in this story, Acorn Online Media (AOM) and its partners, aimed to
make the common or garden television, and the television-related experience,
interactive, by making it digital and connecting it to digital networks and computing
technology. Television, regardless of its ubiquity and familiarity, nevertheless
represents a very sophisticated and complex technical subject, even within the very
broad and eclectic realms of science and technology studies. The hard technology,
the television set or receiver, straddles a myriad world of electromagnetic waves and
experimentation; wireless communication and technical development; and radio
broadcasting and commercialization; mechanics, electronics and broadcast equipment;
performance, narrative, story, facts and fictions; performance arts, graphic design,
products, props, people and deeds. The consumers, users or viewers of television are
those who switch on, tune into what is on offer and select from that which
is given what they want or choose to watch. As such it draws to its study a diffuse
multidisciplinary approach drawing upon the social sciences, media, cultural and
communication studies, consumer research, and human-computer interaction studies.
Each plays its role in what was essentially an exercise in sense-making, not only for
the company under investigation but also for the research process itself.

An ever-widening scope in design and innovation necessitates a more concise
definition of the concept of knowledge, which relates not only to its material, but
also to its experiential, cognitive, social and ethical dimensions. The accounts offered
here in the thematic discussion will move seamlessly across the social and technical,
the material and concrete and the experiential, the cognitive and emotive in an effort
to lay down the foundations of the phenomena. Television viewing, or come to that
the reading of text, as an activity, is something that many if not most modern subjects
engage in on a routine daily basis. As such their place is secured as a focus of
analysis and scholarly attention regardless of whether it migrates from the box in the middle
of the living room to the mobile device or phone, to be watched on the move or in a
tent pitched on a forlorn windswept mountainside at 2 a.m. I am reminded here of
G.K. Chesterton’s The Club of Queer Trades, and in particular ‘The Tremendous
Adventures of Major Brown’.

"We believe that we are doing a noble work," said Northover warmly. "It has
continually struck us that there is no element in modern life that is more
lamentable than the fact that the modern man has to seek all artistic existence
in a sedentary state. If he wishes to float into fairyland, he reads a book; if he
wishes to dash into the thick of battle, he reads a book; if he wishes to soar
into heaven, he reads a book; if he wishes to slide down the banisters, he
reads a book. We give him these visions, but we give him exercise at the
same time, the necessity of leaping from wall to wall, of fighting strange
gentlemen, of running down long streets from pursuers - all healthy and
pleasant exercises. We give him a glimpse of that great morning world of
Robin Hood or the Knights Errant, when one great game was played under
the splendid sky. We give him back his childhood, that godlike time when we
can act stories, be our own heroes, and at the same instant dance and dream."

The physical and geographical context may change but the essential activity remains
the same – it is television ‘viewing’. However, there are other possible functionalities
and how much they will combine, converge, or contrast with television viewing is
most definitely worth noting in studies of interactive television.

What is additionally relevant about this study is that it was conducted at a time when
digitisation of communications and media had at least two major avenues of
development open. In the early 1990s the world experienced what could be described
as the digital turn, which was marked by the increasing diffusion of computing into the
public arena. The decade began with virtual reality and culminated in 1993 with the
less grandiose technology of the graphical World Wide Web, which was being populated at
an exponential rate at this time by ever more servers, internet service providers
(ISPs), commercial users and public users. This was a time when boundaries were
only just being crossed, with previous distinctions and barriers between enterprises
being surpassed. The dismantling of the Berlin Wall served as a strong metaphor
for the postmodern turn in criticism, the end of dualisms, the supplanting of reality
with its virtual correlate, and we can see the rise of ethnographic methods taking
down the walls of the usability laboratory in human computer interaction studies, and
the walls of private western homes in media studies. Ethnography was beginning to
lift the lid on the home and the domestic sphere as a subject of interest in sociology

and anthropology, and other subjects including, again, human computer interaction
studies, media and consumer research. After all, with increased connectivity, the
private was suffusing ever more with the public, not only as a matter of social
research but also in terms of homes connecting to a new kind of network which
provided a backchannel. The telephone had already provided this, as did sewerage,
but the underlying ethos of other utilities and networks was aimed at provision of
water or electricity to consume and purchase.

The modern notion of a public-private relation reached full emergence only in the
seventeenth and eighteenth centuries. Michael McKeon treats this relation as a
crucial instance of the modern division of knowledge. He places the idea of the home
and domesticity within a range of contexts at varying social levels of existence. At
the most "public" extreme are political developments like the formation of civil
society over against the state, the rise of contractual thinking, and the devolution of
absolutism from monarch to individual subject. The middle range of experience takes
in the influence of Protestant and scientific thought, the printed publication of the
private, the conceptualization of virtual publics - society, public opinion, the market -
and the capitalization of production, the decline of the domestic economy, and the
increase in the sexual division of labour. The most "private" pole of experience
involves the privatization of marriage, the family, and the household, and the
complex entanglement of femininity, interiority, subjectivity, and sexuality.

However, as the private and the public entered the period of
industrialisation, and with the rise of mass media, mass manufacturing and mass
education, fears and social panics emerged. Some of these concerned
technology. This fear continues today, a direct extrapolation of the kind of fears
which had been manifest at least since the advent of the ‘Penny Dreadfuls’ in
Victorian popular literature. It was held that those reading such material would be
predisposed to recreating its macabre violence. There was, for instance, the idea that multimedia
and its extensions, virtual reality and gaming, could have a negative and corrupting
effect on the behaviour of youth. The internet and its early contents were, regardless
of the increasingly heterogeneous demographic using it, still considered a
predominantly male pursuit, an internet whose appeal was to those with specialist
technical interests and prowess, including hard-core pornography.

Also challenging the transmission of video was the limited technical capacity of the Internet
and modems to convey messages and media at this time. ‘Dial-up’
connections over the ‘twisted pair’ (traditional wire-based telephone cables) and
application software were simply not capable of providing video services in any
satisfactory manner, certainly not such as would be understood and expected by an
analogue broadcast television audience. Instead, it was viewed that cable television
networks coupled to a reconfigured PC, now called ‘a set top box’ (STB), would
provide the prospect of new services and new propensities for interacting with the
television set and its content programming. These would in effect be a producer-led
assortment of video-on-demand features – news programmes, movies and features,
online networked gaming, and online shopping.
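To make the scale of the dial-up shortfall concrete, the sketch below compares a mid-1990s dial-up modem against compressed digital video. The bitrates are illustrative assumptions of mine rather than figures from the trial, and the snippet is offered only as a rough back-of-envelope aid.

    # Rough comparison of mid-1990s dial-up capacity with compressed digital video.
    # All bitrates are illustrative assumptions, not figures from the trial itself.

    DIALUP_BPS = 28_800  # a V.34 modem, roughly 28.8 kbit/s downstream
    VIDEO_PROFILES_BPS = {
        "MPEG-1 (~VHS quality)": 1_500_000,        # ~1.5 Mbit/s
        "MPEG-2 (~broadcast quality)": 4_000_000,  # ~4 Mbit/s
    }

    for label, video_bps in VIDEO_PROFILES_BPS.items():
        shortfall = video_bps / DIALUP_BPS
        print(f"{label}: needs roughly {shortfall:.0f}x the capacity of a 28.8k dial-up line")

On these assumed figures even VHS-quality video would need around fifty times the throughput of a dial-up line, which is one reason attention turned instead to cable networks and set top boxes.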

Novelty emerges not out of random artistic accidents, or recombination of existing
logics or histories, but rather from the dynamic cross-wiring of various material and
functional intelligences, and so it was with interactive television. It clearly had
legacies functionally in existing products i.e. broadcast television, VCRs, and games
consoles, and operationally in terms of multimedia computing, computer networking
and associated technologies, but it was the merging of these technologies and
services which would fuel new and innovative solutions. The idea that innovation,
whether scientific, technological, or architectural, is a by-product of artistic chance
or a result of singular genius can no longer be sustained in the 21st century. And so it
is here: rather as broadcast television emerged from competing systems, most notably
in the BBC’s choice between the Marconi-EMI system and the Baird mechanical system,
interactive television would emerge from a variety of different and sometimes
competing technical platforms and methods.

It is entirely logical for a computer firm, which during the past twenty years had been
sampling from theoretical disciplines and having to ensure compelling uses and
media for its machines, to break out and build up productive, emergent networks

with groups from other industries and the media. The increasing power of electronic
circuitry in workstations, personal computers, and consumer electronics, in
conjunction with the decreasing cost of high-bandwidth and low-latency
communication, was creating a large momentum to develop sophisticated multimedia
applications as well as to provide new types of services to businesses and homes.

The technical challenge

The potential for the convergence of multiple services (e.g., TV, movies, and
telephone) was, as mentioned earlier, the ‘golden goose’ for those who were already
interested in exploring this technology. Almost every day as digital technology and
networking and its uptake expanded, newspaper headlines announced new field trials
and potential mergers of corporations. The focus in multimedia shifted to the creation
of large-scale video servers and delivery infrastructure that would be capable of
delivering thousands of simultaneous high-quality video streams to homes and
businesses. However, the economics of the marketplace at the time ran counter to
these high expectations. For example, in movie-on-demand applications, the cost of
storage of a large video library and the cost of delivery bandwidth for a two-hour
video per customer were found to be prohibitively expensive in comparison with
traditional movie rentals from high street outlets. In addition, the infrastructure for
delivering high-quality video was mostly unavailable, particularly to the home. This
issue is referred to as the last-mile bandwidth problem. Compounding this problem,
most available multimedia content consisted of movies. Video-on-demand
applications lacked depth of content (complexity of presentation) and provided
only limited interactivity (e.g., VCR control operations) as compared to Web-based
applications. New forms of content had to be created to provide marketable services
to users (e.g., distance learning, travel, advertising, electronic commerce),
and sophisticated tools had to be available for the easy creation of such
multimedia content.
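A similarly rough sketch, again using assumed figures of my own rather than numbers from the project, indicates why a large video library and its delivery bandwidth were so costly at the time:

    # Illustrative estimate of server storage and delivery bandwidth for video-on-demand.
    # The stream bitrate, library size and viewer count are assumptions for the sketch.

    MPEG2_BPS = 4_000_000        # ~4 Mbit/s per broadcast-quality stream
    MOVIE_SECONDS = 2 * 60 * 60  # a two-hour feature film

    movie_bytes = MPEG2_BPS / 8 * MOVIE_SECONDS
    print(f"One two-hour movie: about {movie_bytes / 1e9:.1f} GB of server storage")

    library_titles = 1_000
    print(f"A 1,000-title library: about {library_titles * movie_bytes / 1e12:.1f} TB")

    concurrent_viewers = 5_000
    aggregate_bps = concurrent_viewers * MPEG2_BPS
    print(f"5,000 simultaneous streams: about {aggregate_bps / 1e9:.0f} Gbit/s of delivery capacity")

Set against the modest price of a high-street video rental, storage and delivery on this scale made the early economics look unattractive.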

It is additionally relevant that this particular firm had achieved its prominence and
success through an existing partnership with the BBC, through the BBC Micro,
which had enjoyed a wide diffusion into British schools. Microsoft, Apple and Intel
had long maintained complementary and eclectic interests in technologies
and applications, including interactive television and smart homes. And
domesticating computing was very close to all of their business ambitions.

As Lewis Carroll is often quoted: “If you don't know where you are going, any road will take
you there.” Due to the complexity of the social, commercial and technical needs of
the proposed system, a trial of the system and its functionality and commercial
prowess was deemed an important test by prospective financiers. Moreover this test
would permit a twofold learning experience for the participants providing the further system,
linking technology to services. It would give them access to user feedback with
respect to their use and perception of the system, and it would allow them to prepare
their own products and develop expertise in using the system themselves, in respect
of their own business aims and objectives. It would, in effect, create a virtual and
technical plateau where multiple expertises, techniques, and materialities could interact
and evolve with unexpected consequences, providing a platform for joint problem-
solving and joint decision-making rather than consortium members simply taking
their assignments away to work on and bringing them back to be re-integrated.

The trial was a prime example of opportunity for synergies between companies
coming from different sectors to focus on an embedded business setting. It thereby
encouraged knowledge-intensive interorganizational cooperation at an unprecedented
level. Through the ambient exchange of technologies and tools, critical contexts for
technology-focused innovation, members of new collectives may find themselves
suddenly more agile and resilient, not only through their expanded network, but also
through their own learning and transformation in the process – i.e. not only learning
to develop for the new platform [and all that entails, i.e. hiring new expertise, re-
organising departments, buying new equipment and so forth] but also learning how to work
together in a commercial world which comprises many competing interests,
functional and managerial specialisations, different concepts of clients and
customers, differing conceptions of the market and customer base, and so forth.

In essence their innovation would do nothing short of radicalising the use and
production of certain aspects of the television experience, how it would be produced
and delivered, and how it would be used by end-users – people in homes.
Commercial television’s primary goal was selling potential audiences to advertisers,
not necessarily selling products to consumers. The new networked and interactive
capacities would affect television-related industries, most prominently advertising,
but also retail and banking. Networked television would even radicalise the provision
and supply of products and services – areas now commonly associated with the
World Wide Web and Internet ecommerce, and online banking.

The firm in the study was originally spurred into action by a relatively serendipitous
request by agents of a large media conglomerate who were searching for advanced
technical solutions for the delivery of their services and media content. Conferences
and trade shows are networking events as much as platforms to showcase new
devices and services, and so it was here. It was already clear to the interested party
that media and its related industries were about to enter a period of intense change.
In short, they approached a stall at an industry exhibition and liked what they saw,
although they misread or misinterpreted just what this was [Acorn were actually
showing presentation software for their RISC PC, not anything like interactive
television]. A few key members of staff wished to exploit this proposition as a
technical opportunity, and did so not only as providers of a component part [i.e. the
STB] but as system integrators, providing the social glue, organisational structure
and prompting relations within the system. They took on a leadership role with a
view to getting other companies involved.

It was in fact a technical system opportunity, but at the same time, a media and
commercial opportunity. This took the original computer company from their
comfort zone dealing with other computer companies into the relatively unknown
realm of content and service provision, an area that they were far less familiar with.
What resulted from the attempt to exploit it was a diverse range of influences which
came to shape the directions and orientations of the product (the STB) and the
system (the other devices contributing to the technical system, the interface, and the
services and content material); these are the subjects of the present study. Depicted in
the study is a story that is not simply a tale of the design of an object made from
silicon, plastic and metal. Nor is it just the story of the corporate politics and funding
that allowed the project to proceed. It is also the story of sales forces and distribution
systems, of marketing strategies and product evangelists, of a confluence of social,
economic and technological circumstances that would allow it to thrive. But along
the way it is clear that the effect of user needs was completely mediated by the vast
potential of sales volume and expected sales growth based upon the ubiquity and
pervasiveness of television as a domestic product and the sheer breadth of people
using it. In this sense, it was about a new way of aggregating audiences and
customers, as the digital realm represents a new route to mass markets.

A focus upon users

Much is made in the HCI literature regarding a ‘focus upon users’, and in the
marketing literature regarding a ‘focus upon customers’. In fact it has also become
something of an industry mantra although human-computer interaction studies, and
its various offshoots, have been consolidated as a functional component of
interactive systems development. But just how far a firm can go beyond say,
allowing users to change their wallpaper on their desktops etc., is a matter of some
debate. A product which is too open, too malleable, loses its ‘shape’ and becomes
simply a commodity; an example here is the difference between oatmeal and an
oatmeal muffin: the raw material or ingredient, which is open [i.e. it can be made into
porridge or a biscuit] and has a large potential market, versus a finished good which
has a reduced market [i.e. those who want to eat a muffin with their coffee for
breakfast]. Personalisation will only make sense if the product has sufficient generic
qualities that enable it to generate economies of scale relevant to mass manufacturing
[i.e. the industrial production and marketing of muffins]. Also, a product or service
which is too malleable will be difficult to protect in terms of intellectual property.
The answer tends to be standard platforms with dedicated applications. 242
Economists have identified and
analysed a set of key technologies known as general-purpose technologies (GPTs)
242
Jagdish N. Sheth, Atul Parvatiyar (2001) "The antecedents and consequences of integrated global
marketing", International Marketing Review, Vol. 18, Issue 1, pp. 16-29

(David, 1990; Bresnahan and Trajtenberg, 1995; Helpman, 1998; Freeman, 2001).243
Sustained economic growth is strongly correlated with the economy-wide diffusion
of GPTs such as the steam or internal combustion engines or the computer. Generic
technologies, such as communication, transportation and energy technologies,
provide the ‘appropriate conditions’ required for the smooth functioning of industrial
production spread across different firms and industries. Marx (1976 [1867])
considered these technologies essential for the success of the industrial revolution in
England. As such, personalisation is only the icing on the cake: personalising ringtones on a
mobile phone does not entail a reworking of the product from the chipset up. The
emphasis will always be on ‘standard’.

While Schumpeter argued that entrepreneurs are driven by technological opportunity,
early studies indicated that increases in demand preceded increases in inventive
activity over the business cycle (Schmookler, 1962). Cohen’s (1995) definition of
technological opportunity is the degree of technical advance achieved, normalized by
the cost of achieving it. The three dimensions are technological significance,
technological performance and technical feasibility. “Technological significance”
measures how significant a contribution to technology the invention is expected to
be. Shane (2001) has also put forward three characteristics: importance, radicalness
and patent scope. “Technological performance” measures the degree to which an
invention works better than alternatives or fulfils a function not previously provided.
This concept involves an analysis of the degree to which a specified technical idea
solves a particular functional problem (Goldenberg et al. 2001). “Technical
feasibility” measures the likelihood that an invention is technically sound and
complete. An invention that is not technically sound is not likely to be of commercial
quality. Many proposed inventions may not be technically sound (at the time) or may
243
David, P.A. 1990. The Dynamo and the Computer: An Historical Perspective on the
Modern Productivity Paradox. AEA Papers and Proceedings 80 (2): 335-361.
Bresnahan, T., and M. Trajtenberg. 1995. General Purpose Technologies: ‘Engines of
Growth’. Journal of Econometrics 65: 83-108.
Helpman, E., ed. 1998. General Purpose Technologies and Economic Growth. Cambridge, MA:
MIT Press.
Freeman, L. 1979. Centrality in Social Networks: Conceptual Clarification. Social
Networks 1: 215-239.

lack some fundamental scientific knowledge, which makes them impossible to
commercialize. Finally, “Technological uncertainty” measures the likelihood that
planned future R&D will resolve current outstanding technical issues. The
importance of this concept for commercialization is undisputed (Nelson, 1959).
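Read literally, Cohen’s definition of technological opportunity can be sketched as a simple ratio; the notation below is my own illustrative formalisation rather than an equation taken from Cohen’s text:

    \text{technological opportunity} \;\approx\; \frac{\Delta T}{C}

where \Delta T stands for the degree of technical advance achieved and C for the cost of achieving it. The dimensions discussed above can then be read as qualitative judgements about the size of \Delta T and the likelihood that it can actually be realised.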

According to Rogers (1983), market acceptance is influenced primarily by consumer
preferences such as “user needs.” Modes of production and consumption, of design
and use, ultimately must pertain to the connectivity, access, affordability, knowledge,
skills, taste, conservative and immediate needs and desires of users and consumers,
and most certainly at the times of apprehension and appropriation. Innovation in the
neoclassical economic model is an exogenous process – a black box, if you will, that
works its magic solely in response to price signals. In this sense, the neoclassical
model sees innovation as falling like “manna from heaven,” not something that can
be induced by proactive economic policies. But Gilpin (1975) writes: “Everything
that we know about technological innovation points to the fact that user or market
demand is the primary determinant of successful innovation. What is important is
what consumers or producers need or want rather than the availability of
technological options” (p. 65).244 Rogers (1983) claims that the five most important
perceptual attributes of innovations determining their market acceptance are:
“compatibility”, the degree to which an innovation is perceived as consistent with the
existing values, past experiences and needs of potential adopters; “complexity”, the
degree to which an innovation is perceived as difficult to understand and use;
“trialability”, the degree to which an innovation may be experimented with on a
limited basis; “observability”, the degree to which the benefits of using an innovation
are visible to others; and relative advantage “the degree to which an innovation is
perceived as being better than the idea it supersedes” (Rogers, 1983).

Challenging the consensus on the primacy of needs in innovation processes and
incentives are Mowery and Rosenberg (1979), who point out that user “need” is
not an accurate measure of demand and that it is too vague an idea to work with in
economics: “to conclude that it is demand that drives innovation, market demand
244
Gilpin, R. (1975): Technology, Economic Growth and International Competitiveness, study
prepared for the Subcommitte on Economic Growth of the Congressional Joint Economic
Committee, Washington D.C.: U. S. Government Printing Office.

must clearly be distinguished from the potentially limitless set of human needs” (p.
140). The link between personal preferences, consumption, and the demand curve is
one of the most complex relations in economics. However, “markets neither exist a
priori to buyer-seller behaviour in any case, but they especially cannot exist prior to
the introduction of new, non-substitute products.”245 In fact, Nelson (1959) reports
that in the study of 710 inventors by Rossman (1931), for most inventors, inventing is
creative self-expression, the two most frequently mentioned motives being “love of
inventing” and “desire to improve”. Goldenberg et al. (2001) examine innovation
characteristics “as such” and their impact on future market success. They argue, and
find strong empirical support for the claim, that certain innovation-level technical attributes are
ex ante identifiable, generally observable and also predictive of future market
success. They find that when the technological opportunity is identified first (or
concurrently with market need) there is a higher probability that an innovation will
be financially successful.246

The more different a product is, the more it represents a departure from that which is
known and familiar, the more it lacks functional and aesthetic references and
analogues, and the less we would expect people to understand and make sense of it.
Such a proposition suggests that most needs are conservative. But technical advance
can have two effects, either lowering the cost of production or increasing product
quality. Inventions that are more likely to reach wide acceptance in the marketplace
have a greater potential market and are, thus, more likely to be commercialized; this is a
function of aggregate demand characteristics and it typically puts a brake on firms
experimenting in realms too far from their core competence and existing product
lines. But the promise of capturing monopolistic rents from an innovation provides
the incentives for firms to take risk (Cohen 1995; Teece, 1982). The products of
emergent networks are non-linear and non-predictable, which is a risk, but one with
exponential rewards. Linear processes will always create products which are
calculable and applicable in particular industries, but they will never exceed their

245
Competing through Innovation in a Dynamic Selection Environment by Maureen McKelvey,
Associate Professor, Department of Industrial Dynamics, Chalmers University of Technology
Electronic paper for the DRUID 2001 Conference, June, Aalborg, Denmark.
246
Goldenberg, J., D. R. Lehmann and D. Mazursky (2001): “The Idea Itself and the Circumstances
of Its Emergence as Predictors of New Product Success,” Management Science, 47, 69-84.

inputs. Non-linear, emergent processes create innovation and newness out of
proportion to their inputs, and often applications which lie outside the original
trajectory. In high-risk entrepreneurial endeavours where ‘radical’ innovations are
brought to or pushed into the market, there typically abounds a complex of strong
projections, stories, visions and contentions which envelope the product and accent
or downplay its features and benefits. Not only do these guide and hone publicity and
marketing claims, but this rhetoric guides the development of the product itself in
efforts that range from winning support for the project in-house from senior
management, to driving suppliers to upgrade and modify components, and even to
designing organisational structures and the physical layouts of offices and workshops.
Early-mover advantage offers much beyond the prospect of controlling
nascent markets; it entails the development of learning-curve advantages and
complementary skills and assets (Cohen, 1995).

But as in any designed, planned and manufactured endeavour, there are always
compromises, obstacles, hindrances, inadequacies and inappropriateness - noise in the
communication system linking production, consumption, design and use. Markets are
created by retail spaces populated by products with features and characteristics and
people who play out roles; they bring to life the mysterious abstract aggregate termed
a market. Houses become homes, become alive, through those who populate and use them.
Systems, like electricity, water, the internet and telephones, rely on connection,
participation, consumption and use by populations who are connected and, moreover,
using. A home is not only a terminal in such systems, but a sub-system of
function and use which is dependent upon the way its internal plumbing is
ordered, which devices are connected, and why, where and when.

But connection and connectivity are not enough. Simply possessing a phone and
connecting it to a network does not imply that calls will be received and made. No
social physics guarantees it. Just because a television is present and a signal available
does not entail that it will be switched on and watched. The motivations for use,
personal or social, also need to be cultivated; they need to form a soft system of
needs, values and desires that includes perceptions of benefits, uses and gratifications.

In terms of television viewing and performance this can include other intangibles
such as curiosity, intrigue, humour, and affiliation. Karl Weick suggests we interpret
the world, each differently, through our own particular web of perspectives and
experiences; in other words we ‘construct’ it (Weick, 1995).247 How we see, and that
which we see, what we apprehend is conditioned and filtered by a range of factors
which begin with what is made available to us in everyday life which includes
natural social interactions with others and objects and mediated interaction via
technologies. These provide presence and experience at the neurological level and
end with the hegemony of the more subtle, more implicit registering of dominant
views, educated truths, science facts, fashion fads and propagandas of the time. They
shape and develop logics in individuals through enculturation, socio-
cultural filters, prejudices, distinctions, norms and beliefs. How can we build
something new when our options and units of construction – ideas, bricks, and
components - are constrained materially, in terms of cost and in terms of the scope
and scale of our thinking?

Some modern needs may appear fickle, being personal, immediate, individual,
relatively transitory and less essential, such as finding a novel birthday gift for a
four-year-old at 10 p.m., a need to load a new version of Windows on your laptop, or the
need for yet another mobile phone. Modern needs may appear diverse, complex and
information-intensive, as in comparing technical specifications between different
makes of the same device or different models of the same make. This calculus of
search and numbers indexing to performance weighs heavily in the mind of
consumers against decisions to purchase much simpler products such as the stone
axes of our ancestors, utensils devised to open cans, or to contain loose goods and
liquids.

Other needs are intrinsic to our survival; they are ancient, universal, generic and
occur across all societies, cultures and economies over time, such as a regular
supply of, and the regular use of, water. They cannot be displaced. But the means for
providing and acquiring them can be, such as in water and sewerage systems and

247
Weick, K.E., (1995), Sensemaking in Organisations, Sage Publications, London

distribution systems which supply us with bottled water. The needs for washing
clothes and cooking in the home have not been displaced, but prospects of using
electricity and washing machines, microwaves and refrigerators are available to those
who are proximate to their supply and can afford them. Human face to face
communication has been augmented by a range of media, information and
communication technologies.
Thus any entertainment, creative, digital, knowledge, or information 'age' coexists
and rests upon products of its predecessors - the agrarian and industrial 'ages'. Each
age superseded its predecessor in part by improving production through the
invention, incorporation and use of new technologies and new methods, and through the
development of the social routines that accompany them, such as acquiring and
accommodating, learning, operating and using them.248

The fundamental impulse that sets and keeps the capitalist engine in motion
comes from the new consumers’ goods, the new methods of production or
transportation, the new markets, the new forms of industrial organization that
capitalist enterprise creates. (Schumpeter, 1947:p. 83)

Social, economic and technical innovation have driven a free market economy, with
resultant improvements in general standards of living, healthcare, education, and life
expectancy. It has enriched personal choice in those countries which have been most
active. Competition among firms on price and quality is typically good for customers. At the
same time the development of industry has created unwanted or unforeseen emergent
consequences such as greenhouse gases, antibiotic-resistant strains of bacteria and so
forth. More ambivalent unintended changes also occur, such as changing fashions,
revisions in the way we live together, or alone. No technology or ways of seeing and
doing brought about these changes independently. Gordon Pask noted: “All
processes produce products and all products are produced by processes.” A classic
example of emergence in nature is a termite "cathedral" mound. This is the structure
produced by a termite colony as a kind of unintentional by-product of their

248
Indeed each age has given rise to technologies which aim to improve the yields of its predecessor, i.e.
mass produced tractors for agricultural use made in industrial age factories, and now computer
controlled manufacturing systems improving the production of tractors, and the use of I.T. as a tool
for direct contribution to agricultural productivity. It is also used as an indirect tool for empowering
farmers to take informed and quality decisions which will have positive impact on the way agriculture
and allied activities are conducted.

activity and interactivity. Rather like Adam Smith’s ‘invisible hand’, they have come
to exist largely in areas which are ungoverned, unplanned, complexes resulting out of
related, interoperating technologies and social contingencies, even unrelated co-
existing technologies and commercial and social practices. Emergent phenomena
give rise to serendipity, surprise, chance and risk. Some would say that unregulated,
open and even chaotic environments are necessary for spawning innovative and
creative practices – such as in Schumpeter’s idea of creative destruction, where the
process of creating new industrial orders does not go forward without eroding pre-
existing orders, in manners similar to what Thomas Kuhn termed paradigm shifts
when he referred to scientific revolutions.

A paradigm, in this sense, refers to a self-consistent set of ideas and beliefs that act as a filter,
influencing how we, as a society, perceive and make sense of things. Coherence,
pattern and sense-making are important aspects of human functioning. It can be
expressed simply as ‘seeing what you believe’, rather than the usual ‘believing what
you see’. However, when enough significant anomalies have accrued against a
current paradigm, the scientific discipline is thrown into a state of crisis. During this
crisis, new ideas, perhaps ones previously discarded, are tried. 249
Kuhn noted the difference between scientific development, which possesses plateaus where ideas are held, even as ‘facts’, until they are upturned by new data and findings, and the humanities, where ideas are much more mutable and can be continuously challenged and changed. These notions underpin the idea of development and change, in both technologies and sensibilities. In the world of design and production, firms and institutions, like scientific facts, tend to persist over time until they are usurped, brought into question by new findings or put out of business. Sustainability is increasingly understood as dependent on the ability of firms to re-invent themselves organisationally or to update their product and service offerings. This Darwinian adaptability is not a foregone conclusion however; it is not just about ingredients, heuristics, algorithms and rules of thumb. As the present study shows, an individual or firm is in a complex selection environment where many choices abound, or came to abound, and choice-making will always be a constrained activity where there is

249
Kuhn T.S., (1962), The Structure of Scientific Revolutions, University of Chicago Press, Chicago

imperfect knowledge (the blind leading the blind), and wrong options and strategies can be taken. These can upturn strategic decisions made at an earlier phase, with the result that directions become managerially unwieldy. Opportunism has inherent limits.
This was the case as presented here.

Other new and also pre-existing technologies, societal positions, institutions, roles
and practices provide unplanned technical and social context and relevancies. They
relate not only to each other, but help to infuse and strategically direct new
innovation, crystallise infrastructures, codify and enforce standards, become implicit
in benchmarks, and finally enshrined in business reports of ‘best practice’. Both social groups who wield authority and power, and technologies and manifest cultural products, can exist as the plateaux we call ‘institutions’. Television is an institution of
the home. Institution is defined in economics in the following way:
Institutions are here defined as man-made rules which constrain possibly
arbitrary and opportunistic behavior in human interaction. Institutions are
shared in a community and are always enforced by some sort of sanction.
Institutions without sanctions are useless. Only if sanctions apply will
institutions make the actions of individuals more predictable. Rules with
sanctions channel human actions in reasonably predictable paths, creating a
degree of order….The key function of institutions is to facilitate order: a
systematic, non-random and therefore comprehensible pattern of actions and
events. Kasper and Streit (1998: 28)250

Rules and codes define how television is used in the home. Programming may pertain to a held notion of the demographic of the audience: children’s programming in the late afternoon caters for children returning from school, while adult programming is relegated to after 9 p.m. to limit its exposure to the young. Television thereby becomes a kind of clock, regulating other domestic activities and ordering and organising consumers (e.g. sales of toys in breaks in children’s programmes, prime-time advertising for mainline manufactured products like cars).

All systems and sub-systems of consumption and use are based on complexes of
behaviours and interactions which have boundaries of some sort and form some sort
of networked relation to other forms of activities, services and products. Railways
250
Kasper, W. and M. Streit (1998). Institutional Economics: Social Order and Public
Policy. Cheltenham, UK: Edward Elgar Publishing

and cars led to road and rail networks, workshops and garages, engineers, mechanics
and fuelling stations, and to more people travelling further, the broadening of
markets. This led to a shift in attitude to geographical distance, and the development
of trade. Borrowing from biology, they create in essence a dynamic selection
environment, where engineers, designers, managers and end-consumers have
choices, sometimes few choices, sometimes many competing or conflicting choices,
that guide purposeful behaviour and logics. A new calculus formed regarding moving goods and logistics – how much will it cost to carry my crops to this location, and will they remain fresh? Human thinking and social functioning are essential aspects of one another (Resnick, Levine & Teasley, 1991), or as Stacey (2000) has it: “The social, in
human terms, is a highly sophisticated process of cooperative interaction between
people in the medium of symbols in order to undertake joint action.” 251 Many devices
are not only the application of new scientific ideas, but have been shown in social
studies of science and technology to be products of a vast range of disparate
influences including rhetoric and discourse in management meetings, standards
committees, town hall chambers, living rooms and chat rooms, available parts and
components, the regularity and dependency of their supply, finance, managerial
prowess, how they are packaged, how they look and feel, and how they are sold and received by
competing social groups and so forth. However, "the debate about whether or not the
whole can be predicted from the properties of the parts misses the point. Wholes
produce unique combined effects, but many of these effects may be co-determined
by the context and the interactions between the whole and its environment(s)."
(Corning, 2002)252 Taking into account how disparate systemic and networked elements will operate, interrelate and perform as a whole within a larger environment is not only a complex undertaking, but for many firms an essential aspect of their strategic and business planning efforts. The aim is continual improvement, or kaizen as Japanese manufacturers would term it.

Shifting focus to consumption, we can also understand a system comprised of pre-existing practices and products, needs, wants and desires, needs for consumption and
251
Stacey, R. D. (2000): The Emergence of Knowledge in Organizations. Emergence, vol. 2, no. 4, pp.
23-39
252
Corning, Peter A. (2002), "The Re-Emergence of "Emergence": A Venerable Concept in Search of
a Theory", Complexity 7 (6): 18–30

function, and something more: the consumer-user correlate of kaizen, continuous improvement. This can be rooted in a desired end result or state, or even just in
intrinsically motivated behaviour – getting unselfconsciously lost in the flow of
activity, such as watching a movie or playing a game (see for instance
Csikszentmihalyi, 1991).253 This aspiration drives action and motivates behaviours,
including appropriation, consumption and production of products and services. It
lays foundation to the means and ways in which firms and entrepreneurs have
explored satisfying needs at all levels, mainly through creating new and improved
configurations of available materials, and revisioning how people will, or could,
react, interact and transact with them. Such an idea has been at the fore of
information and communication technology development, as the very operating
nature of such technologies requires a proficiency in inputting and registering
information and data via interfaces of one sort or another. They recombine
components and elements - visual metaphors, graphics and keyboards and mice -
which influence how they are fabricated and packaged and sold as tangible and
intangible products. Designers, system builders and integrators are always limited by
what is available, their cost, or practices that are established and ingrained. Through exposure to these new configurations, new needs, behaviours, techniques, tastes and desires arise not only in those who acquire, purchase and use, but also in those other firms who invest, produce, design, make and manufacture. As Gilles Deleuze points out, ‘the machine is always social before it is technical. There is always a social machine which selects or assigns the technical elements used.’254 A recent and obvious example of this is automated call services, such as phone banking. One is requested to punch in one’s account number, an action that was previously performed by a clerk in a bank. There is a shifting of labour and time onto the customer-user, with cost savings for the banks that have invested in this technology and reduced the need for
human clerks. Another strategy is to use offshore call centres where low-waged staff
253
Csikszentmihalyi, M. (1991). Flow: The Psychology of Optimal Experience. New York: Harper & Row. Csikszentmihalyi found that any activity in daily life can produce a flow-like experience, that activities like studying and schoolwork were as conducive to flow as typical leisure activities, and that television viewing was the activity that produced the greatest amount of apathy in an individual (Csikszentmihalyi 1988). Csikszentmihalyi, I. and Csikszentmihalyi, M. (1988). Optimal Experience: Psychological Studies of Flow in Consciousness. New York: Cambridge University Press.
254
Gilles Deleuze and Claire Parnet, Dialogues (London, 1977), pp. 126–7.

can deal with common requests. Another strategy is to maintain the traditional idea of branch offices, but pare them down, opening the prospect of remaining closer to customers and their particular needs and concerns, which automated services cannot do without the prospect of frustrating and alienating customers. Thus firms compete
through these activities and variations by offering cheaper and more tailored
products.

The implication of this, viewed as a system comprising the satiating of ever-emerging needs and requirements with ever new product and service solutions, is pivotal politically to the modernist agenda of ever-increasing progress, growth, and social and technical development. However, Joseph Schumpeter argued that:

“…in capitalist reality as distinguished from its textbook picture [what counts is] the competition from the new commodity, the new technology, the new source of supply, the new type of organization (the largest-scale unit of control for instance) – competition which commands a decisive cost or quality advantage and which strikes not at the margins of the profits and the outputs of the existing firms but at their foundations and their very lives…. It is hardly necessary to point out that competition of the kind we now have in mind acts not only when in being but also when it is merely an ever-present threat. It disciplines before it attacks.” (Schumpeter, 1947: 84-85)

The threat of competition and failure drives and motivates innovation by firms, and can even drive firms to work together and to try out new things and ways of doing things collaboratively. Schumpeter defined different kinds of innovations:

1. The introduction of new goods.
2. New methods of production.
3. The opening of new markets.
4. The conquest of new sources of supply.
5. The carrying out of a new organization of any industry.

This idea has extended to ‘co-creation’, ‘co-invention’ and other forms of collaborative effort between producers and users, always in the commercial interests of producers, though not always directly. Three capabilities are commonly highlighted:

–At the ideation stage, an ability to gain insight into customer needs and an understanding of the potential relevance of emerging technologies.

–At the product development stage, an ability to engage actively with customers to prove the validity of concepts and to assess market potential and risks, and the ability to leverage existing product platforms into new products.

–At the commercialization stage, an ability to work with pilot users to roll out products carefully but quickly, and to coordinate across the entire organization for an effective launch.255

Innovation, according to Regis Cabral, is, for a particular network, a new element introduced into the network which changes, even if momentarily, the costs of transactions between at least two actors, elements or nodes in the network.256 It culminates in the idea of ‘open source’, where generic, rough or even unfinished versions of something are released to the public domain for appropriation, involvement and improvement by others. These create a focus, a reason, for a network of individual actors and activities to form, culminating in a group, or rather grouped, effort. There may be rhetoric supporting coordinated development, such as user group meetings and newsgroups.
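Cabral’s definition can be read almost literally in network terms. The sketch below is illustrative only (written in Python; the actors, cost figures and function name are invented for the example): actors are nodes, transaction costs are edge weights, and an innovation is any new element whose introduction changes the cost between at least two nodes, even momentarily.

# Toy reading of Cabral's network definition of innovation. All names and
# figures here are invented for the example.

costs = {
    frozenset({"farmer", "wholesaler"}): 10.0,
    frozenset({"wholesaler", "retailer"}): 6.0,
    frozenset({"retailer", "consumer"}): 4.0,
    frozenset({"farmer", "consumer"}): 25.0,   # direct sale is expensive
}

def introduce(network, change):
    """Return a new cost table with the new element's effects applied."""
    updated = dict(network)
    updated.update(change)
    return updated

# Hypothetical innovation: an online marketplace that lowers the cost of
# selling directly from farmer to consumer.
marketplace = {frozenset({"farmer", "consumer"}): 7.0}
after = introduce(costs, marketplace)

changed = [tuple(sorted(pair)) for pair in costs if costs[pair] != after[pair]]
print("Innovation by Cabral's criterion (some pairwise cost changed)?", bool(changed))
print("Affected pairs:", changed)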

Our man-made, constructed environment – anything from clothes, wallpaper, newspaper stories, road signs, parks, television, mobile phones, windows, doors, walls and ceilings – creates ‘systems of objects’ that have a direct and subtle impact upon us through constant everyday transformation arising from use and reuse. Home improvement and D.I.Y. rely on people’s desires to constantly update and improve their homes. The provided and manufactured reality is a collectively designed project to some degree, as those who build and design are also consumer-users, but each element incorporates its own vastly different and sometimes conflicting logics. These are due to some extent to extraneous influences, which will largely form the substance of this study, but sometimes they are simply a matter of taste. But suffice it to say here that it is the purposeful effort arising in the human condition, at both individual and group levels, that drives people to create novel combinations and

255
http://www.forbes.com/2011/04/04/10-top-innovative-companies-apple-google-leadership-
managing-how.html
256
Cabral, R. (1998) "Refining the Cabral-Dahab Science Park Management Paradigm", Int. J.
Technology Management, Vol. 16, pp. 813-818.

configurations of pre-existing conditions. Today people even consider their own
bodies as projects to update and improve.

“The more the world of work confronts us as hostile, exhausting and miserable, the more people pour their energies into their lives outside work.
As the system develops new markets are constantly being carved out of our
needs and wishes. For example, consider the multimillion pound industries
which have developed around commodities which are said to make us look
thin or young, our desire to play games, to experience nature or enjoy art. The
very fact that we have the 'leisure industry' and the 'entertainment industry'
points to the fact that the separation of work from leisure has left a void in
our free hours: 'Thus filling time away from the job also becomes dependent
upon the market, which develops to an enormous degree those passive
amusements, entertainments, and spectacles that suit the restricted
circumstances of the city and are offered as substitutes for life itself'… The
retreat into the privatised world of the individual and the family is a
pronounced feature of life in the 1990s. Adopting particular lifestyles seems
to offer the only real chance of personal fulfilment. Hence the increasing
fascination for TV programmes and magazines about fashion, cooking,
holidays and gardening and the boom in the Do-It-Yourself market. The
family and the home have become leisure activities in and of themselves;
they have also become subject to the priorities of the market. All the
commodities which could increase our free time simply reinforce the family
as a unit of consumption not an emotional haven: 'As the advances of modern
household and service industries lighten the family labour, they increase the
futility of family life; as they remove the burden of personal relations, they
strip away its affections; as they create an intricate social life, they rob it of
every vestige of community and leave in its place the cash nexus.”257

Large organizations and wide scale beliefs and attitudes come to be assembled from
many smaller ones that keep some level of autonomy within the large – a person
within a home, a home within a neighbourhood, a neighbourhood within a city and
so forth. As Kelly suggests in The Networked Economy “Networks, too, need to be
grown, rather than installed. They need to accumulate over time. To grow a large
network, one needs to start with a small network that works, then add more
sophisticated nodes and levels to it. Every successful large system was once a
successful small system.” The Finnish architect Eliel Saarinen suggests such a view
should purposefully inform design: “Always design a thing by considering it in its
next larger context – a chair in a room, a room in a house, a house in an environment,
257
H Braverman, Labour and Monopoly Capitalism (Monthly Review Press, 1974)
Cox, J. (1998). An introduction to Marx’s theory of alienation. International Socialism
Issue 79.

an environment in a city plan.”258 The personalisation of online content, news and services could also be scaled in such a way: the individual within a home, the home generally, the street, the town or city, the county, the country or region.
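One way to picture such nested scaling for personalised content is as a chain of contexts consulted from the most specific scope outwards, echoing Saarinen’s ‘next larger context’. The sketch below is purely illustrative (written in Python; the scope names, preference keys and the resolve function are invented for the example):

# Illustrative only: resolving a personalisation setting through nested scopes,
# from the individual outwards to home, street, city, county, country and
# region. All scope names and preference values are invented for the example.

SCOPES = ["individual", "home", "street", "city", "county", "country", "region"]

preferences = {
    "region":     {"language": "en", "news_topics": ["world"]},
    "country":    {"news_topics": ["national politics"]},
    "city":       {"weather_area": "Manchester"},
    "home":       {"parental_filter": True},
    "individual": {"news_topics": ["technology", "film"]},
}

def resolve(setting, prefs, scopes=SCOPES):
    """Return (value, scope) from the most specific scope defining `setting`."""
    for scope in scopes:  # ordered from most specific to most general
        if setting in prefs.get(scope, {}):
            return prefs[scope][setting], scope
    return None, None

print(resolve("news_topics", preferences))   # (['technology', 'film'], 'individual')
print(resolve("weather_area", preferences))  # ('Manchester', 'city')
print(resolve("language", preferences))      # ('en', 'region')

A setting defined close to the individual overrides the more general scopes, while anything left undefined falls back outwards, which is one plausible way of reading the “next larger context” as a design rule for services.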

“A designer developing any sort of user interface must also consider the
entire scene: How does a tool get used? Where and when is it used? What is
the context for use? What are the obstacles? A designer of two-dimensional
printed messages, or even of large-scale sign systems, must consider context
too. How fast does a driver move past a traffic sign, for example? At what
speed can the sign still be legible? What do pedestrians see differently from
drivers? All these factors play into how the “story” is framed and delivered.
The story is always being served, even if it is as simple as “this way to the
nearest off-ramp.” In simple terms, a story is a stand-in or substitute for an
event itself. Surrounding any story are the metaphors, tropes, and stylistic
devices that make a story more compelling, more understandable, or more
contextually relevant to the listener or reader … So, for example, the story of
let’s say a postage stamp includes not only the design or image on its surface
along with its currency designation, it also includes the paper it’s printed on,
the people who did the printing, the glue, the person who licked the stamp,
the envelope it ends up on, the letter in the envelope, and the mail slot it
might pass through. Follow this thread and it can be endless. And yet it is
exactly this trajectory—of production, use, and distribution—that makes up
the whole story of the postage stamp. These are the “stories,” with all the
potential relationships that occur between each stage, that the designer must
consider before setting out to shape experience.” (Hitchcock, 2013, p.172-
173)259

It is because of this organic, evolutionary form of growth, growth within complex, dynamic and varied environments of established systems, emerging components and social systems, that technology transfer is a challenging and diffuse project. As Schumpeter (1947: 82) suggests, “The essential point to grasp is that in dealing with capitalism we are dealing with an evolutionary process.” This plagues neo-liberal efforts to accelerate the development of less technologically and economically developed countries, and it highlights flawed logics and heuristics that may only have relevance in the society of origin. Kelly cites as an example the ‘installation’ of capitalism in post-Soviet Russia: “After the collapse of the Soviet Union, Russia tried to install capitalism, but this complex system couldn't be installed; it had to be

258
Eliel Saarinen, quoted in Time magazine, 2 July 1956
259
Hitchcock, Lucinda (2013) ‘Graphic Design, Storytelling, and the Making of Meaning’, in The Art of Critical Making

grown.” Nation states and their governance, nuclear disasters, plane crashes and bridge failures are prime examples of how there is no perfect way to govern a system. Even a water pump, considered only as a relatively simple, discrete technology and not as part of an overall technical and social phenomenon, can lead to failure. “Programmes to encourage transfer of technology from industrial nations to ‘less developed’ have often been frustrated because they have not allowed for responsive invention in the countries concerned. They have in effect been imposed.” (Pacey, 1991, p.viii).260

At a macro level, those who are charged to govern (politicians, legal proponents and regulators) also develop laws and codes of commercial and personal conduct which support development for all. It is not a process which can be entirely realised, as
limited information may be available to different actors at any given time, and cannot
pertain to the interests, values and beliefs of every single atomised person. It can
only apply to a statistical mean of interests and values. Where those in power do not
reflect the needs and interest of this majority, then mass discontent arises driving a
wedge between those who govern and those who are governed – civil conflict, wars,
public displays and demonstrations. Such distance can also exist in firms, and in
design processes, where manufacturers design according to their own closed
perspectives of what is good for the customer, and what will sell. Where there is little
competition, and the product is deemed useful, this may work as in Henry Ford’s
famous maxim: “any colour they want as long as it’s black.” However, in a competitive market this dominant attitude towards customers comes under threat from those who have taken time to understand the product offering, and the aspirations, beliefs and hopes of those who will consume and use. This means that firms and consumers always face risk and uncertainty in the satisfying and catering of needs, but ultimately they act anyway: they make decisions, follow routines, attempt to embed new options within these routines, implement strategies with imperfect knowledge, and do so as best they can.
A new era of production has begun. Its principles of organization are as different
from those of the industrial era as those of the industrial era were different from
the agricultural. The cybernation revolution has been brought about by the
combination of the computer and the automated self-regulating machine. This
results in a system of almost unlimited productive capacity which requires
260
Pacey, A. (1991) Technology in World Civilization: A Thousand-Year History

progressively less human labor. Cybernation is already reorganizing the
economic and social system to meet its own needs.261

In one respect the committee's warning was prophetic. Computers now replace
humans in carrying out an ever widening range of tasks - filing, bookkeeping,
mortgage underwriting, taking book orders, installing windshields on automobile
bodies--the list becomes longer each year. And beyond directly replacing humans,
computers have become the infrastructure of the global economy, helping jobs move
quickly to sources of cheap labour.

But the Ad Hoc Committee also made a major miscalculation. Like many computer
scientists of that time, the committee expected computers would soon replicate all the
modes by which humans process information. The expectation was only partly
fulfilled, and so the committee's warning was only partly right. Computers have not
created mass unemployment, but they have created a major upheaval in the nature of
human work.

The ruthlessly generic nature of basic needs across societies and cultures over time prioritises them at a policy level. However, they now sit alongside other needs which have been engendered through technical, economic and social development. These include the need for a constant stream of inexpensive, affordable produce, tools and products that cater for homes in highly structured, modern, urbanised societies, and for entertainment lending release and respite from engagement in purely productive labour. Karl Marx wrote widely in the 19th century on how the mechanisation and later automation of agriculture and manufacturing processes, in which natural, raw and synthetic materials are transformed into varieties of convenience, came into being by lending humans vast control over nature. Early societies exploited the cyclic, seasonal sources of food, including fish, grain, fruits and game, and so began to notice regularities and patterns.262 But pre-agrarian hunter
gatherer diets were at the mercy of nature, restricted by region, season, available

261
On March 22, 1964, the Ad Hoc Committee on the Triple Revolution sent a fourteen-page memorandum to President Lyndon Johnson.
262
Diamond, Jared (1999). Guns, Germs, and Steel: The Fates of Human Societies. New York: Norton
Press.

local plant and animal resources and some degree of pastoralism and hunting.

Humans are incorrigible problem solvers, pattern seekers and pattern makers, and
certainly when they seek to satisfy needs. Where none exist they will invent them.
“Creativity occurs when a person, using the symbols of a given domain such as music, engineering, business, or mathematics, has a new idea or sees a new pattern, and when this novelty is selected by the appropriate field for inclusion into the relevant domain.” (Csikszentmihalyi ibid.: 28) Learning demands the picking out and
recognition of patterns like words and letters, and also practicing behaviours. And
humans can be impassioned story tellers; this is the means through which
psychological possessions can also be passed and shared. Language, and moreover stories, organise thoughts, and thoughts organise language. The dynamics that arise
as people tell stories, to themselves as well as to others, and then enact those stories,
create the dynamic human systems – families and neighborhoods; workgroups,
organizations, and economies. Pondy insists that organizational theorists are locked
into analysis based on the lower levels of complexity. Given the then-current
understanding of organizations, analysts should think of them less as “input-output”
machines and more as “language-using, sensemaking cultures.” What is needed, as a
result, is “radical methodological departures [such as] ethnographic techniques more
suitable for studying meaning and belief systems.” Listening to stories and
commentaries not only transfers a message, but also suggests, on implicit levels at least, the structure, shape and generic characteristics of a story itself and how one can or should be told (Joseph Campbell).

In a community such as ancient Jericho, people built and rebuilt their mud brick and
stone huts which contrasted with the preoccupations of their nomadic ancestors. To
begin with we can imagine individuals struggling to find the best way, the easiest
way, the most robust way, the most economical way to build a house or make an
oven, making mistakes which, over time and through much discussion and reflection, were rectified as design and labour problems. In essence they took on the role of the ‘bricoleur’
of Lévi-Strauss - one who puts to use instruments, materials and practices ‘not
known as a result of their usefulness; they are deemed to be useful or interesting

because they are first of all known’ (Lévi-Strauss 1967: 9). Lévi-Strauss contrasted
‘bricoleur’ with the attitudes and behaviour of the ‘engineer’:

The “bricoleur” is... someone who works with his hands and uses devious
means compared to those of the craftsman...The bricoleur is adept at
performing a large number of diverse tasks... His universe of instruments is
closed and the rules of his game are always to make do with “whatever is at
hand,” that is to say with a set of tools and materials which is always finite
and is also heterogeneous because what it contains bears no relation to the
current project or indeed to any particular project, but is the contingent result
of all the occasions there have been to renew or enrich the stock or to
maintain it with the remains of previous constructions or destructions. The
set of the bricoleur’s means... is to be defined only by its potential
use... because the elements are collected or retained on the principle that “they
may always come in handy.” Such elements are specialized up to a point,
sufficiently for the bricoleur not to need the equipment and knowledge of all
trades and professions, but not enough for each of them to have only one
definite and determinate use. They represent a set of actual and possible
relations...(Ibid.: 16–18).

At the beginning, when a technological solution is new and unknown, people are more like the ‘bricoleur’; only once material and structural elements are settled, and their propensities consolidated, can the ‘engineer’ perform his job. This is rather like the case presented
here where ‘off the shelf’ components were modified, incorporated and used to build
the system, some requiring more effort than others.

The consequence in early human settlement building was a vernacular architecture, which can still be seen today in different styles of traditional homes across various global locations: repeated building forms arising from the appropriation and use of local materials, leading eventually to a community-defined code or formula, and to other considerations such as water management. The pivotal importance of core needs such
as water created interaction systems which conflated distinctions between audiences
and performers; users and designers; occupants and architects. Wenger has recently
described this behaviour as ‘communities of practice’, where individuals pursuing
enterprise always do so in interaction, so that the resulting practice “belongs to the
community” (Wenger, 1998: p.45), or that practice ‘connotes doing . . . in a social
and historical context that gives substance and meaning to what we do’ (ibid. p.47)
This is what Turner views as an extension of Kantian philosophy, ‘where practice is an

activity seeking a goal which is conceived as a result of following certain general
principles of procedure’ (Turner 1994: p. 8). Added to this form of social learning is
the organisation, structuring, and instituting of the home with its accompanying, yet
defining, divisions, contents, tools and habits.
“Habitation and habit come from one word - Latin habere, to have. We shape
our buildings around our routines, loving the fit when it becomes intimate and
sure, and cleaving to it as conservatively as a duchess in her sitting room.
Paradoxically, habit is both the product of learning and the escape from
learning. We learn in order not to learn. Habit is efficient; learning is messy
and wasteful. Learning that doesn’t produce habit is a waste of time. Habit
that does not resist learning is failing in its function of continuity and
efficiency...” (Brand, 1994; p.167)263

The development of constant and semi-permanent physical and psychical structures arose with the development of settlement, the building of homes and cities, and the
emergence of distinctive cultures. The architect Christopher Alexander believes that
the discipline of architecture generates out of the oft repeated events of life which
recur in a particular place, that is, in a geographical location and within a given
culture. Culture has been defined archeologically as: “Patterned, learned, shared
behaviour based on symbolic communication,” (Webster, Evans, and Sanders: 1993:
p. 7).264 The many regularities of human behaviour, food preparation, grooming
rituals, bartering, trading, buying and selling, storytelling and so forth are often
repeated and transacted, transferred and transmitted and become the way human
society, people and social systems develop and function as a whole. A defining characteristic of groups, cultures and societies is how they produce manifestly similar behaviours among different members. Regularities and irregularities,
continuities and discontinuities form the stuff of stories and reportage, they become
individual and social knowledge, and become the way, once sampled and
understood, through which meso and macro decisions regarding the governance of
societies and commerce are made. Regularities also impact the development of
architectures and domestic technologies and practices therein. “The built
environment is an integral portion of the culture and becomes not only the physical

263
Brand, S. (1994) How Buildings Learn: What Happens After They’re Built. New York: Viking
264
Webster, David L., Susan Toby Evans, and William T. (1993) Sanders. Out of the Past: An
Introduction to Archeology. Mountain View, CA: Mayfield Publishing Company

stage on which the dramas of social interaction are enacted but also an integral
component of the drama itself, structuring culture while being structured by it.”
(p. 57)265

The structure of the ‘everyday’, and the habits and routines which came to characterise everyday life, are crucial to any notion of how domestic technologies can integrate and be assimilated into the home and everyday life.

Igo's thesis is that modern surveying techniques helped constitute an American "public," creating such ideas as "mainstream culture," "public opinion," and "normal sexuality." She makes this case by examining three important surveyors of the 20th century: Robert and Helen Merrell Lynd, whose study of Muncie, Ind. came to represent "typical" America; George Gallup and Elmo Roper's public opinion polls,
which were understood to represent the thoughts of an "average" American, and
Alfred Kinsey, whose surveys purported to uncover "normal" sexuality.

Igo shows how, in all three cases, conscious or inadvertent factors led to the
simplification of a messy reality. The Lynds, for example, excluded African
Americans and immigrants in their survey, so their "typical American community" in
fact represented only a white, native-born community. Even though these types of
studies did not describe the social world accurately, in time they helped constitute
that social world, as people began to identify themselves with the polling data.

It's a very well researched and careful argument. There are certain embedded
epistemological challenges implicit in her approach--for instance, to demonstrate
how Americans reacted to survey data, she must resort to non-survey data (letters,
public testimonies, etc). While this usually does the trick, occasionally you are left
wanting stronger evidence. For instance, she writes of how Kinsey's surveys on sex
helped normalize certain sexual behaviors (helped shape "normal" sexual selves, in
her words) but the evidence in the text is scant.

265
Kathryn A. Kamp, “Towards an Archaeology of Architecture: Clues from a Modern Syrian
Village,”
Journal of Anthropological Research 49, no. 4 (Winter 1993): 293

Surveys were not new, but they had been focused on aggregate data (census), or on problems such as
deviance or marginal groups: ‘accident prevention, child welfare, truancy, and venereal disease as
well as Czechs, Greeks, Finns, and widows.’ (Igo p.29). What Middletown sought was not deviance
but typicality or ‘averageness’: to discover what it actually was. It avoided defining the issues to
study. It was organized according to a general, anthropological taxonomy of work, family, raising
children, leisure, religion and civic activities. Within that organization, it was possible for the issues
and essential dimensions of the town to reveal themselves – the urbanization and industrialization of
an agrarian culture, patriotism, a sharp social division into two classes, and race. Middletown was widely read, and it transformed Americans’ pictures of themselves, ‘telling Americans “who we are,”
“what we want,” and “what we believe.”’ (Igo p.3) Middletown is credited as instrumental to the
emergence of mass society, not just by what it found, but by social construction: creating a self-
fulfilling social belief in the existence of “averageness” as that mass society. It also permanently
altered the rhetoric of discourse on social issues to one in which popular beliefs could be trumped by
data. Middletown revealed averageness as a realm of inquiry that had been in plain sight, waiting to be
looked at. The earlier interests in problems such as deviance were not displaced but incorporated and
subsumed under more comprehensive categories of grounded social theory that developed into a new,
powerful technology supporting a broad range of social analysis and engineering for a broad variety of
purposes: a discipline.266

While early houses may appear primitive and simple from the modern perspective, as understood through the artefacts of dish, pot and spoon seen in any museum, the important aspect is that humans interacted with them on a needs basis, a use basis
and a regular basis. They were also reflected upon. Schön’s The Reflective
Practitioner (1991) looks into the internal processes of the practitioner, inferring
what they might be from observation. Turner’s The Social Theory of Practices
(1994), also opens up several promising avenues of reflection by virtue of the
author’s dispute with the social theory approach to practice in general. We can see
not just functional needs reported by the museum exhibits, but clearly aesthetic
expressions as well. Elaborate jewellery for instance, could be interpreted as having
value to the wearer in terms of satisfying the need for social esteem and status and so
became part of an elaborate system of socially defined and defining codes and
266
Igo, Sarah (2007) The Averaged American: Surveys, citizens and the making
of a mass culture. Cambridge, Harvard University Press

symbolic attributes. The use and appropriation of products, and of intangibles such as storytelling and other performances, can be understood as patterns of, or patterned, activity and cognition, which link to and develop from cyclical needs for a range of basic requirements such as nourishment and rest. In essence, they can be viewed as having hard and soft qualities: the artefacts themselves are pervasive and lasting (hard), while the practices of using them, speaking about them, devising strategies for making or acquiring them, and valuing them are soft.

Many contextualist studies of archaeology rely on the hard artefacts to intuit and logically deduce the soft social aspects, as often no records remain. This epistemological reality permeates much of the work of the social sciences, as it is easier, or indeed only possible at all, to realise patterns in behaviours and actions
once they have consolidated and passed. For Turner, even making explicit our own
ways of seeing and knowing is a dubious exploit: ‘Indeed, in general, we discover
our own assumptions, to the extent that we do, by standing on the stern of our boat
and watching for them in the wake, and finding that they become easier to identify
the farther they recede’ (Turner 1994: 32). Later in this study we will need to consider how, in the sophisticated, perhaps cluttered world of media feeds and commerce, these cater for the same basic needs, how they need to be organised in order to do this, and how they in turn seek to organise behaviours and promote new forms of ritual.
A pattern is... “Something ‘in the world’—a unitary pattern of activity and
space, which repeats itself over and over again, in any given place, always
appearing each time in a slightly different manifestation. When we ask, now,
just where these patterns come from, and also where the variation comes
from, which allows each pattern to take on a slightly different form each time
that it occurs, we have been led to the idea that these patterns ‘in the world’
are created by us, because we have other, similar patterns in our minds from
which we imagine, conceive, create, build, and live these actual patterns in
the world,” Alexander, Timeless Way, 181.

Analysis of the home over history could also take a needs based view. Needs in the
lower echelons of Maslow’s (1954) hierarchy, that is, physiological needs for food,
water and protection from the elements – those aspects of everyday reality which

characterise the plight of early humans - have come to be joined by ever higher-level, social, abstract and more sophisticated needs, such as needs for security, social distinction and credos, and today for electronically mediated entertainment and information, the desire to engage with services, games and other people via information and communication technologies. This nascent reality of food and home contrasts with reality today, where most basic needs are met with manufactured
and packaged goods, such as bottled water and pre-packaged fresh food, delivered to
homes directly via supermarkets - nodes in global distribution networks rationally
optimising price, maintaining freshness, opening diversity and giving constant on-
stream supply to the consumer. As we become further removed in both time and
space from the plants and animals whose lives make our lives possible, the respect
for those living beings becomes ever more abstract. For example, in order to supply
the fast food industry, a single farmer in Kentucky will raise 240,000 chickens a year
and net only five cents per bird.267 The home is also a node in networks supplying
water and electricity and communication, information and entertainment.

Mass demand can only be translated into mass production if there are means of
transporting goods from warehouses and factories to widespread markets.
Improvements in transport facilities – ships, roads, rivers, canals, the coasting trade and, latterly, railways and planes – provide the networks of communication that enable that demand to be both realised and satisfied (e.g. cars and trucks need the infrastructures of roads and fuel stations to be successful). Also, mass demand can only be translated into mass production if needs and desires are cultivated at the consumer end by some form of advertising, public announcements or some shift in shared social convention or practice [e.g. word of mouth]. But this is problematic:
To explain how they get to the places they must get to – namely, inside some
people and not others – in order to do their explanatory job seems to require
an unusual process of transmission. If we conceive of practices as public
quasiobjects, they must get from their public location into persons who act in
accordance with them. If we conceive of them as dualistic forces, with
collective and individual aspects, we are faced with the problem of how they
can interact causally both on the collective and individual level. If we
conceive of practices as nothing more than habits, we are faced with the
question of how the same habits get into different people (Turner, pp. 60–61).

267
http://www.leafforlife.org/5ADAY/MECHANIZ.HTM

Moving forward in history to the 18th and 19th centuries, improvements in agricultural practice, science and technology powered manufacturing, and the significant development of agriculture, commerce and trade during this period not only opened up choices in the marketplace but also impacted upon the everyday life of people.
Before industrialization the family was the basic social unit. Most families
were rural, large, and self sustaining; they produced and processed almost
everything that was needed for their own support and for trading in the
marketplace, while at the same time performing a host of other functions
ranging from mutual protection to entertainment. In these preindustrial
families women (adult women, that is) had a lot to do, and their time was
almost entirely absorbed by household tasks. Under industrialization the
family is much less important. The household is no longer the focus of
production; production for the marketplace and production for sustenance
have been removed to other locations. (Cowan, 1976; p.1.)268

In Cuff’s models, Variable A - Gemeinschaft is Agrarian Society, namely:


affectivity, collectivity orientation, particularism, ascription, and diffuseness, and
Variable B – Gesellschaft is Industrial Society, namely: affectivity neutrality, self-
orientation, universalism, and achievement.269

Pre-industrialisation, household and economy were overlapping institutions. Indeed, the word economics comes from the Greek oikonomia, meaning the science or art of managing a household. The traditional household brimmed with economic activities; based on kin but extended to include living-in servants, apprentices and lodgers, it was the scene of production as well as consumption and reproduction – it was literally the precursor of the factory. The increasing mechanisation of agriculture, bound to a concomitant decline in the need for manual labour, saw farmers and their families move to cities to seek work in factories, shops, warehouses, docks and other professions. Urbanisation and the rise of industrialism at
the same time brought people into built, technology organised and dominated
environments and distanced people from nature, and entire cities such as Manchester,

268
Cowan, R. S. The "industrial revolution" in the home: household technology and social change in
the 20th century. Technology and culture, v. 17, Jan. 1976: 1-23.
269
Cuff, E. C. and G. C. F. Payne (eds.). Perspectives in Sociology. George Allen & Unwin Ltd. 1979.

the world's first industrialised city, witnessed unprecedented and unplanned growth in the early 19th century. Unplanned development of housing characterises older hamlets, towns and villages, whilst these aspects are often removed in their entirety in cities, making way for organised, structured designations of zones and areas. This lack of planning led to the cramped and widely understood poor living conditions of the industrial era (Frangopulo, 1977).270 The industrial revolution, its
infrastructures and support – i.e. housing, food supply and rubbish disposal – were not enshrined in a master plan but were much more fluid and piecemeal, the product of the agency of many individuals and enterprises of varying sizes. This is rather like how programming content for television developed, how the railway and telephone systems became integrated, and how the internet became populated and portioned into sites and servers.

The rise of the technical universities, scientific sociology and behaviourism aimed to
provide the theory and tools by which to understand and contribute to rapid shifts in
human activity and reliance upon technology as a driver of economic improvement.

Young describes the ideal as follows:

By city life I mean a form of social relations which I define as the being
together of strangers. In the city persons and groups interact within spaces
and institutions they all experience themselves as belonging to, but without
those interactions dissolving into unity or commonness. ibid. at 237. 271

The rise of scientific management and of survey and surveillance mechanisms was deemed important to the improvement, monitoring and control of production and, beyond the factory, to understanding the nature of the bristling urban populations. Major
redevelopment schemes for big cities were another factor in the rise of modern town
planning – rather than houses and homes organically developing around everyday
life and practice, they were being developed through rationalisation and the needs of
industry and commerce. Following somewhat in the footprints of René Descartes,
industrial age culture gave birth to mechanistic metaphors of progress and to the
discussion of persons, their behaviours, sociology and thinking. Being categorized,
270
Frangopulo, N. (1977). Tradition in Action. The historical evolution of the Greater Manchester County.
Wakefield: EP Publishing
271
Young, I. M. (1990) Justice and the Politics of Difference, p. 53.

numbered and defined is a form of subjugation and symbolizes a personal loss of
power within a social system. Patterns of behaviour were now designed top-down by job specialisation, with people viewing themselves and others as units of ‘production’. People were considered ‘cogs in the machine’ in terms of their function, and this idea reached its apotheosis in Taylor’s (1911) scientific management thesis.272 For Martha Banta, the result was the subjugation of the individual to the demands of the system. In Banta's view, the ideology of Taylorism moved beyond the factory floor to "encompass every aspect of cultural existence." Banta shows how the cause of efficiency was taken up in narratives of every sort: in mail-order catalogues, popular romances, newspaper stories, and personal testimonials "from below". And just as Charles Babbage imagined computers as tools for imposing a God-like rational order on the microcosm of the factory (Schaffer, 1994),273 Banta suggests how Taylorism gave rise to “an extended structure and narrative system.”274
The commentaries and critiques offered by the
leading social theorists such as Durkheim, Weber and Habermas define the core
components of modernization in terms of a growing rationalization of the various
spheres of society, an increasing secularization which brought about the
disenchantment of reality, an irreversible development of bureaucratization, and a
growing pluralization of values and beliefs. Industrialisation also ushered in a radical
new sexual division of labour in Western society and thereby changed gender roles
and the definition of masculinity and femininity. Because capitalism is profit-based,
it demanded rationalization so that results could be calculated and so that efficiency
and effectiveness could be increased. In this way, rationalization became the
distinguishing characteristic of modern industrial societies. The rationalization of
society is the widespread acceptance of rules, efficiency, and practical results as the
right way to approach human affairs, and the construction of a mode of social
organization around this notion. According to Weber, rationalization has a dual face.
On the one hand, it has enabled a liberation of humanity from traditional constraints and has led to the progress of reason and freedom. But on the other hand, it has
272
Taylor, Frederick Winslow (1911), The Principles of Scientific Management, London,: Harper &
Brothers
273
Schaffer, S. (1994). Babbage’s intelligence: Calculating engines and the factory system. Critical
Inquiry, 21(1), 201–228.
274
Banta, M. (1993) Taylored Lives Narrative Productions in the Age of Taylor, Veblen, and Ford
Chicago: Univ. of Chigago Press

produced a new oppression, the “iron cage” of modern bureaucratic organizational
forms that limits human potential. Industrialisation converted small-scale household
producers – cottage industries - into wage labour, necessitating a separation
of home and workplace, and the growing differentiation of gender roles. These all created ‘distance’ between those who govern, produce and design, and those who are governed, who buy and who use.

Women were to remain in the home, tending to the children and household chores,
while men became the principal breadwinners.275 The urbanised home suffered a dramatic ‘loss of function’ compared to its former pre-industrial rural counterpart.
Needs that were formerly met by family members working within the home were
now met by outside agencies, and individuals interacted with the wider economy and
society not through their households but independently. Moreover, it signalled
transformation in the activities that existed between the household and the market
economy. The population was brought out of "rural self-sufficiency" and put in regular contact with the market; increasing mass consumption helped to encourage the emergence of mass production.

Broadcasting
The noun "broadcasting" derives from an agricultural term meaning "scattering seeds widely". Broadcasting as a form of communication is unique. While one-to-many public address had been practised since humans lived in groups, where a leader or chief or their messenger would address the population, electronic media required a format. Radio makers were anxious to build sales of their products by ensuring that there were radio broadcasts to which their radio-buying customers could listen. Under Lord Reith in the 1920s the BBC developed, especially in radio. As its first manager and Director-General, he promoted the philosophy of
public service broadcasting, firmly grounded in the moral benefits of education and
of uplifting entertainment, eschewing commercial influence and maintaining a
maximum of independence from political control. Early programmes included music
and weather reports. Until the development, popularisation, and domination of

275
Amott, T. and Matthaei, J. (1991) Race, Gender and Work, Boston, MA: South End.

television, radio was the broadcast medium upon which people in the United
Kingdom relied. It "reached into every home in the land, and simultaneously united
the nation, an important factor during the Second World War." First hopes for the
Empire Service were low. The Director General, Sir John Reith (later Lord Reith)
said in the opening programme: "Don't expect too much in the early days; for some
time we shall transmit comparatively simple programmes, to give the best chance of
intelligible reception and provide evidence as to the type of material most suitable for
the service in each zone. The programmes will neither be very interesting nor very
good." This address was read out five times as it was broadcast live to different parts
of the world.276

Dissemination focuses on the message being relayed from one main source to one
large audience without the exchange of dialogue in between. There is a chance for the message to be tweaked or corrupted once the main source releases it. There is really
no way to predetermine how the larger population or audience will absorb the
message. They can choose to listen, analyze, or simply ignore it. Dissemination in
communication is widely used in the world of broadcasting. Institutions and
technologies can, and do, create more structures based on a design of dissemination rather than dialogue.

Suffice to note that although we may all be ‘hardwired’ neurologically for survival, in industrialised and post-industrialised nations basic needs for food, water and protection from the elements have come to be satisfied by man-made systems: water mains, electric mains, distribution chains, ample food and comfortable warm or cool homes. Our needs for security are supported by the state, while needs for knowledge, to be part of a family, team or club, to appreciate the arts, and to value nature and the natural (including anxieties regarding global warming) link to media consumption.
As a response to industrialisation and urbanisation the Romantic poets Wordsworth,
Keats, Byron, Shelley and Coleridge were influenced strongly by Rousseau's call to
return to nature and celebrate the worth of the individual. The poets praised the
restorative potential (clean air, fresh water, open spaces) of living a simple rural life.
276
Transcribed from audio source found at
[http://www.bbc.co.uk/worldservice/specials/1122_75_years/page2.shtml].

This rationale may have later justified locating large spa hotels in the countryside.

What we view and read, what we listen to, and what we absorb via media shape worldviews and define needs and wants, and can play a significant role in reinforcing and even forming shared beliefs and values, as was the thesis of later social theorists focused upon theories of ‘mass’ society. In order to know that products exist we must be informed. This is the modern consumer society with its heavy reliance on media and innuendo, [sometimes conflicting] information,
association and connotation. Excepting censorship rules and regulations, strictly
enforced in many countries, there are no limits to which human dilemmas and
behaviours can be dramatically depicted. Williams illustrates the way in which this
planned flow affects the viewing experience with his own first encounter with "the
characteristic American sequence":
I began watching a film and at first had some difficulty in adjusting to a much
greater frequency of commercial "breaks". [...] Two other films, which were
due to be shown on the same channel on other nights, began to be inserted as
trailers. A crime in San Francisco (the subject of the original film) began to
operate in an extraordinary counterpoint not only with the deodorant and
cereal commercials but with a romance in Paris and the eruption of a
prehistoric monster who laid waste to New York. (Williams, 1974; p.91-92)277

What Williams saw that night was not a distinct crime movie, but something
completely new; a surreal collage of cereals, romance, deodorants, monsters and
crime, simultaneously taking place in San Francisco, Paris and New York. Williams
remarked that "…the fact of flow" is "the central television experience" (p.95). We
make our own decisions on what is acceptable and accords with our taste and
preferences. We need only consider the way we check our own morality by
vicariously watching the dilemmas of others in the soap opera, seeing ourselves
on some level agreeing or disagreeing with performers' behaviours, actions and
beliefs, even if they are only glimpsed in passing through the room. With the
discovery of mirror neurons and similar systems in humans, neuroscience has shown
us that when we see the actions, sensations and emotions of others, we activate brain
regions as if we were performing similar actions, were touched in similar ways or made
277
Williams (1974), pp. 91-92.

similar facial expressions. In short, our brain mirrors the states of the people we
observe. Intuitively, we have the impression that while we mirror, we feel what is
going on in the person we observe; we empathize with him or her. Amongst a
plenitude of other possible scenarios, a television show can move us to be creative
through the home-makeover show, to exercise with an aerobics programme, or to
invest with money tips; and of course we can learn just how well off – or how poorly
off – we are compared to the featured others. It shows us how to live and
what to buy to live that way (Meyrowitz, 1985).278

Television permits us to run the whole gamut of emotions – fear, amusement, hurt,
anger, guilt – in one night, even within the span of one show. The room for comparison
and compassion is enormous; the potential for contrast and alternative existence,
infinite. All of life's woes and joys, prospects and inevitabilities are portrayed,
discussed, and vicariously experienced – all within the comfort of one's home. Mary
Douglas describes such a socially constructed reality as "…the admonitions, excuses
and moral judgements by which the people mutually coerce one another into
conformity" (Douglas, 1985).279 So too are tastes, demands, wants, needs and desires
cultivated. As Postman warns, television blurs the realm of entertainment with serious
subjects, with the attendant danger of desensitisation to major issues.

In the knowledge age, an age partially fuelled by global media, brand awareness,
wars and telecommunications, the daily smorgasbord of messages received via the
act of watching television remains massive. Unlike the cornucopia of products
which came to populate shops and department stores, media messages via television
and now the internet fill a person's day; there are no physical boundaries in media or
cyberspace. Making sense of what we see via these media in our mind's eye takes up
very significant amounts of both time and attention, and we rely on a complex
system of sign and symbol processing, from simple language upward, to help us construct a
mental shape and psychological value for that which we apprehend. Sitting and
viewing is often cast as a passive, non-productive act, a leisure 'activity' epitomised
278
Meyrowitz, Joshua (1985). No Sense of Place: The Impact of Electronic Media on Social
Behaviour. Oxford University Press
279
Douglas, M. (1985), 'Introduction', in J.L. Gross & S. Rayner, Measuring Culture, Columbia
University Press, New York.

by the 'couch-potato' syndrome and the obesity, in adults and children, capable of ruining health.
Yet we are forever sitting there, viewing, making decisions, actively affirming and
refuting, comparing and contrasting with what it is we perceive in our
minds. Cultivation theory, for instance, in its most basic form suggests that television
is responsible for shaping, or 'cultivating', viewers' conceptions of social reality.280

The intimate and highly complicit relation with television and its contents provides
us with a mental checklist that addresses all levels of Maslow's needs hierarchy.
Some programming content and behaviours on television are very abstract indeed;
they are phenomenological propositions that only the acts of watching film or
television generate. The fantastic visuals of the cartoon are an example. Through the
creative manipulation of the medium, such as in the application of special effects,
graphics or animations, the very style and manner in which something is shot carries
its own message or amplifies that of the plot or report, as can be seen in what demarcates
amateur from professional footage. Other examples are the vicissitudes of live
television against recorded television, fact against fiction, concrete against abstract – all
offer approximations to the reality of their existence. The relationship with the
'passive' television 'viewer' has also been played with, to the point that the viewer
may be defamiliarised or unsettled in the experience of television by a few simple
production 'tricks'. Sconce (2000) offers the example of an American science fiction
series, The Outer Limits, which opened in a particular style aimed, as was the 1938
War of the Worlds broadcast, at heightening dramatic effect:

“Regardless of its shape or dimension, however, oblivion in The Outer Limits


was almost always mediated by some form of paranormal electronic
technology and centred most immediately on the American family, a scenario
that offered repeated parables about the audiences’ own relationship to the
TV set, and the set’s relationship, in turn, to a vast electronic nowhere.
Whether faced with new beings, mysterious powers, or strange technologies,
the characters in these stories (and the viewer at home) had to struggle against
uncanny and frequently electronic forces that threatened not only to kill them,
but to dissolve them into nothingness. … This assault on the viewer’s
autonomy began in the opening images of the program, which… used its
280
Hawkins, R.P. & Pingree, S. (1983). Television's influence on social reality. In: Wartella, E.,
Whitney, D. & Windahl, S. (Eds.) Mass Communication Review Yearbook, Vol 5. Beverley Hills CA:
Sage.

credit sequence to remove the program from the terrain of "normal" television.
In a medium already renowned for its intrusive presence in the American
home, few television shows have featured opening credit sequences as
calculatingly invasive as that of The Outer Limits. A narrational entity
known as the "control voice" opened each week's episode with those
unnerving words of assurance:

There is nothing wrong with your television set.

Their attention suddenly focussed on the set, viewers became the targets of an
increasingly ominous series of commands and assertions:

Do not attempt to adjust the picture.


We are controlling transmission.
We will control the horizontal.
We will control the vertical.
We can change the focus to a soft blur, or sharpen it to crystal clarity….

For the next hour, sit quietly and we will control all that you see and hear.
You are about to participate in a great adventure, You are about to
experience the awe and mystery that reaches from the inner mind to The
Outer Limits. (p.136-137)

This was a facsimile of the 1938 War of the Worlds panic broadcast, set against the
knowledge that, during the Cold War period in which the series was shown, networks
interrupting programming in this way could signal a three-minute warning and the
end of life as we know it. Media devices, such as television and telephony, extend
and channel in particular ways the limitations, physics and rationalities of the 'real'
world, and yet through their familiarity and 'ordinariness' they are a common part of
it.

The weaving and embedding of the television experience was the concern of Marshall
McLuhan, when he presented media technologies as extensions of our nervous systems.
This was his effort to defamiliarise the insipid nature of media and its technologies,
their seamlessness with other forms of order and consciousness, our embedding of it
within everyday life and its conducts. Media devices provide feeds to the sensory
apparatus which extend what would be normally perceived under natural conditions,
as in the compensations and extensions of microscopes, telescopes and spectacles to
our efforts to see things. But like the prosthesis which eventually takes the place of
the lost limb, they are everyday, initially woven seductively and insidiously into the

fabric of experience. Virtual reality is the apotheosis of this, where computer-generated
data is designed to envelop and feed synthesised stimuli to the respective senses
to create extended, augmented or alternative realities. And it should be noted that all
forms of programming, movie features and media do this to some extent anyway.

Gene Youngblood has discussed the history of experiments aimed at expanding upon
the experiential dimensions of the cinematic experience. Sound and vision have a
long legacy of creating enough stimuli to carry people away, to engage and absorb them
in plot, role, display and performance.281 We now discuss historical events, wars
and suchlike in a familiar way, as if we had been there in a privileged role as
overseer. This blends with personal experiences. We feel we know many things
beyond that which we have experienced directly; these now populate our minds and
recall, and television helps us with this. There is always illusion in mediated images,
sounds, words and writing. From the humble soap opera to the most sophisticated
science fiction production, to the literary classic, to the witty, sharp and snappy
three-minute advertisement, what we see internally is quite alien to the normal
pedestrian course of perceptual and interpretative reality in everyday life. The
minutiae and granularity of everyday perceptions, the mundanity, the physics of
everyday life is missing. According to the tenets of psychoanalytic cinematic theory
the viewer is seen as the subject of a "gaze" that is largely "constructed" by the film
itself, where what is on screen becomes the object of that subject's desire.

The performance-based nature and narrative structure of many shows and features
keep the medium anchored to reality, but at the same time they permit its
abstraction. A striking clue to what I allude to here was captured last century by
artists seeking to explore the core phenomenology of performance, probably
beginning with the Dadaists and surrealists, whose aim was to challenge conventional
wisdoms regarding not just the subject of art but the entire method of its performance
and presentation. They mixed or incorporated what Arthur Koestler termed
'bisociated' contradictory features, characteristics or attributes of objects and people
– such as Man Ray's depiction of an iron, a domestic technology aimed at pressing
281
Youngblood, G, (1970) Expanded Cinema. New York: E P Dutton & Co.

clothes, with a line of nails along the bottom rendering it utterly useless for its
intended function and putting its very purpose into question. Other notables who
raised questions regarding the formulae of culture include the composer John Cage,
who in 1952, at Woodstock, New York, infamously staged 4'33" of silence. The piece
was itself contradictory: the performers sat, lifted their instruments, and played nothing.
There was, however, no complete silence, as there never really is; there was the
coughing, shuffling, sneezing and ruffling of programmes, the extraneous sounds of an
audience in anticipation. Another example is Andy Warhol's 'Sleep', which runs for
over five hours and rather faithfully tracks someone sleeping; it illustrates the
compression involved in the cinematic and televisual format, just as the works of
minimalists such as Terry Riley and Philip Glass did in 1960s experimental music.
Antonin Artaud broke all the conventions of theatre in his Theatre of Cruelty,
where the barrier between performers and audience was removed, and
audience became performer and performer became audience. While these examples
derive from the realm of artistic endeavour, they remain important ideas which
challenged orthodoxies in the reception, acceptance, manner and ritual of
entertainment, in cultural and media form, and in the participatory or interactive
dimensions of that which is authored, designed or depicted.

Radio and television, as means of entertainment or of learning about the world, were in every
way a radical departure from the manner in which these were previously delivered; certainly
more so than a washing machine in its relation to the act of washing clothes, or an
electric cooker as a substitute for a wood-burning stove. So too were radio and television
advertising, marketing products and the behaviours and lifestyles which producers
wished to associate with them. Within television programmes makers could now embed
products in naturalistic scenarios and scenes, making the products look usual
and familiar as well as loved. But radio and television themselves were radical
innovations which needed a home, just as were the telegraph [which never en masse
made it commercially into the home] and the telephone [which did], and the
attachment of a motor to a horse carriage [which also did, in fact commanding its
very own part of the home called 'the garage']. These devices demanded

infrastructures, broadcasting networks, just as railways demanded lines and cars
demanded decent road surfaces, petrol and petrol stations. Media and performance
forms have evolved over time, just as the methods through which we engaged in
agricultural and manufacturing pursuits, but do they become more refined, more
efficient and productive? If agriculture was the control of Nature, and manufacturing
the control of raw materials taken from nature, some would argue that knowledge
and information are the control of people removed from nature and finding
themselves in urban spaces called cities.

But who exactly is doing the controlling? What exactly is being controlled? Media
forms, whether printed, broadcast or digital have been shaped by material and
technical innovation, socio-political forces and ideologies, socio-economic concerns
and socio-cultural influences. Corporations and political parties have long accepted
mass media as a means to effect social and political control: "Control over the
means of informing is the basis for political power." (Smythe, 1994: p.254)282 One
need only consider the vast sums invested to produce and run televised
advertisements during 'prime time' to understand this. This was certainly the
view of sociologists from the Frankfurt School such as Adorno, and of others from
the 'pessimistic' Chicago school of mass media research. Many of them had escaped Nazi
Germany, where Goebbels had grasped that broadcasting and popular culture could serve as an
'opium of the masses', and to this day there are committed scholars and learned persons
who claim that media are capable of manipulating and brainwashing people. It is a
popular moral panic, although television is neither the first nor the last medium to be cast this
way. Technically mediated creativity – the act of viewing, listening to, or reading
certain works rather than others – requires what Bourdieu would term less 'cultural capital'
than the printed word.

Some critics speak of the demise of culture itself, of new illiteracies, of computer
mediated content being somehow less humanistic than the book. Many of these have
even claimed that new advances in media will bring with them more poignant and
282
Smythe, D. (1994) 'A Marxist theory of communication', in Guback, T. (ed.) Counterclockwise:
Perspectives on Communication. Boulder: University of Colorado Press.

pervasive effects, most likely for the worse as far as people are concerned. Striking
amongst these critics is Marie Winn, who in 1977 termed television the 'plug-in
drug’. She more recently revisited this theme in a re-issue of her book, and in this
revision she is even more hostile to the Internet and the World Wide Web than she
had ever been to television twenty-five years earlier.

Winn paraphrases Marx, in a similar fashion to that in which, ironically, Goebbels also did,
when she makes reference to media absorbing the critical thinking of the people:
"Religion is the opium of the people." The suggestion is that participation in religion
or television, or anything else that makes demands on rational thought, imaginative
capacities, attention and consciousness, and which draws in and somehow captivates
attention, has the power to orientate belief systems in ways which are not necessarily
beneficial to the viewer. In short, by watching programmes viewers are themselves
're-programmed'. While viewers are focused upon the obvious message and its ethical
dilemmas, while they digest and translate its myriad codes and dissect its metaphors,
they are subliminally absorbed by its allure or spectacle, and indirect or
unconscious processes are being manipulated. Their unconscious selves, their deeper
motives and drives, are being manipulated in the sense of a hypnotic or drugged
subject; thus the 'plug-in drugged', the 'addicted' television viewers. In The Society
of the Spectacle, published in 1967, Guy Debord parallels the Marxist conception of
religion as opiate with that of the wider sphere of marketing:

The spectacle is a permanent opium war which aims to make people identify
goods with commodities and satisfaction with survival that increases
according to its own laws. But if consumable survival is something which
must always increase, this is because it continues to contain privation. If there
is nothing beyond increasing survival, if there is no point where it might stop
growing, this is not because it is beyond privation, but because it is enriched
privation.

If one does not have the latest fashion as depicted in the advertisement then
one is consciously choosing to forgo a certain status of lifestyle. If one chooses,
however, to participate in this lifestyle then one joins those others who are 'part of
the club', and gains the sense of social harmony that comes with social inclusion (part of
Maslow's hierarchy). Television, with its exposure of celebrity and glamour, its

exposure of inequality and deprivation, acts as a window through which different social groups
can identify where they stand in a world of progress, accumulation, self and societal
development, and modernity. It, and all other forms of branding, advertising,
entertainment and education work in concert to shape decisions and attitudes
regarding the big picture of who we are and what we want, where we are and want to
be, and even provide tips on how to get there. The attributes one associates with a
brand, a celebrity figure, a product or service will relate to how the producer wants
the viewer to perceive it. In essence, perhaps like religion before, and certainly
religious experience before literacy, it denotes existence and a sense of purpose
itself.

Just as earlier economic ages evolve their techniques and styles and modes of
production, the purposeful acts of 'going' to the cinema and ‘watching’ a movie, or
'listening' to the radio have not been supplanted by the advent of a new behavioural
category – that of television 'viewing'. Neither were they, as behaviours or
practices, superseded by the arrival of video in the public domain, nor by video games,
nor by Internet-enabled PCs. All these activities persist to date as marketable ways in
which we develop knowledge of the world and 'kill time'. The styles and modes of
their consumption have changed, such as cinemas during the 1980s converting to
multiplexes, but the core act of amusing oneself via devices has not. A whole range of
antiquated and largely forgotten 'dead media' precedes the advent of digital
interactive television, and as we will discuss and showcase later, some previous
attempts using different technologies and systems to make viewing interactive have
been tried, and have failed. Sometimes poor marketing, sometimes the limitations of
the experience, interface and/or device itself, sometimes lack of software, content or
programming, sometimes poor service – all contributed in part to failure and
commercial 'death'.

In the April 1928 edition of Popular Mechanics magazine (subtitled "Written
So You Can Understand It") the headline read "Television for the home." The
beginnings of mechanical television can be traced back to the discovery of the
photoconductivity of the element selenium by Willoughby Smith in 1873. In 1884,
Paul Nipkow sent images over wires using a rotating metal disk technology with 18
lines of resolution. Television then evolved along two paths, mechanical based on
Nipkow's rotating disks, and electronic based on the cathode ray tube. American
Charles Jenkins and Scotsman John Baird followed the mechanical model while
Philo Farnsworth, working independently in San Francisco, and Russian émigré
Vladimir Zworykin, working for Westinghouse and later RCA, advanced the
electronic model. Viewed superficially there often appears a considerable chasm
between the design evolution of 'harder' systems of technical products and potentials
(purposeful acts of design like the television receiver – the ‘box’), and the
development of 'softer' systems of human societies and individual interests (which
constrain and denote purposeful acts, like the art of broadcasting and media content).
This relates to the notion of television’s 'double articulation' as a technology and as a
medium (Williams, 1974; Silverstone, 1994). But concurrently, or more precisely
through their interaction, the television set manufacturing industry and the
broadcasting industry evolved in their early stages in a symbiotic way, coming
to take on new, defining qualities and characteristics, in a similar fashion to the later
partnership between Microsoft, with its operating systems, and Intel, with its
chips. As television-the-technology developed at the broadcasting and receiving
ends, so did broadcasting-the-performance, the human element and agency in the
system. The Baird system gave way to the Marconi-EMI system due to technical
limitations, with the BBC originally running both in tandem to see which one was
best. In essence this 'battle of the formats' took its place alongside others which have been
significant in the networking of homes: AC versus DC electric mains, VHS versus
Betamax, Apple Macs versus Wintel – the epic 'battles of the systems'.

"[T]he battle of the systems" had become far more complicated than a
technical problem awaiting a simple technical solution; it ended without the dramatic
vanquishing of one system by the other, or a revolutionary transition from one
paradigm to another. The conflict was resolved by synthesis, by a combination of
coupling and merging. The coupling took place on the technical level; the merging,
on the institutional level. (p.121)

The efficiency of each technology is greatly enhanced by the availability of the other
complementary and interdependent technologies (David, 1988; Antonelli, 2001).

The present study, while perhaps not on the pervasive level of these 'battles', was cast
as one which would lie between domestic, family-orientated, 'walled-garden'
interactive television services made available through cable TV networks and the
'wild', 'open', 'unregulated' internet available through the landline telephone system.
The 'walled-garden' approach limited the options but would provide a family-friendly
environment of content. More than a matter of the nature of the content in terms purely
of its message, this concerned the nature of the content which could actually be
conveyed by each system. The bandwidth, or capacity to convey data such as video
material, was at the time available only via the cable system. However, the landline
telephone, or 'twisted pair', while having a much greater reach, connecting many more
households than the so-called cable islands (including remote rural households which
were not commercially viable to link to cable), did not have enough capacity to convey
video. At the time of the present trial another trial, run in Ipswich by British Telecom
(BT), was aimed at providing interactive television over traditional landlines.
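To give a rough sense of the gap referred to here, the following back-of-the-envelope sketch uses typical figures of the era; the numbers are general assumptions for illustration only and are not drawn from the trial itself.

```python
# Illustrative only: typical mid-1990s figures (assumptions, not trial data).
MPEG1_VIDEO_KBPS = 1_500      # roughly VHS-quality MPEG-1 video stream
DIALUP_MODEM_KBPS = 28.8      # a fast consumer dial-up modem of the period
CABLE_DROP_KBPS = 2_000       # order-of-magnitude capacity assumed for a cable drop

def can_carry_video(link_kbps: float, video_kbps: float = MPEG1_VIDEO_KBPS) -> bool:
    """Return True if a link could sustain one real-time video stream."""
    return link_kbps >= video_kbps

if __name__ == "__main__":
    shortfall = MPEG1_VIDEO_KBPS / DIALUP_MODEM_KBPS
    print("Twisted-pair dial-up carries video:", can_carry_video(DIALUP_MODEM_KBPS))
    print("Cable drop carries video:          ", can_carry_video(CABLE_DROP_KBPS))
    print(f"Dial-up falls short by a factor of roughly {shortfall:.0f}")
```

On these assumed figures, a dial-up twisted pair falls short of real-time video by around a factor of fifty, which is why video delivery was confined to the cable network at the time.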

Whereas television sets – the technology – are increasingly realised today as flat
plasma or LCD screens on a wall, until recently the set was very much an imposing,
large veneer box sitting prominently in the living room – a large piece of
household furniture really – which was used to house the huge cathode ray tube. The
reach and the siting of the early transmitters denoted who could receive and who could
not. Cable television originated in the United States in 1948 to enhance poor
reception of over-the-air television signals in mountainous or geographically remote
areas. It also came as a solution to broadcasting in built-up zones such as New York
where aerial reception was also difficult.

Performance issues have also evolved. When television began, those who appeared

on it acted in a rather wooden manner, standing in front of the microphone. This was
due to their radio broadcasting training and to the technical limitations of the pick-up,
which had not yet been redesigned for free movement.

In fact the discussion of the evolution of television-as-technology, and of television-as-
technology as procreator of broadcasting – television-as-medium – can be used as an
analogy for many kinds of social, economic and technical systems. The most obvious
of these is the analogy between mass media – broadcasting – and mass manufacturing. For
instance, it is obvious that humans enliven economic and technical networks of
interconnected objects and meanings. Their agency lends purpose and meaning –
value – to activities. In the beginning of the telephone, before its most singularly
defining function was realised – that of transferring voice-to-voice messages – it
was used for multiple and sometimes highly dedicated functions, such as connecting
people with a butcher shop (and only that shop) or the broadcasting of church
meetings and services (Martin, 1991). In a sense, unlike television and radio
broadcasting, the telephone system relies almost entirely upon its users to create meaning
and develop value in what it does. It does not rely upon actors, presenters,
comedians, hosts, news anchors, effects and so forth – experts – it relies on the needs
of individuals wishing to speak, organise and socialise with other individuals.

Some functions or modes of thought persevere, whilst others fall into disuse and pass
away – we need only consider the legal system to understand this. While there has
always been a close, reflexive relation between technology and the human condition,
a question remains: Do new technologies truly open new behavioural possibilities,
new interests? Can they close them? Can they create them? What is the reach of the
transforming power of either society or technology, in what is essentially best
understood as a co-shaping process? As Jackson Pollock alludes: “new techniques,
new needs.” The unconscious processes of the human mind have their correlate in
the unseen process of production, distribution, payment; so much is not available
directly to the end-consumer, the end-user. These questions form the core of issues
raised and addressed in this study.

Broadcasting, mass advertising, production, and distribution appear as critical in the
'age' of 'personalisation' and 'customisation' and social networks as they ever were
before the advent and growth of digital networks or 'e-' or 'm-' commerce. But the
means by which people may come to know, apprehend, access and pay for goods and
services is changing. Approaching the millennium one would have had difficulty
shielding oneself from the abundance of discourse regarding the 'potentials' of the
Internet, as nearly every day some source or another claimed it would change every
aspect of how we work and how we play and, echoing earlier citations regarding
television's intimate relation in shaping our worldview, how we even view ourselves.
Even management notables such as Peter Drucker recognised the internet's potential
in:
" . . . profoundly changing economies, markets, and industry structures;
products and services and their flow; consumer segmentation, consumer
values, and consumer behaviour; jobs and labour markets. Its impact is said
to be even greater on societies and politics and, above all, on the way we see
the world and ourselves in it." (Drucker, 1999: p.47)

This thesis documents as its empirical case study the technical, social and
organisational issues arising from a particular attempt to augment what we might
describe as the 'present functioning of television'. More precisely, this meant
augmenting the traditional broadcast model of television, namely that homes
possessed a television receiver, and all programming was broadcast to all those
capable of receiving a signal simultaneously. It also speaks something of the
experiences of those shaping it, in particular, designers, users, managers and partners
and considers trials as experiments, as supposedly naturalistic environments or
‘biospheres’ of technical, social and business development. Ulrich (1983) espouses a
new paradigm of planning based on a purposeful systems approach. The approach
highlights the importance of designing “for the development of intrinsic motivation
and critical reflection on the part of those who will have to work and live with the
designs” (Ulrich, 1983, p.335). They replace the lab as the site where tinkering is
done with products and processes aimed at the mass market. They are information
intensive environments, and knowledge and research methods are tried and tested by
its exigencies. While at the time of the study not all homes possessed a computer,
and there was no real guarantee that they ever would, nearly all possessed a television;

augmented with new technology, the television promised new functions, new ways of opening home life
to public intrusion and intervention, new ways of ordering and selling goods and
services, and new ways of communicating in communities and beyond. In order to
provide these, new literacies would have to be learned: first by the technical developers
bootstrapping pieces of technology together, and secondly by the commercial and
governmental agencies who would use and create for the new medium. And once
investments are made in the development of new literacies, these provide their own inertias to
change. A final social group were of course perhaps those most important in the
entire constituency of actors involved in the trial - those who would finally pay for
these services – the end users, the consumers or customers of the system. Their
feedback, impressions, behaviours, preferences, decisions, likes and dislikes would
guide development of software and hardware elements, of interfaces and functioning.

The viewer's ability to alter or change broadcast material is slim or negligible. Considering the
ubiquitous nature of television and its primacy within everyday activities, the prospect
of changing this was viewed by the developers as implying significant and
radical change, not only to the technology, but to something that would weave pervasively
into the nature and fabric of everyday commercial and domestic practice. In 1993 a British
high-technology firm, Acorn Computers Group plc, based in Cambridge, UK, with a
substantial pedigree in advanced computer design, was presented with a proposition
by a global media conglomerate: could it create the future of domestic media in
the shape of a networked interactive television (i-Tv) service?

The Cambridge Interactive Television Trial, henceforth referred to simply as the
'Cambridge trial', was launched by Acorn On-Line Media (Om), an operating division
of Acorn Computers Ltd spun out from its parent company for the sole purpose of
developing for the new medium of interactive networked multimedia, including digital
television, and working in collaboration with a series of technology and content
partners. Their response was a particular technical and service solution, a particular
configuration of technology and services. In order to raise the necessary venture
funding for this highly ambitious project they were required to trial their technology
– that is, to test its programming and technology at various

levels. It had to prove its prowess in delivering a wide range of digitised content and
services. Acorn's chief contribution to the system, which had yet to be built, was a
digital 'set-top box' (STB): basically a computer system optimised and refigured to
use a television receiver as a screen, to take a remote control rather than a keyboard
as its input device, and to connect to a standard cable TV communications network
via some sort of modem or network card.
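The description of the set-top box above can be read as a small architectural sketch. The class and field names below are hypothetical and introduced only for exposition; they do not reproduce Acorn's actual design or software.

```python
from dataclasses import dataclass

@dataclass
class SetTopBox:
    """Hypothetical summary of the STB role described in the text."""
    display: str = "household television receiver"        # the TV acts as the screen
    input_device: str = "infrared remote control"         # remote control instead of keyboard
    network: str = "cable TV network via modem or network card"
    role: str = "receive, decode and present digitised content and services"

    def describe(self) -> str:
        return (f"Shows on: {self.display}; controlled by: {self.input_device}; "
                f"connected over: {self.network}; role: {self.role}.")

print(SetTopBox().describe())
```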

But an STB is only a single, though nevertheless key, component of an overall i-Tv
system. Building an operational system required [social and economic] alliances with
partners who would provide the other necessary [technological] components,
including the crucial communications infrastructure and content material. Only with
technologies and commercial firms working together, in concert, could a
communications system capable of delivering services such as video-on-demand
(VoD), as well as other kinds of content and screens, be made possible and available.

The object of the Cambridge trial was not purely to explore the technical provision of
interactive entertainment, but just as importantly to explore new services, new
content, commercial practices, organisational principles and consumer responses
over several distinct trial phases. Each phase would witness iterative improvements
to the technology and services, aiming towards the launch of a full commercial
deployment. This was considered as the prototype and model for systems to be
deployed in other cable networks worldwide.

In a purely technical sense each phase would represent an increase in the stability and
robustness of the system in delivering content. In a media content sense, each phase
would also witness the development of a more comprehensive range of consumer-
attractive services, content and features. This would then drive growth in an
organisational sense with the trial involving more content and service partners
providing additional content, services and technology.

Each phase would also mark out the trial's development in a social sense through the
involvement of ever more heterogeneous groups of trialists – users of the prototype

system. In phase one the trialists would consist largely of the interface and STB
designers and engineers themselves; they would make corrections to the hardware.
Phase two would see service partners' staff connected, and trial content and services
would be tested; they would begin to test the usability and feel of the system as a
consumer electronic device. By phase three, with the technology robust and delivering a
commercially attractive variety of content and services, trialists would begin
to include members of the general public. They would provide ideas on how to
cost and charge for services; in fact, a nominal fee was expected to be levied at this
stage for participation.
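The staged plan just described can also be summarised as data. The structure below is a hypothetical restatement of the three phases for clarity; the field names are illustrative and are not Om's own planning terminology.

```python
# Hypothetical restatement of the trial's three phases (illustrative only).
TRIAL_PHASES = [
    {"phase": 1, "trialists": "interface and STB designers and engineers",
     "focus": "correct and stabilise the hardware"},
    {"phase": 2, "trialists": "service partners' staff",
     "focus": "test trial content, services, usability and consumer 'feel'"},
    {"phase": 3, "trialists": "members of the general public (nominal fee)",
     "focus": "pricing, charging and broader consumer response"},
]

for p in TRIAL_PHASES:
    print(f"Phase {p['phase']}: {p['trialists']} -> {p['focus']}")
```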

As such the trial was, for Om, intended as a kind of scaffolding which would link the
ambitions of the firm and its technology to the wider commercial world of media. It
would do so, from the social and organisational perspective, incrementally and
exponentially – thereby reducing risk. It would make revenues from proprietary
software and the sales of STBs. Issues of content and service provision would
devolve to those best suited for handling this [much in the same manner that the
BBC, as an established radio broadcaster fostered television-as-technology].
However, a number of technical, business and funding opportunities as well as
hurdles required that this strategic vision had to alter.

The trial officially began on the 30th of September 1994, drawing to an end in April
1997.

The structure of this thesis

This thesis divides into two main sections. The first four chapters offer a discussion of
the literature from a wide range of fields that were combined to inform the thinking
behind this study. The second section focuses much more specifically upon the case
example of Acorn and the Cambridge trial.

The review of the literature, together with the tackling of issues arising
through conducting the study, led to the development of contextual usability (CU).
This is a framework which aims to place the notion of 'usability' – i.e. the ease
with which something may be used – in context and in alignment with other

elements contributing to an overall (whole) experience or gestalt of a design product
or service. Beyond simple collaboration, which is the result of an alignment of
interests, an emergent network can create new and complex coherences out of
divergent interests.

CU’s analysis can be applied to both designer and user perspectives. For instance,
many of the issues raised by the study were plagued by epistemological and
methodological problems surrounding the notion of 'usability'. Predominantly,
usability as an ethos, method or technique at the time of writing (the early 1990s) had a
focus mainly upon workplace computing. Comparatively, work done on domestic
ICT consumption and use was very sparse, apart from the more sociologically
orientated and diffuse studies collected in works such as Consuming Technologies: Media
and Information in Domestic Spaces (1992), a compilation of essays edited by Roger
Silverstone and Eric Hirsch.

Bearing in mind that the levels of penetration of PCs or home-based computers in the
early 1990s were nothing like those towards the end of the decade, workplace
computing had been the subject of a vast range of studies as opposed to the more
subtle, complex and nuanced contexts of domestic computing. This was reflected in
the papers being accepted at CHI and similar venues.

This is reflected in the present work through the lack of direct primary source material,
and in the attempt to tackle the broad picture of what television 'is' and what is required,
in a design or innovation management sense, when one engages in a project to change
it. So the first section of this thesis details a smorgasbord of frameworks, ideas and
theories which tackle the rich interplay and interaction characterising the relation
human beings have to media and technology.

The first section opens with the present chapter, and my purpose here is to spend
some time considering the ubiquitous nature of the notion 'interactive' in 'interactive
television', considered as a category in itself. With its roots in cybernetic theory, and
following the theme of the thesis so far, from a human interactionist perspective,

interaction is inseparable from experience. Human conversation requires effort –
listening, talking with consideration, taking the occasional risk with direction – and
offers the reward of mutual enrichment. What we interact with, who we interact with,
how we interact with them, and what we use to interact with them shape the dialogue
and even how we come to see the world. Epistemologies, beliefs, opinions and
attitudes are not freely formed, nor do they arise ex nihilo; they come to us and are shaped
by granular insights, implicit knowledge and prejudices, themselves constantly under
review, reinforcing or refuting what we hear. Following the cybernetic notion of
feedback loops, it is clear that ‘interaction’ either shapes experience, that is it is
embodied, as in the development of knowledge or expertise, or creates and enables it,
as in the outcomes of social or technical interaction. This will entail a range of
implications; experiential, interpretative, technical, business, learning, behavioural to
name but a few. Actions and interactions can involve presuppositions about the
environment in which those actions and interactions take place, and to a large extent
this is certainly the case with interactive media where the key environment is the
social and geographical ‘home’ – in all that that category brings with it. As
interactions often occur within a designed and bounded environment of cognitive
possibilities that also shapes and constrains behavioural potentials, the available
interaction set – which can include interfaces to devices as well as intangible social
elements such as 'household and family rules' – affects any emergent properties,
discoveries and/or expertise regarding a technology. This was
certainly the case in the study of users on the trial, where overall there was some
suggestion that those most articulate regarding the propensities of the technology
were those who had the strictest regime regarding its use, and who in practice used it
least. They contrasted with those whose account said little about the technology but
in practice used it much more liberally. When reporting to management boards it is
more convenient to recite the accounts of those who had sophisticated and highly articulated views
and opinions, even if what they said did not reflect their direct experience of use.
This brings into question then, what is fed back into development processes exactly?
How for instance is marketing and product development to some extent shaped by
stories, opinions, even myths? User research is rarely interfaced directly with design
and engineering but is passed and considered through a managerial level. In some

cases this is enforced due to the politics of the matter. Designers and engineers have
been shown not to like, or approve of criticism of their creations, and may
misinterpret feedback from users due to bias towards the technology. The same can
be true at managerial level as well. This rhetorical notion of technology development
came to be a pivotal idea in the present work, and as we shall see later convincing
examples of this surfaced in the fieldwork study.

It would appear that the nature of interactive design is always such that some
opportunities are opened whilst others are closed, and the anticipation of opportunities
forms an inherent part of the representational and symbolic aspects which, above and
beyond actual use, characterise these systems and their perceived as against actual use
values. What comes to be opened, and what is closed, is ultimately the prerogative of
those designing and developing the technology and system. However, their
choices and options are in turn shaped by a number of distinctive influences of which
the interoperability of available technical components (technical choices) and client,
partner company (social) feedback are only two examples in an entire constellation,
constituency or network.

Structure of the thesis

In respect of the present study there are a number of themes or contexts, or wicked
design problems, to be addressed. The first concerns television. The argument here is
that television is no ordinary and straightforward technology, because of its prominence
in everyday life and in the home, and because it serves as a means for people
to assert and create identity and to become informed regarding the world at
large beyond their homes – how, in other words, they develop their worldviews or
Weltanschauung. It runs upon the design efforts of many, many creatives. It is a
cultural phenomenon, and as a subject of study it commands a massive literature which
straddles many fields. Television, for instance, in opposition to a much simpler tool
such as a tin opener, has its 'double articulation': it is both a conveyer of messages and
a technological artefact (Williams, 1974; Silverstone, 1994). It may also represent the aspirations and
identity of a particular culture (such as the BBC as a hallmark of British society), be

a 'friendly' voice when one lacks company, be a child-minder (the 'one-eyed
babysitter'), and so forth. Thus it provides a range of tangible or explicit
messages, bound to a plethora of implicit, tacit meanings and uses. Television as a
functional and 'ideological octopus' (Lewis, 1992) creates a new set of design
imperatives which have to be met in order to develop a successfully working system.
It also raises a number of legacy issues, such as the moral and ethical dimensions
of programming, 'effects', and difficult quasi-philosophical problems regarding just
how changes in media impact the way messages are conveyed, received and
understood. On a much more practical level, the technology developers already had a
long-standing relation with the BBC (the Acorn BBC Micro), which had already used
the television as a computer monitor.

What is relevant to the present study is that, for many of the participant companies
involved in staging the Cambridge Trial, the home had to be de-familiarised as an
environment or space in order to be considered as a set of design problems (what Zygmunt
Bauman calls 'defamiliarisation'283). 'Defamiliarisation,' he explains in Thinking
Sociologically (Blackwell, 1997), 'refers to an important sociological strategy:
unpicking the "taken-for-granted" aspects of the social world until we can discern
their underlying structures and rationales.' This idea also links to the notion of
'anthropological strangeness', which has a lineage through the phenomenology of
Alfred Schutz (1944, p.500) and the ethnomethodology of Harold Garfinkel (1967,
p.9). In a sense the home had to become something of a Mars or a desert island in order to
show or highlight what needed to be accomplished in a technical or in a service
sense. It was thus a key design concept in this study, along with the idea of domestication
(Silverstone). Economic historians are in agreement that the onset of the
Industrial Revolution is the most important event in the history of humanity since the
domestication of animals and plants. Silverstone used domestication as a metaphor
for bringing things, technologies, into the home [from the wild] and living with and using
them. Domestication as an experience links to what psychologists call

283
“I’d say that social science has twin roles to perform. Those of “de-familiarizing the familiar”
(debunking its alleged self-evidence) and “familiarizing (taming, domesticating, making manageable)
the unfamiliar.”” “De-familiarize the familiar”and “Familiarize the unfamiliar” Michael Hviid
Jacobsen and Keith Tester [April 2013] [source: http://www.the-essayist.org/2013/04/de-familiarize-
the-familiar-and-familiarize-the-unfamiliar/]

habituation and what economists call declining marginal utility.

The process of defamiliarisation also then had to apply to everyday life – standard,
familiar, everyday routines and roles had to translate to technical features and
functions in some sort of reasonable and meaningful way. Otherwise they simply
would not find use. What we do every day would be augmented or facilitated by
technology and service options. Clearly, there are typical examples; categories
regarding the home, and regarding what particular individual do on a daily basis.
These had to be identified and considered relative to what the technology could do
and provide.

Another major theme was that since their inception in the 1940s, computers have
been variously described as general purpose machines largely due to their
application in solving many different kinds of difficult mathematical and arithmetical
problems. They were open and flexible in function and purpose with respect to
performing calculations, and they were more accurate than human calculators. Their
use was largely constrained to scientific, military or business purposes. However, in
the 1960s we can witness the advent of improved interfaces, processor speeds,
storage media and digital communications leading to personal computing in the late
1970s and 1980s, and to multimedia and the rise of digital networking in the 1990s. The
computer became refigured as a much broader device providing the means to
present and access information, engage with entertainment, and order physical goods
'online'. As such, computing and communications were becoming of interest to a much
wider set of business and institutional players and to an increasing range of private
users beyond enthusiasts (Haddon). We should remember that in 1983 Britain
proudly boasted the highest level of computer ownership in the world (Lean).
About one in ten UK homes, he writes, had a computer, more than in the US or Japan.
At the same time, the UK had several computer manufacturers, one of which was making
more of them than any other company in the world – Sir Clive Sinclair's Sinclair Research.

Advances in graphical interfaces, sound cards, databases and new multimedia
authoring software were enabling a computer to be anything from a games machine
to a web browser to a communications device, as well as offering unprecedented abilities
to convey video, graphics, games, text, animations and photographic material. This
context then, was the development and evolution of computing and its composite
technologies of processing (i.e. CPU), storage (i.e. hard disks), applications and
operating systems, and abilities to convey sound and vision (sound and video cards)
and networking. It was not social scientists building the technological system,
authoring the content and services, and fostering the commercial partnerships, it was
a technology company. Their core competencies were chip design, personal
computers, and software development. Chief amongst the technical dimensions and
challenges were harnessing the new multimedia potentials of computers, computer
communication networks, or more precisely perhaps, distributed interactive
multimedia, to provide for, author and convey media content and services.

Another context was the rise of the World Wide Web and the internet, which was
growing at an unprecedented rate. However, at this time in the early to mid-1990s
the internet-connected PC and dial-up remained the preserve of the specialist and
'technically-minded' user. PCs were sociologically understood at this time as more
suitable for a predominantly fringe, male enthusiast audience. Much of the
unregulated content on the internet was created by such an audience and was often
unsuitable for a family audience, and other issues to do with PC maintenance and
configuration were considered far beyond the lay audience. The view was that this
would develop, but develop in parallel to pre-ordained, professionally authored, packaged
services more convenient, more apt and more appealing to a general or family
audience. So accompanying the hard technology, the chips and other components
which permitted functions, was a revised set of moral imperatives, usability issues,
visions and expectations regarding who would use the new device, how it could be
used and what they would use it for.

A further context was the development of creative expertise which would be needed
to develop the new content and services. At this time there was a paucity of those who

could develop the multimedia content and services, the databases and other
technologies. This was a competency which had to be learned and built, primarily
through getting involved. This again was concerned not only with what was needed
from the standpoint of technology development and the innovation of infrastructures,
system components and their interoperability and user interfaces, but also what was
needed in terms of compelling service offerings, and what worked visually and
graphically.

A further context concerned the business dimension, not only of the technology and
trial (i.e. who paid for what, what they paid for, and what benefit was expected to
derive from this investment), but also how business could structure itself around the new
potentials offered by the system if it were to roll out to the general public. In a sense
the users of the system were not only what is termed end-users – i.e. members of the
general public – but also businesses which would use the system in order to deliver
their content or services, such as video programming, games, or online shopping.
Nathan Rosenberg noted that the inventors of a new technology are often not good
predictors of how it will be used (Rosenberg, 1994). The inventors of the
telephone, radio, laser, steam engine and VCR did not foresee the impact their
inventions would have. The inventor of the radio thought it would be used mostly by
steamship companies. But whether this was understood by the technology firm
leading the organisation of the Cambridge trial or not was not clear. The likelihood is
that they set their own boundaries upon what they would commit to in terms of
technology, development and organisation. The visions and predications of the
overall service offering – that is, what would been seen and experienced by trialists
at home – was then an open-ended affair, dependent upon the individual
contributions of service industry trial participants, each of which were themselves
limited in their knowledge of what they were doing. They were a plethora of cross-
sectorial industry players with very different business and organisational agendas and
operational philosophies and cultures. Banks, market researchers, advertisers,
communications regulators and retailers often have quite different worldviews and
organisational cultures, quite different and particular views of the public as subjects,
customers, viewers, users and citizens etc. - some of which were conflicting and

sometimes strangely convergent. As such the study outlines how the players and
users involved in the development and trial learned the ‘new vocabulary’ of
networked digital multimedia and digital business. It was also about these firms
finding common ground and working together in ways and manners never previously
required nor perhaps practiced. It prompted many questions regarding for instance,
how online payments could be made for services and goods, or how goods could and
would be delivered. While today many of these problems may seem settled in some
respect, at this time they posed relatively new dilemmas in the organisation of
business.

A final but certainly not minor context arises in the way the technology offered new
means through which to research users. The invisible hand, so to speak, of digital
networks is that simply by using them – navigating pages, watching certain
programmes (as opposed to others), pressing buttons, filling out forms and so on –
people generate data. Some of this could be meaningful for improving sales, targeting
advertising, or even improving aspects of usability. This aspect of digital networking
was clearly seen as valuable to participants who would pay to be part of the trial, and
of course such system-logging of user behaviour and collection of user feedback [and
content] has since become a defining aspect of online use. At the time of the trial it
was only beginning to be realised.
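As a minimal sketch of what such system-logging of user behaviour might look like, consider the record below; the schema and field names are hypothetical illustrations and do not describe the trial's actual logging.

```python
import json
from datetime import datetime, timezone

def log_event(household_id: str, action: str, target: str) -> str:
    """Serialise one remote-control interaction as a JSON log line (illustrative schema)."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "household": household_id,   # which STB/home generated the event
        "action": action,            # e.g. "select", "play", "form_submit"
        "target": target,            # the page, programme or service touched
    }
    return json.dumps(record)

# Example: a viewer choosing a video-on-demand title from an on-screen menu.
print(log_event("home-042", "play", "vod:example-film"))
```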

Perhaps a parallel strand to this particular context was the idea of the trial itself as a
means to generate not only research but also creative action on behalf of those firms
who were involved. Trials, experiments and tests invite pondering regarding just how
naturalistic they are, how close to the reality of how products and services could or may be used in real life.
Any research design falls somewhere on a continuum:

 Controlled experiment
 Quasi-experiment
 Longitudinal design (where the same participants are measured in each wave)
 Trend Design
 Cross-sectional study

 Naturalistic (descriptive) field study

At the controlled-experiment end of the spectrum 'causal effects' are what is being
sought, whereas at the naturalistic end it is richly layered description of
phenomena.284

Early access programs, alpha and beta programs, pilot programs, open source
software, and field trials are examples of customer feedback programs. These
programs have both quantitative (e.g. find and fix bugs, assess interoperability,
evaluate performance, etc.) and qualitative goals (e.g. market readiness, user
acceptance, feature set completeness) The thought being that the more they are a
simulacrum, the more ‘pretend’ they are, the less veracity they will possess, the less
telling they may be, the less value they have for informing design and re-design.
Their capacity to produce useful data and information will likely degrade the more
they depart from the situated use of a product or service. This does not preclude the
idea that they may open up some alternative avenues of innovative activity, such as
new features or functions. What is put into creating the simulacrum, its granularity,
wholeness, completeness if you will, has a direct impact on what can be learned.285

The two strands or faces of the system and the trial – the intangible, social, symbolic
and communicative aspects, and the more tangible issues of hard technology options,
wires, bits, pixels, interfaces, service authoring and development, and technical
system components – made for wicked problems. As the study unfolded there were
many confluences which upset plans and anticipations. The various contexts – the
home, everyday life, television, the internet, consumer research, human computer
interaction studies, ecommerce, the organisation of business, the creation of digital
services and content, and technical innovation – each intertwined, creating imbroglios
of action, reaction and transaction. So it was not only the notion of use and users, or

284

285
“...new technologies may not only lead to new arrangements of people and things. They may, in
addition, generate and "naturalize" new forms and orders of causality and, indeed, new forms of
knowledge about the world... technologies may generate both forms of knowledge and moral
judgements.” Akrich (1992) p.207

consuming and consumers, that was being addressed in this trial and its conduct, but
that of the use context – chief among these, socially, were the specific use contexts
of ‘home’ and ‘everyday life’. They were the focus which guided development
beyond that of an abstract fixation upon ‘users’.

The context promoting use, or the context promoting usability, usefulness, or usage,
may vary. Of course ‘users’ are critical to providing experiential feedback regarding
which aspects of the operating conditions and environment and their expertise and
previous experiences influence the present situation of using, but observation and
system tracking are also important. So, put bluntly, rather than the 1990s paradigm of
homogenising a social group, say secretaries, so that they become representative
users of changes to a word processing application interface – that is, a focus on
manipulating technology – or working out a training program which quickly trains
secretaries to use the word processing application – a focus on changing the users –
the argument here is for careful observation of, and attention to, changing the context
of use – which may include outsourcing mail shots, or going paperless. A
focus upon context is a move from a myopic concentration upon the device or
technology – the thing – or a parochial view of the person and their assumed role –
the user – to a view of the circumstances, histories, geographies and locations,
motivations, and possible outcomes of use – the contexts.

A crucial mission of the artist, educator, and activist is to actively participate in this
process. To change the conditions of a social arrangement, we must change the way
that that arrangement is imagined. But since the way that people imagine an
environment is both determined by and determinative of the kinds of practices which
take place within that environment, this intervention on the imaginary must
necessarily also be an intervention on the real. The feedback loop between the
imaginary and the real is fundamental to the structure, function, and evolution of all
human-built synthetic environments, including the control of nature. In terms of
technical innovation, this means attending to what designers, producers and
marketeers believe are the characteristics, attributes, features, and functions of what
they devise, develop and build; who they imagine the user to be; what they imagine
the consumer to want or need; and what they imagine the person to think or do. It
then means carefully checking this against a raft of exploratory methods aimed at
understanding the limits of the technology and of the person and their interests with
respect to it. This said, such checking is longitudinal, and also dependent upon other
things happening in markets, culture, the economy and so on.

The present chapter considers interaction with respect to generating knowledge, as
well as its influence in processes of innovation. With this in mind I focus particularly
upon the interaction of new ideas and innovations which must then 'interact' with
existing regimes, received wisdom, constraints and enablers of innovative activity. I
then place interaction more specifically within the technological context of the
Cambridge system, and consider the problematic nature of interactivity as a defining
quality of new media systems such as i-Tv.

In the following chapter, Chapter 2, the theme of interaction is brought forward by
considering emerging frameworks whose aim has been to map technical and social
elements as they combine to give rise to products. These frameworks, by and large,
owe much to general systems theory (GST), and related areas such as cybernetics
and complexity for their development. These are also areas that featured in the
present study as they influenced the thinking of one of the senior managers at Om as
he tackled the problems of organisational development and learning as well as more
precisely learning from the user (through trials and system logging) during the trial.

Mapping the interaction and interrelation of technical and social elements is
especially relevant in the case of the development of communications systems as a
user's interaction with a system inevitably entails explicit or implicit interactions
with elements, actors, networks, constituents and constituencies beyond the interface,
or the technical, business and human resources of any one firm. I have already
mentioned the analogy between these social, technical and economic systems and all
others. As such this chapter implicitly refers to interactions behind the interface or
screen – i.e. all the actions, interactions and transactions which culminate in creating
the technology and networks, as well as sustaining and providing technology, content
and services for end-user consumption.

After presenting an outline of GST and related concepts, I outline two recent
frameworks – actor-network theory (ANT) and sociotechnical constituencies. Both
of these frameworks have arisen under the wider aegis of Science and Technology
Studies (STS) and aim to map out the unique gestalt of influences, capabilities and
perceptions that culminate to particularise technical products and systems. They are
also preoccupied with the organisations which come to support and maintain their
development and use. The chapter closes by discussing their relative merits and
shortcomings relevant to the present study.

Chapter 3 shifts focus to the individual level, and considers how design products
themselves come to be apprehended through interaction with concrete and symbolic
attributes. This chapter introduces CU as a framework which aims to place the
situations and circumstances of 'use', and the construction and experience of
'usability', in context with other social and experiential elements contributing to an
overall (whole) experience of a design product or service. These obviously extend in
front of the interface or screen, as they focus more upon the user's background,
interests, predisposition and social circumstances and constituencies.

Chapter 4 offers a discussion of methods that can be used to bridge the gap between
designers and producers and consumers and users. It details recent trends towards
interpretative, naturalistic, ethnographic styles of inquiry (such as used in this study).
It also considers the new potentials for user research through registering changes
within digital networks initiated by use and users (system-logging or 'sys-log').

In the second section, focussing chiefly upon the case itself, Chapters 5 and 6 adopt a
more typical case study style of reporting and outline, first, in Chapter 5, the
development of Om out of Acorn Computers Ltd. Chapter 6 moves towards the more
specific organisational, technical and social development of the Cambridge trial and
system. Chapter 7 explores what I term the 'process' of the research and presents, in
the manner of a 'reflexive ethnography', a more chronological set of interactions I had
with managers, designers and trialists that featured in the trial. It
highlights some of the tensions as well as the difficulties experienced personally with
the research process, and of the organisational and knowledge generating problems
involved in developing a concerted approach to conducting a meaningful user
research programme in the trial itself.

In reflexive ethnography, the ethnographer becomes part of the inquiry (Kincheloe
& McLaren, 2000). Reflexive ethnographers use their own experiences in a culture
“reflexively to bend back on self and look more deeply at self-other interactions”
(Ellis & Bochner, 2000, p. 740). According to Ellis and Bochner (2000), reflexive
ethnographies “range along a continuum,” from texts that focus solely on the writer's
experiences, to ethnographies “where the researcher's experiences are actually
studied along with other participants, to confessional tales where the researcher's
experiences of doing the study become the focus of investigation” (p. 740).

Chapter 8 concludes the book by drawing together the various themes and
propositions covered in the theoretical and case chapters. Contextual Usability is
itself placed in context with sociotechnical constituencies and the subsequent fusion
of frameworks is suggested as a means by which analysis of the experiential quality
of products can be related to the wider spectrum of influences which culminate in a
product's development and use.

The value and scope of this study

The value of this study lies in its ability to depict the forces that shape and constrain
innovation and design, particularly in a highly charged, visionary, volatile, dynamic
new media industry sector. It does not come without considerable pedigree, drawing
on a very wide-ranging and diverse set of ideas that stretch back into the historical
record.

This new or interactive media sector was, at the time of the study, nascent and only
just forming, and the methods used to study its various aspects have been, and will
continue to be, subject to considerable revision and innovation. For example, there
were many question marks in 1993-1994 over whether the internet would be ‘the
way’, or whether cable islands and interactive TV would dominate as the medium
that would bring
the ‘superhighway’ into the home and open it to the invasion of new forms of
communication. This study suggests such uncertainty is likely to continue for some
time, and it emphasises the complex and idiosyncratic nature of design for
idiosyncratic and personal use, particularly in the networked age. It straddles wider
sociological debates regarding agency and structure, and begs the question: who
structures what, with what, and towards which end? Interaction is hardly dialogue in
the human sense, where there is a prospect of open communication between two
parties limited only by their worldviews, knowledge and command of language. It
means channelling, delimitation, formatting and constriction of communication,
while at the same time enabling forms of communication which are not possible
through face-to-face interaction. In this sense such systems mediate communications
in accordance with their purposeful design and use. However, this loop is only closed
by people using the system, and they may do so creatively and constructively,
realising and working with the limitations and enablements. It is only through this
behaviour that such products democratise and become successful in the market.

It provides generic insight into key issues arising within a trial of an advanced
technological system, highlighting the particular need to go beyond a focus strictly
upon technology, its features and functionality, to consider how it blends with wider
concerns regarding content, social governance and knowledge management. This is
pertinent when considering particular strategic ends (in this case knowledge
generation arising from consumer-user research). It maps the organisational
development of a technology and marketing trial, and outlines difficulties
experienced in evoking strategies for producing a concerted user research
programme.

Finally it should be noted that this study began with the intention of being 'user-
focused' – that is mainly concerned with matters to do with the appropriation, use
and 'sense-making' of interactive television users as it came to be situated in homes.
However, the study which emerged came to focus more on how usability – as a
desirable and objective quality of products, as a subject of study, as an ethos, and as a
technique for generating user knowledge - situates (or fails to situate) within wider
contexts of the organisation of innovative activity and individual experience. As such
it is not itself a study of users (although user interviews are included as Appendix 1),
but much more a study of user research in relation to design and design management
processes in the trialing of new technologies.

Chapter 1 – The home

Television and the nature of the 'domestic' and the 'everyday'

Homes and domestic communication devices bridge and mediate the divide between
the global, social and the personal and private. But to capture this prospect fully we
need to shift from old ways of thinking, being and acting that correspond to agrarian
and industrial “revolutions” to ones that open our eyes and enable us to see the
particularities, contradictions and possibilities of our current state of development,
the knowledge, information, post-industrial, digital, networked and urban age.

Archaeological data indicate that various forms of domestication of plants and
animals arose independently in six separate locales worldwide ca. 10,000–7000 years
BP (8,000–5000 BC). Humans began to realise the prospect of living together in
settlements. The domestication of plants and animals, the building of settlements
peppered with homes, discussions, preparation rituals and warehouses, transport
infrastructures and the use of animals as carriers of goods supported sedentary and
regularised food production (i.e. Childe, 1936). The development of long distance
trade and commerce ensued, as did the building of cities, divergent cultures and the
development of craft. The widespread appearance of sedentism and then agriculture
15,000–8,000 years ago changed human experience in ways that some believe were
fundamental for the modern mind (Cauvin 2000; Renfrew 2001; Watkins 2004b).286

They began to live in separate dwellings within these settlements. As discussed in
the introductory chapter, settlements themselves were rendered possible by the
domestication of animals and the development of agricultural techniques, which
made it possible for the first time for people to remain in one geographical area for
protracted periods. These developments provided the basis for concentrated, high-
density settlements, specialized and complex labour diversification, trading
economies, the development of non-portable art, architecture and culture, centralized
administrations and political structures, hierarchical ideologies and depersonalized
systems of knowledge (e.g., ‘private’ property regimes and ‘personal’ writing and
inventories).287

The nomadic hunter-gatherer existence gradually gave way to the city dweller, as
techniques developed and were realised and surpluses were made and commerce and
storage methods became possible. Social structures emerged and governance and
bureaucracies were formed. The king, the chief, the global is the dominant level, the

286
Cauvin J. Cambridge University Press; Cambridge, UK: 2000. The birth of the gods and the origins
of agriculture. Renfrew C. Symbol before concept: material engagement and the early development of
society. In: Hodder I, editor. Archaeological theory today. Polity Press; London, UK: 2001. pp. 122–
140 Watkins T. Building houses, framing concepts, constructing worlds. Paléorient. 2004b;30:5–24.
287
Childe, V. Gordon (1936) Man Makes Himself. Watts and Co., London.
Childe, V. Gordon (1950) The Urban Revolution. Town Planning Review 21:3-17.
Trigger, Bruce G. 2003 Understanding Early Civilizations: A Comparative Study. Cambridge
University Press, New York.
Johnson, Allen W. and Timothy K. Earle 2000 The Evolution of Human Societies: From Foraging
Group to Agrarian State. 2nd ed. Stanford University Press, Stanford.

site of power, the state, of representation, of the built domains of institutions,
ministries, cathedrals and monuments. Childe presents the traits or characteristics
which define the rise of states from cities: ‘Ten rather abstract criteria, all deducible
from archaeological data, serve to distinguish even the earliest cities from any older
or contemporary village’ (p. 9). They are as follows:
1 ‘In point of size the first cities must have been more extensive and more
densely populated than any previous settlements.’ (p. 9)
2 ‘In composition and function the urban population already differed from
that of any village … full-time specialist craftsmen, transport workers,
merchants, officials and priests.’ (p. 11)
3 ‘Each primary producer paid over the tiny surplus he could wring from the
soil with his still very limited technical equipment as tithe or tax to an
imaginary deity or a divine king who thus concentrated the surplus.’ (p. 11)
4 ‘Truly monumental public buildings not only distinguish each known city
from any village but also symbolise the concentration of the social surplus.’
(p. 12)
5 ‘But naturally priests, civil and military leaders and officials absorbed a
major share of the concentrated surplus and thus formed a “ruling class”.’
(pp. 12–13)
6 ‘Writing.’ (p. 14)
7 ‘The elaboration of exact and predictive sciences – arithmetic, geometry and
astronomy.’ (p. 14)
8 ‘Conceptualised and sophisticated styles [of art].’ (p. 15)
9 ‘Regular “foreign” trade over quite long distances.’ (p. 15)
10 ‘A State organisation based now on residence rather than kinship.’ (p. 16)

The urban level of existence is the built and unbuilt domains of avenues, squares,
schools and local public buildings, a mixed level between the global and private.
Neighbourhoods can differ greatly in their ethnic, political, religious, and economic
dynamics; open spaces include a broad range of uses, from gardens to civic plazas to
empty lots. Innovation and experimentation brought humans to this. In doing so they
created for the first time a sense of public and private space, a sense of an 'inner' and
a 'public' existence as well as other distinctions such as ‘rich’ and ‘poor’, ‘weak and
‘powerful’. This was a highly significant landmark in the history of human social and
cognitive evolution, as these gave rise to particular cognitive categories and energies.
Homes dictated for the very first time a very real difference in the cosmology of
social, psychological and physical space (Wilson, 1988; Dovey, 1978; see below).

[Figure 1.1: a diagram of dualities arranged around a CENTRE, pairing terms such as
strange/familiar, profane/sacred, insecure/secure, open/closed, inside/outside,
certain/doubtful and culture/nature.]

Fig. 1.1 Sets of dualities indicating the emotional/psychological tensions and boundaries between
home and the 'rest of the world' (after Dovey, 1978)

Dualities appeared between the 'inner', 'private', 'cognitive' world of the home, and the
'outer', 'public', 'social' world of nature, neighbourhood or the settlement. The private
is of lived experience, places of habiting and habitat where one can “dwell
poetically,” where meaning is supplied, but which is wrongly confined to a position
of subordination to the dictates of “superior” levels – that of the ruling elites and
classes. There are parallels here with the implicit, sub- or un-conscious mind, its
communications and behaviours, and conscious or explicit expressions,
communications and behaviours. These delineated spaces gave rise to quite
distinctive behaviours and rituals within and without the home – akin to what Erving
Goffman (1959) hints at in The Presentation of Self in Everyday Life. These public
and communal aspects of home suggested the ordering of space in the design of
further dwellings – emulation of habits made for the shaping of habitats. Mimicry,
plagiarism, vicarious and social learning – observing, retaining and replicating novel
behaviour executed by others – and copying saw early humans develop a feedback
relationship among themselves as groups and societies, and individually with their
hands, brains, and tools, evolving the capacity to externalize thoughts in the form of
shaped stone objects. When anatomically modern humans evolved a parallel
capacity to externalize thought as symbolic language, and a need for social
belongingness and status, cultural diversity became profuse. Most homes were
‘naturally’ sympathetic to the climate and other natural conditions, as well as
particular human needs and requirements dictated by daily utility and function, and
also accorded with wider cultural norms (Rapoport, 1969). The formations of
physical, behavioural and cognitive space helped to designate and inform the design
of space, and so our use of space denotes cultural and environmental predispositions
as well as ourselves as individuals (i.e. how hot or cool we like or need our homes,
how impervious to rain they are, how durable or secure they are, how they serve to
entertain guests and so forth). The reflexivity inherent in the co-shaping relationship
between people and the built environment is one of the most quintessential forms of
technical interaction and has been widely explored in the literature on architectural
theory where it crosses with the social science literature, especially anthropology.
Olivier Marc in Psychology of the House (1977) states:
"Are we perhaps on the verge of grasping that the environment is
ourselves, for it has given us form, and that creation is nothing but a
dialogue between the inside and outside? Do we not have to exhale
and inhale in order to live? . . . our unconscious self prompts us to act,
produce and do. It is through our action and their products that we
reveal ourselves to ourselves." (Marc, 1977: p.80)

This reflexivity, the tensions and dialectic between the categories home and public,
inside and outside, home and inner self has had a critical role in the evolution of
wider social and cultural identities and systems, a sense of time and place in the
world. It does so too as we develop into adulthood. Since early settlement, we have barely
evolved from a purely biological perspective. How we make, how we find, how we
appropriate the things we need to bring into the home, irregularly, as in the case of
mementoes and decorations, large purchases like televisions, beds, washing
machines, and daily such as food and newspapers, have, as suggested earlier,
changed through time. Roger Silverstone (1994) speaks dramatically of the process
of domesticating technologies, referring in general to how they come to be appraised
and are incorporated into the domestic space. He likens appropriating
technologies to a kind of 'hunter-gatherer-like' process of: " . . . bringing in objects
from the wild: from public spaces of shops, arcades and working environments; from
factories and farms, quarries." (p.98) He alludes to the fact that more than bringing
them ‘in’ from a chaotic 'outside' world, we possess particular resources to 'tame'
these objects into the patterns of our everyday life and activities, and this is central to
the theme of domestication over ages. More importantly perhaps, and via some
process of familiarisation and commonality, we make them usual, ordinary,
‘everyday’ and habitual – or not. We sublimate novelty into mediocrity and
environment, although the net result of their purpose and use is to stimulate, provoke
and excite, amongst other active and explicit experiences. This is certainly true of
media technologies such as radio, television, and hi-fi. Even objets d'art on our
mantelpieces and walls speak prominently of our individuality, our tastes and our
distinctions, and certainly to a new visitor to the home who enters and begins a
process of making sense of his or her surroundings.

Our sense of the world and the means by which we attain and present this sense
through the ad hoc design of our home, most certainly has evolved over time. How
we derive worldviews, how they are shaped, and how we communicate their contents
to each other, have been subject to considerable intensification over time (Wilson,
1988). For instance, the means to socialise, to entertain others and ourselves, and to
attempt to remove ourselves from the toil and stresses of everyday labour, in public as
well as private spaces, have changed radically within the last 100 or so years.
Ornaments and objects in the home made possible what Claude Levi-Strauss (1968)
described as bricolage – the process by which individuals and cultures use the objects
around them to reconfigure the boundaries of their cognitive categories. For
advanced western societies this inevitably means at least some electrical devices, and
most ostensibly, home entertainment and labour saving devices. The last category of
devices – microwave, washing machine, and vacuum cleaner – have become more or
less a staple, emerging from the electrification of homes between the world wars
(Forty, 1986). Even the ancient technologies of pots and jars mediated and straddled
the categories of ‘inside’ with ‘outside’ – as they were used to store materials bought
at the market and stored until they were used in the home. Bric-a-brac and
ornamentations relate to memories of vacations or trips, or aspirations to make trips
(i.e. a painting of a rural idyll or a foreign temple).

Sennett (1977)288 argued that in earlier societies identity was almost entirely ascribed.
A person was born to a class, occupation and role. If a butcher went on the street
inappropriately dressed for that role, he could be publicly reprimanded. This
changed with the enlightenment and industrialisation, and new genres such as
theatre. When people appreciated that a person could act out identity as a theatrical
performance, the contrast was with an identity assumed to be authentic or real. Today
we think more in terms of chosen or multiple identities. An influential account by
Giddens (1991)289 examines identity as a constantly re-worked personal narrative
striving for coherence. Cohen (1994)290 emphasised individual creativity against
Durkheimian social conformity. The most resilient influence has been Goffman
(1975)291 who emphasised multiple identities equated with differentiated roles,
framed by context. In today’s society, at best, the reflexivity afforded by our
interactions with new objects and technologies helps define our identity, our sense of
community; it evokes in us new needs and requirements as well as directs the
avenues for innovation and improvement. 292 "Man is not only homo sapiens or homo
ludens, he is also homo faber, the maker and user of objects, his self to a large extent
a reflection of things with which he interacts." (Csikszentmihalyi and Rochberg-
Halton, 1981: p.1) As Gordon Childe had said, “Man makes himself” (Childe, 1936).
Much of the recent literature regarding consumer research has been of this reflexive
nature, pointing to the way we define our self- and public identities, and that this
definition largely comes from what we consume (i.e. Lunt and Livingstone, 1992;
Lunt, 1995) – ‘I am what I consume’. But while the products and services we

288
Sennett, R. (1977) The Fall of Public Man. New York: Knopf
289
Giddens, A. (1991) Modernity and Self-identity: Self and Society in the Late Modern Age.
Cambridge: Polity Press
290
Cohen, A. (1994) Self-consciousness: an alternative anthropology of identity. London: Routledge
291
Goffman, E. (1975). Frame Analysis. Harmondsworth: Penguin
292
As Margaret Wheatley (1996) has it:
“Every living thing acts to develop and preserve itself. Identity is the filter that every
organism or system uses to make sense of the world. New information, new relationships,
changing environments – all are interpreted through the sense of self. The tendency towards
self-creation is so strong that it creates a seeming paradox. An organism will change to
maintain its identity.” (p.14)

consume may indeed define us, or conspicuous and even personal aspects of us, they
also come to be conditioned and shaped by the existence of other products and other
objects, including the designed and ordered environments in which we live: ‘I am
what I consume, which is also all the other things I do and don’t consume’. All this
taken together suggests a symbolic order, or what Jean Baudrillard terms ‘a system of
objects’, which forms part of a circuit of references – things, devices, values, and
identities – which meld and blur and give rise to seismic shifts in the importance
of one aspect over another.293 In Baudrillard’s view objects appear to have a twofold
function. Apart from, of course, use, it is the aspect of possession that can make
objects slither their status from one extreme (the actively used, functional machine)
to the other (the passively possessed, collected item). Utensils, on the other hand, less
abstract than objects, cannot be possessed:
“A utensil is never possessed, because a utensil refers one to the world; what
is possessed is always an object abstracted from its function and thus brought
into relationship with the subject. In this context all owned objects partake of
the same abstractness, and refer to one another only inasmuch as they refer
solely to the subject. Such objects together make up the system through
which the subject strives to construct a world, a private totality.” Baudrillard,
p 85

Today, on a daily basis, most of us move between spaces of home, work and leisure
although still shadowing these distinctions is the spectre of home working. First it
was promulgated as an implicit inevitability of the rise of digitisation, and today it
hovers as a reality of the demise of fossil fuel and the death of the suburb. Most of us
consume a mix of information every day for a variety of purposes. As we consume
information we also develop purpose; it suggests where we may move
293
In The System of Objects (1968/1988), Baudrillard explores the possibility that consumption has
become the chief basis of the social order. Consumer objects constitute a classification system and
have their effect in structuring behaviour. The object has its effect when it is consumed by transferring
its 'meaning' to the individual consumer. A potentially infinite play of signs is thus instituted which
orders society while providing the individual with an illusory sense of freedom (Sarup, 1992: p.161).
Baudrillard argues that participation with objects makes us think of the functions of the things in our
lives. In doing so, we reduce ourselves to functional beings. We make ourselves functional to the
things most important in our age, e.g. commercialism and technology. We are stripped of symbolism
and expression. We act only as a cybernetic consumer, responding to a market of signs. Consumers
attempt to create a fantasy life out of objects rather than human relationships but are constantly
frustrated by their shoddy construction and tendency to go out of fashion. While this seems removed
from concepts such as usability-as-a-quality, Akiko Busch (1999) suggests otherwise: “If we were to
give up our cultural biases, would we find a coffee mug warmer or an ergonomically correct spring-
loaded ice cream scoop any less exotic than a silver pickle fork or cucumber servicing spoon?”
(pp.57-58)

and how we may move on. We move between different roles in life, like shifting
television channels or browsing web pages, themes or ideas in a search engine. Some
roles in life are closely associated with geographical spaces, some of which are
provided for entirely by technology, as in chat sites and early forms of online gaming
(i.e. Turkle, 1995).294 But the spatio-temporal boundaries that previously
distinguished and identified these spaces and roles in people's lives blur. Some of this
can be attributed to wider socio-economic shifts and trends, cut-backs, automation of
low-level customer care, outsourcing overseas, some of this concerns technology,
and in particular, the availability and uptake of information and communication
technology in global industry, at work and for domestic use. Internet-enabled PCs, i-
Tv and mobile communications join the telephone, newspapers, letters, and the
broadcast technologies of television and radio, to inform and entertain, and to permit
teleworking and other forms of bridging the spaces home, work and 'in-between'
environments. While the latter technologies brought information to homes and to
private individuals regarding activities and happenings in the 'rest of the world' - the
world beyond immediate sensory and social experience - the former are now
connecting private individuals, allowing them to interact, albeit in designed and
limited ways, with the 'rest of the world'. They are able to do this from wherever they
are - at home or on the move.295 Television is now something one can interact with –
even on the move. Is this a radical form of innovation or better characterised as
incremental? Television, following the place and role of radio, was originally
visualised as a domestic rather than mobile device, a social, familial technology as
opposed to something you partook of in an individual and solitary way.

In 1915, David Sarnoff (who most famously became Chairman of RCA) first foresaw
radio as a mass medium built around a broadcasting network. He later extended this

294
Sherry Turkle focuses on the individual, the user within his own and outside environment in
cyberspace. She states the question, "Are we living life on the screen or life in the screen?"(p.21) For
the large part, this all depends on the individual 'user', and how seriously the user takes the Internet as
a tool and uses it.
295
Consider a recent publicity campaign by Nokia, the mobile phone manufacturer. Entitled
‘Nokiagame’ its catchphrase is “in reality it’s a game.” Advertised initially on television, ‘players’ had
to register on a web site and provide their mobile phone numbers. What followed was a kind of ‘paper
chase’ which entailed a truly cross-media experience, with players having to locate and decipher clues
provided by mobile messaging, web sites, newspapers or broadcast television.
http://www.nokiagame.com (9/10/00)

to a vision of television. Here he proposed a technology that would rearrange "living
rooms everywhere," and extend not only the functionality, but also the experiential
impact of radio:
"Let us think of every farmhouse equipped not only with a sound-receiving
device but with a screen which would mirror the sights of life. Think of your
family, sitting down of an evening in the comfort of your own home, not only
listening to the dialogue but seeing the action of a play given on a stage
hundreds of miles away: not only listening to a sermon but watching every
play of emotion on the preacher's face as he exhorts his congregation to the
path of religion." (Quoted in Wheen, 1985: p.16)

Sarnoff’s passionate but deterministic scenario enshrines a domesticated vision of
technology; situated, embedded within the rich contexts of function, media content,
societal values, family, good tastes, emotional power and familiar programme
preferences. The technology and its contents, its 'messages' are integrated,
naturalised within the home, within family life and its goings on, negotiations,
stories and ritual. Indeed, Lyn Spigel (1992) points to the way in which television
featured as an integral part of 1950s modernity with its suburban home and family
organisation, all of which is so adequately captured in the American 'home
economics' style of magazines of the time. Adverts in such publications invariably
conveyed the sense of 'wholesomeness' or 'completeness' that a television would
contribute to normalising the modern home environment. Television was not on its
own of course; the same was being done for automobiles. The environment of the
home, as well as
the context of family life and familiar everyday routines, have each served as potent
archetypes for advertising scenario building since its very inception as an industry.
Not just home, but family life harmonised through ‘completeness’ provided by the
possession of technology, of cooking the right food to watch TV – including the so-
called ‘TV dinner’– a convenience meal which makes the task of cooking converge
into the experience of watching TV. In a system of many other domestic products,
devices and objects, the happy family provides very potent symbols indeed on which
to base the plausibility for new and prospective TV-centric technologies such as i-Tv.
It was very consciously being branded by its pundits as a 'lifestyle technology' –
something that was not simply aimed at providing entertainment or even information
but, moreover, would embed and weave itself into the fabric of family, home and
everyday life in
a central and relevant way.296

Contextualising, embedding and naturalising

Edward Hall (1989) has it that the term ‘context’: ". . . is frequently the most obvious
and taken-for-granted and therefore the least studied aspect of culture that influences
behavior in the deepest and most subtle ways." (pp.16-17) However 'contextualising',
'embedding', 'naturalising' and ‘intertextualising’ have come to be of considerable interest
recently within many areas, from the study of new media, to modern brand
development. Ashcraft and Slattery (1996) suggest, for instance, that successful
brands in the late '90s are those that 'embed' the values and experiences of customers
in products and marketing. Much in made in marketing discourse of ‘brand image’ -
the current view possessed by customers about a given firm and its products and
services. It can be defined as a unique bundle of innuendos, associations and
connotations held within the minds of target customers. Brand image develops and
conveys the product’s character in a unique manner differentiating it from its
competitor’s image. Brand attributes are the functional and mental connections with
the brand that the customers have. They can be specific or conceptual. Benefits are
the rationale for the purchase decision. There are three types of benefits: Functional
benefits - what do you do better (than others), emotional benefits - how do you make
me feel better (than others), and rational benefits/support - why do I believe you
(more than others).

Brand image is not created, as in a hypodermic model of advertising, but emerges out
of a system of experiences and interpretations populated, engendered and managed
296
Lefebvre, in Critique of Everyday Life (1958/1991), argued that everyday life has ceased to be a
"subject" rich in subjectivity: it has become an "object" of social organisation. He meant that
individuals have less control in self-realisation (subject) and adopt market categories to describe
themselves (object). Individuals objectify the self and themselves. The self has become, through
market categories, an object of market dimensions and dynamics. This is a simple psychodynamic
method of explaining the commodification of the consumer. More important, it sets in place a
mechanism that defines the individual from the outside, rather than from the inside. I put forward an
adaptation on this theme in Nicoll (1999) where I suggest that ‘users’ were ‘used’ as ‘currency’ in
enrolling the support of senior managers, external funders and partners, and ultimately, other
consumer-users. Access to the users, and more importantly the user research was commodified as part
of the package of learning (along with access to the technological system) afforded by the Cambridge
trial.

by advertising and PR, previous exposure, and wider societal values for the product
and service, but it also relies on creative resources on the part of the viewer. The
brand image includes products' appeal, ease of use, functionality, fame, and overall
value. Brand image is actually brand content. When the consumers purchase the
product, they are also purchasing its image. Brand image is the objective and mental
feedback of the consumers when they purchase a product. A positive brand image
comes from exceeding customers’ expectations, and it enhances the goodwill and
brand value of an organization. The way something is packed and presented is in
every way as important as the quality of the object or experience itself. Luxury
clothing brands use top end materials in the making of their garments, while also
siting retail locations in high-end parts of town and employing highly ingenious
interior and window display creative. Like the guild streets of medieval times, which
concentrated professions together, other competitive top-end clothing brands will be
found in the vicinity, exacerbating the exclusivity of the shopping experience and
raising the general profile of the neighbourhood – and in turn the brand itself - in the
population’s eyes.

They are backed wholeheartedly by Pine and Gilmore (1999) who speak, not of
'digital' or 'information' economies, but now of the 'experience economy' – a trend
which is removing us ever further from the 'rationalist consumption' of the classical
economist (i.e. Miller, 1995). This is where experiences are paramount to customer
satisfaction and enrolment. This finds parallels in design where the widely cited
advocate of human-centred design techniques, Donald Norman, and his associates are
now coining the term user experience to encompass: "all aspects of the end-user's
interaction with the company, its services, and its products." They go on: "The first
requirement for an exemplary user experience is to meet the exact needs of the
customer, without fuss or bother. Next comes simplicity and elegance that produce
products that are a joy to own, a joy to use." 297 Such thinking represents a shift
towards people interacting with the entirety of, and not simply an aspect of, a
technological, service and fulfilment system. Such an aim requires contextual
thinking beyond the capacities and functions of one's domain as part of an overall

297
http://www.nngroup.com/about/userexperience.html

social and object system.

Television

As suggested so far, it is the contention of this thesis that television is no minor
phenomenon. Viewed as technology, product or service, it is truly pervasive, global,
ubiquitous, shared and yet personal and private - a key domestic technology and yet a
window on the world beyond the porch, even beyond national boundaries, even
natural boundaries, as the live broadcasts from the moon demonstrated. But not all
have taken this view. In the early 1980s President Reagan appointed Mark Fowler as
Chairman of the FCC, and Fowler spearheaded the deregulatory trend in
telecommunications policy with his now famous comment: "Television is just
another appliance. It's a
toaster with pictures. Let the people decide through the marketplace mechanisms
what they wish to see and hear. Why is there this national obsession to tamper with
this box of transistors and tubes when we don't do the same for Time magazine?"298

It is chock full of pluralities. Its purpose can be easily described by those who use it;
however assertions of what it is used for may be much more elusive. It is relatively
easy to say that what it does is to inform and entertain, but the manner
and means by which it accomplishes this is quite different to face-to-face forms of
tuition and performance. The way in which it developed, the way in which it
operates, the scale of its diffusion, all point to something more radical than even the
World Wide Web. It can be live, in the sense of relaying sound and vision from a
remote geographical location (televisual), and it can be recorded, endorsing and
embellishing the purely human performance with graphics, animations, and special
effects - as has cinematography since its inception. It formats its media contents in
unique and particular ways which need to be understood, as they extend, exaggerate,
suggest and otherwise augment human performance. Television as a statement has
like the development of social norms and conventions, its own codes captured in its
regularised forms, such as news announcers portrayed as professionals in suits or
national dress even at 6 a.m. Television as an experience absorbs people; it
298
Mark Fowler, Interview in Reason magazine, 1 November 1981

introduces global ideas and brands like Manchester United football team and makes
and reinforces global celebrities. Its pervasiveness into human affairs, emotions and
dilemmas, their abstraction and depiction, into cultural and political arenas, the
issues and rationales presented are unprecedented. This has driven commentators
such as Jean Baudrillard (1983: p.55) to suggest a two way process of dissolution:
" . . of TV into life, . . . . of life into TV." And Stephen Heath claims that media, and
in particular, television, forms a "seamless equivalence with social life." (1990:
p.267) Roger Silverstone (1989: p.77) argues that: "Television is everyday life. To
study one is to study the other." Such commentary goes a long way in emphasising
the primacy of the phenomena and experience which is television in its symbolic,
culture reinforcing place in human affairs, and how it serves as an index to all types
of social realities on many levels. Are these commentators exaggerating, or can it be
that a single domestic technology can be viewed as a reflection of the way we are,
who we are? To consider this properly it is perhaps best to consider the way in which
television situates, in physical as well as symbolic and experiential senses, within
our lives.
"One of the most potent symbols and vehicles of our current high-tech society
is the television set. The automobile, the aeroplane and radio have clearly left
no one's life entirely untouched, but it is arguably television that has affected
people's minds most deeply. Around the globe, the television set provides a
very literal window on the world outside, liberating at the same time the
inner, private world of the imagination. But is television truly a ghost from
the gods? Or is it a Trojan horse, coming into our homes as a deceiver . . . ?"
(Marzano, 1995: p.9)

Within the home, the television receiver sits within a wider behavioural and technical
continuum consisting of other consumer 'white' goods – refrigerators, microwaves,
dishwashers, vacuum cleaners and so on. These technologies basically create a
gestalt, a system of objects, a bricolage of function, co-existing, sometimes
aesthetically, with furniture, décor and architecture to create, define and shape the
modern home – in other words they make it as ‘whole’ and ‘complete’. Each
technology has their place, each has distinctive functions, but they are consciously
purchased and used or not, or domesticated, or merely stored or on display. The class
of mediatechnologies – television, radio, hi-fi – remains quite distinctive in terms of
what we do with them, what we 'use' them for. While some domestic technologies
may be read-off as that category of 'labour-saving devices', the mediatechnologies
have a less tangible, almost esoteric value and worth. They do not replace dustpan
and brush, they do not preserve food so we can buy in bulk. Mediatechnologies are
for passing time, for leisure, for information, for education, for 'keeping-in-touch', for
entertainment.

Television in this account represents something like an elaborate clock. It not only
informs and entertains, and mediates media messages and content; it also mediates
social and personal life, actions and space. It situates within the constituencies of
everyday life; this is why it is a 'meta-object', and it is here that its centrality, a semiotic
centrifugality begins. First, we must consider that it resides in the home, our central
location in the world, the place where we first depart from to do our will in the
world, and ultimately to where we return to reflect when we are finished. We have
already introduced home in the sense that Dovey (1978) uses when he describes
home as the 'ordering principle in space'. And within the home, television is often to
be found in that central shared location the 'living' room, the lounge, where one
entertains and relaxes. And even within this room the television is often positioned as
its focal point, with all other furniture arranged to assure a clear line of view of the
screen from as many angles as possible. While televisions are to be found in
kitchens, a further 'hive' of communal household activity, and bedrooms for private
viewing and for game playing, it is the living room which traditionally is the place
where it is to be found. Having said this there is of course televisions to found in
bars, gyms and other public spaces, but what marks hotels as ‘homes away from
homes’ is the presence of television as a private and personal pastime.

Following the notion of television as an 'ordering technology in time, information
and space’ are yet further 'central' roles for it within our lives, as 'informant' even
'friend', babysitter, shop and temporal guide – a metronome by which to live
- all key activities again situated within the daily routines of everyday life. A plethora
of surveys (see for instance Argyle, 1992) have drawn attention to the competition of
television viewing and shopping as the most prevalent leisure activities across all
demographic groups. It is because of television’s prominence, its out and out
centrality in everyday life and practice and the shaping of consciousness that
television must be marked out as a special case for study. Television as technology
and cultural form (Williams, 1974) or as technology and medium (Silverstone, 1994)
opens entirely new vistas that extend in many directions past simple consideration of
qualities of products such as their 'usability'. There is considerable distance
between any notion of the ‘use’ of a PC for word processing, and the ‘use’ of
television for entertainment. As such the notion of usability – i.e. put simply
‘ease of use’ - between the two is also experientially quite different. What is the
usability of a good television series? How entertaining is writing a letter of
complaint?

In an attempt to better understand user acceptance, Davis and his colleagues (e.g.
Davis, 1989, 1993; Davis et al., 1989a, b, 1992) developed the Technology
Acceptance Model. The Technology Acceptance Model and its derivative (Igbaria
et al., 1994) has become the most comprehensive attempt to articulate the core
psychological aspects associated with technology use. Based on the generic model of
attitude and behaviour (the Theory of Reasoned Action, Ajzen and Fishbein, 1980;
Fishbein and Ajzen, 1975), the Technology Acceptance Model has proved a robust
and valuable model when considering information technology acceptance, or uptake
(Mathieson, 1991; Taylor and Todd, 1995). In short, Davis and his colleagues
(1989a, b) and Davis (1993) postulated that users’ attitudes toward using a computer
system consisted of a cognitive appraisal of the design features, and an affective
response to the system. In turn, this attitude influences actual use, or acceptance of
the computer system. The two major design features outlined by these researchers
included, the perceived usefulness of the system (operating as an extrinsic
motivator), and perceived ease of use of the system (operating as an intrinsic
motivator) (Davis, 1989, 1993; Davis et al., 1989a, b, 1992). Perceived usefulness
was defined as the ‘‘degree to which an individual believes that using a particular
system would enhance his or her job performance’’ (Davis, 1993, p. 477). Perceived
ease of use was defined as the ‘‘degree to which an individual believes that using a
particular system would be free of physical and mental effort’’ (Davis, 1993, p. 477).
It was argued that these two features formed the users’ attitude toward using the
computer system, which in turn impacted upon actual system use. Thus, the more
positive the perceived ease of use and perceived usefulness of the system, the higher
the probability of actually using the system. Furthermore, Davis et al. (1989a, b) and
Davis (1993) also postulated that perceived ease of use had a direct impact upon
perceived usefulness, but not vice versa. Henderson and Divett (2003) in their
summary note that other key factors can impact user behaviour beyond perceived
ease of use and perceived usefulness of the system and they “still need to be
addressed.”

“… one must consider the nature of the activity under question (use of an
electronic supermarket), and the nature of the constructs measured (system
usefulness and ease of use). The key word here is system. Future research
should now focus on examining the relative impact of perceived ease of
use and usefulness, in light of the product, service and system elements
associated with electronic commerce settings … future research [should also]
explore the factors associated with the possible mediation effect of usefulness
upon ease of use. Other attributes associated with the home shopping service,
including the quality of produce, or the price, were not addressed within the
current research.” (pp. 393-394) 299
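
To summarise these relations schematically (my own paraphrase, not Davis's own notation), writing PEOU for perceived ease of use, PU for perceived usefulness, A for attitude toward use, U for actual use, with the β terms as empirically estimated weights and the ε terms as errors:

\begin{align*}
PU &= \beta_{1}\,PEOU + \varepsilon_{1} \\
A  &= \beta_{2}\,PU + \beta_{3}\,PEOU + \varepsilon_{2} \\
U  &= \beta_{4}\,A + \varepsilon_{3}
\end{align*}

The asymmetry Davis postulated is visible in the first line: ease of use feeds usefulness but not vice versa, while both feed the attitude that drives actual use.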

For instance, television as a means of marketing and selling product directly – i.e.
through TV-shopping channels such as QVC in the UK - or less directly - through
features and advertising – is well-established. But its ‘show and tell’ or ‘show and
sell’ linear format prevents people from shopping – i.e. browsing, searching, locating
the things, the product categories, they want, when they want them. Making
television interactive, networking home shopping and other services and melding this
with its basic function – that is, to inform and entertain – allows some level of
'shopping' to take place – something akin to how it is practiced in the real world of
high streets and malls. For instance you are not only seduced into buying by what is
presented or shown, but can hunt through product categories – while you may not
feel the quality of the goods, or smell the freshness of the produce, the experience is
similar to window shopping, where you order goods displayed in a window. In many
ways it highlighted the importance of brand, and brand image to not only suggest

299
R. Henderson, M.J. Divett (2003) Perceived usefulness, ease of use and electronic
supermarket use Int. J. Human-Computer Studies 59 (2003) 383–395

desired lifestyles for users and consumers of a given product or service, but to suggest
and guarantee quality.

However it also raises some immediate lines of question. For instance how does
conflating two very commonplace and prominent social activities - i.e. television
'viewing' and 'shopping' - enhance or extend people's experience of television as
medium, ordering principle, and as technology? How about the experience of
shopping, or indeed, an experience of home? Will such practice truly free up time for
other activities, those within and outwith the home? Does it relate to the experience
that children had when they first connected video consoles to the television, or to
time shifting when VCRs first freed the viewer from the hegemony of the
programme scheduler? Which is the best way to produce advertisements and sell the
idea of products and products themselves through these new communicative
functions? While these crucial questions were not answered by the Cambridge Trial,
they featured as some of the questions which were certainly drawing partners to
explore the system as a medium and a technology with respect to their own particular
interests.

In a very similar vein to Sarnoff's vision of broadcast television, pundits of i-Tv, such
as Om, made constant reference to i-Tv's 'revolutionary' and 'epochal' nature. They
expressed little reticence in propagating myths regarding its potency to radically alter
existing domestic practices. Such visions were perhaps made more readily digestible
at the public [and journalistic] level through a wider fascination at the time for
tangible symbols of new kinds of lifestyle extending into the new millennium and
beyond. But such rhetoric is at its most potent when it is presented as a whole, in a
wholesome fashion, as part of the family and homemaking, where the social and
technical elements are presented in an embedded, naturalised, and contextualised
fashion, couched in a semi-utopian rhetoric of design and social change, and
particularly, domestic change:
"Today many new technologies form part of, and develop as, nodes in a
network that has no central unit. Rather, all the units are linked together,
sometimes performing the function of requesting information, at other times
that of supplying services. Whatever the actual use is made of this potential
for connection, the fact remains that the experience of domestic technologies

is evolving toward the image of the 'family', and that the ability to make
connections strongly influences the image of individual products and the
customer's decision to purchase them. In both real and symbolic terms the
television set stands at the centre of this demand for interactivity, becoming
the prime product of the entire system of connection." (Morace, 1995: p.15)

Following Morace and the other commentators on television whom I have cited so far,
including Silverstone, Sarnoff, Baudrillard and Williams, a clear picture is surely
emerging of a technology which possesses vast symbolic as well as functional power.
It is a technology so common, its ubiquitous presence making it all too familiar, that it
may be precisely this familiarity which breeds neglect in wider scholarly studies of
television and its social and cognitive influences. Much funded research has been
confined to studies which seek out negative influences – the so-called 'effects' studies
(which I will deal with later). Many other studies have emerged, however, taking a
more sociological/anthropological position on media and its technologies in the home.

The originally proposed neo-positivist study, as detailed in this thesis, came to be
undermined as a sympathy developed towards the work of media and media-technology
researchers, particularly Silverstone, Hirsch and Morley (1991) and their study of the
use of media technologies in London households.300 Theirs was a methodologically
problematic study which in essence opened the 'black box' of the complex social and
psychological exigencies that constitute the category 'home' and how they
propagate, abate and otherwise define domestic ICT use. This early study set the
scene for a series of influential studies (EMTEL etc.) through the 1990s, and most
importantly drew attention to the distinct methodological complexities in approaching
the study of technology use, in everyday life, in private modern domestic spaces in
advanced societies.301 Ethnography, in the orthodox understanding of the idea, has
300
Their work was itself influenced by earlier work of David Morley – particularly the Family
Television: Cultural Power and Domestic Leisure (1986) – Sandra Wallman’s Eight London
Households (1984) – and Bott’s (1957) Family and Social Network. These studies focused in depth on
relatively small samples of households. The Silverstone, Hirsch and Morley sample was also
relatively small (n = 20), as was the sample of trialist households in the present study (n = 16).
301
The Fifth RTD Framework Programme of the EU is presented under the heading Creating a user-
friendly information society. It has as its main objective the realisation of “benefits of the information
society for Europe both by accelerating its emergence and by ensuring that the needs of individuals
and enterprises are met”. The European Media, Technology and Everyday Life Network (EMTEL)
addresses this agenda directly. EMTEL is a research and training network of European social
scientists investigating the social dimensions of the Information Society in Europe.

the researcher going to a radically different and alien culture from their own, where
notes are made regarding observed rituals and behaviours, patterns of communication
and so forth. The unobvious and obvious, the noticeable and unnoticeable, the seen
and unseen, the heard and unheard, differ between visiting the distant and alien
and visiting the local, 'familiar' environment. Such studies can turn
into tacit explorations of 'privacy' rather than a study of what happens when a
researcher is present and observing within the home.

Theirs was a difficult study which overturned many preconceptions regarding the much-
practised activity of consuming and using media and media technologies within the
home. It raised questions, as this study has, regarding methodological approaches, as
much as what people do with media, and how they consume media content. A later
contribution entitled Consuming Technologies: Media and Information in Domestic
Spaces (1992) edited by Silverstone and Hirsch was a compendium of essays by
various authors which emphasised the view of information and communications
technologies as social and symbolic as well as material objects in the understanding
of their use value in the domestic space. The notion of the ‘use value’ of a
commodity derives from the writings of Marx where it is defined as its “utility to
satisfy human wants and needs, in any form they may take.” (Marx, 1906; p.42) 302
Use value is conditioned solely by the physical properties of the commodity,
independent of the amount of labour required to produce it. Water, as an example, is a
generic human need; its use value is the same whether it is found freely falling from the sky
and collected in a barrel, pumped into a house via a water mains and all its systemic
apparatus, produced by a desalination plant, or carried by foot across the desert. Its
use or utility remains the same: it is drunk, or used to wash things, for cooking, and so on.
Use value contrasts with exchange value in that the latter is dependent upon how
much labour [or technology] is used in its production. Something which uses much
labour typically sells for more money. However, money is the ultimate commodity; it
has virtually zero use value and exists wholly for the purposes of exchange.

[http://www2.lse.ac.uk/media@lse/research/EMTEL/]
302
Marx, K. (1906) Capital: A Critique of Political Economy, Engels, F. (ed.) (New York: Random
House).

These studies came to have significant bearing upon the orientation, as well as the
treatment, of the notion of usability in the present study with regards to both
traditional as well as interactive media technologies.

Interaction

We must consider 'interaction' as the chief functional characteristic not only adding
to, but defining, the new forms of delivery mechanism for digital media,
communication and networks. But we must also consider how engineers conceive their
command of technical propensities and of prospective human deliberations. We must
also consider, as we will later, how this rational process is conditioned
managerially. But what do we understand by the term 'interaction'? I have already
drawn attention to the fact that interaction has much more essential and pervasive
facets, that there are reflexive shifts of meaning and focus between the constellation
of things, values, thoughts and behaviours that populate daily existence. Winston
Churchill said in 1943, "First we shape our buildings, then they shape us"; Stewart
Brand paraphrased this as "First we shape our buildings, then they shape us,
then we shape them again – ad infinitum". There is thus a process of domestication
which renders things transparent, including their problems, and makes them tame. If
considered in its broadest, holistic sense – as a cognitive, sociocultural, as well as a
technical form of human process – interaction with objects and environments
continually shapes the lives of individuals, as well as the civilisations and cultures to
which they belong, from pre-birth to death (Wheatley and Kellner-Rogers, 1996). As
the maxim goes, 'no man is an island' – that is, no one is some isolated, closed system. If one
were lost on a desert island one would still have to interact with nature in some
adequate way to survive. Eventually, in a kind of perceptual Darwinian battle, only
those aspects of the desert island environment which mattered to drinking and eating
would be noticeable; to begin with it would all be a haze, like the uneducated eye
staring into the microscope, not knowing it was looking at a deadly virus. Defoe's
Crusoe is shipwrecked on an island and, after overcoming his initial grief, he
manages to build a home for himself using only what he has on hand. Most of the
story revolves around Crusoe’s solitary life on this island, as he starts over with
virtually nothing. During this journey, he begins to realize that the things he once

valued really don’t have any value at all. He finds money in the shipwreck, but he
quickly realizes that the money is worthless since it doesn’t help him solve his
problems. He finds that he has very negative views of some of the natives on the
island, but, again, he finds that those negative views do not help him survive on the
island. Instead, most of the book is about Crusoe constantly questioning what he
needs and what he actually values.

The island becomes a selection environment where eventually – we would hope –
the organism would learn what is good for it and what is not. Learning this would not
foreclose the prospect of finding something new and possibly more nutritious and
convenient. But in order to learn we must first interact, try, test and taste. Some
commentators, such as Jean Baudrillard, have been damning with respect to the
future of interaction, predicting rather a future obsessed with order, equality and
reality, the end result being the death of seduction and an all-encompassing
neutralization of our society. The ‘death’ Baudrillard refers to is not a physical death
but a social and psychological stagnation of creativity, curiosity and interaction.

From this position, individual and cultural development – all knowledge development –
owe everything to interaction. At every given moment in time, actions and reactions occur
at many different levels of neurological, biological, physical, environmental and
social systems. From the biophysical changes in one's body and brain to the mental
processes involved in interpretation of our environment, to the meshing of social and
technical elements within some manufacturing processes that give rise to a product
that comes to be desired – all are subject to communication and interaction – what
Baudrillard has described as the “ecstasy of communication.”

We no longer partake of the drama of alienation, but are in the ecstasy of
communication. And this ecstasy is obscene. Obscene is that which
illuminates the gaze, the image and every representation. Obscenity is not
confined to sexuality, because today there is a pornography of information
and communication, a pornography of circuits and networks, of functions and
objects in their legibility, availability, regulation, forced signification, capacity
to perform, connection, polyvalence, their free expression. It is no longer the
obscenity of the hidden, the repressed, the obscure, but that of the visible, all-
too-visible, the more visible than visible; it is the obscenity of that which no
longer contains a secret and is entirely soluble in information and
communication.303

The open nature and magnitude of selection environments such as the internet have
given rise not only to search engines, but to whole industries geared to optimising the
chance that their product, their website, will be found in the Malthusian glut. Finding
things relevant enough to interact with, to transact with, becomes the problem. 500-channel
television becomes a potential nightmare of searching and finding, rather than
viewing and enjoying.

Iterations, reflections, paradigm shifts, or 'bifurcations' are the product of intelligent
forms of interaction.304 For instance, in the philosophy of Merleau-Ponty (1962) there
is suggestion that our skills are acquired by dealing with things and situations and in
turn they determine how things and situations show up for us as requiring our
responses. Also, in Michael Polanyi's theory regarding tacit knowledge (see Polanyi,
1962, 1966) he describes how individuals develop and use knowledge in a process
which is at once action-oriented and also focussed on the process itself. But it is
through our interaction with technology that the roots of innovation in design and
innovation of use occur. Whereas purpose can be designated at the planning stage,
use has to be realised by doing, that is, it is realised through activity. This is more
than interaction with material, say, in a crafting sense - interaction with technology
303
Jean Baudrillard (1988). Ecstasy of Communication. New York: Semiotext(e). p.22

304
Manuel De Landa (1997) speaks of a ‘thousand years of non-linear history’. His book, inspired by
the work of the physicist Arthur Iberall (1972, 1987), looks beyond prevailing attitudes that have
history relying on texts, discourses, ideologies and metaphors. Rather it considers the interplay
between three domains that have shaped human societies – economics, biology and linguistics.
'Bifurcations' are similar to 'paradigm shifts' as they represent "when a system changes from one
stable state to another . . . minor fluctuations may play a crucial role in deciding the outcome." (De
Landa: p.14) What is interesting here is the possibility of the small stimulus-big response typical of
chaotic and complex systems. Such an idea, placed in the realm of design and planning, is suggestive of
what often thwarts forecasts and predictions – interactions of apparently unrelated aspects of the
environment of a technology, or of the belief system produced through and around its use:

“Non-linear creativity is a major source of unknowability, in the sense of knowing based on
linear reductionist traditional science. Innovating complex technologies may feature a
dynamic in which heretofore distinct sectors fuse and spin off entirely new families of
technologies in unpredictable ways.” (Rycroft and Kash, 1999: p.21)
Here is a pragmatic argument for the use of interpretivist approaches to the study of technology.
Sometimes seemingly irrelevant events that draw the researcher's attention, either because
they are striking or because of their frequent occurrence, may later turn out to be quite important
(Roche, 1973).

inevitably means an interaction with designed or planned features and functions. As
Alberto Melucci (1989) has it: "Changes in everyday experience not only generate
but also reflect new needs in the lives of individuals." (p.114, my own italics) And
shifting more specifically to technology Neil Postman (1993) states that:
"New technologies alter the structure of our interests: the things we think
about. They alter the character of our symbols: the things we think with. And
they alter the nature of community: the arena in which thoughts develop." (p.
20, my own italics)

So in the context of using – as activity and experience - what we are saying is that
only then, through interaction, can we decide on iterations. In the context of "a
phenomenology of human-machine relations", Don Ihde has analysed the selectivity
of technology, arguing that human experiences are transformed by the use of
instruments, which amplify or reduce phenomena in various ways. As he put it:
"Technologies organize, select and focus the environment through various
transformational structures." (Ihde, 1979: p.53) Prior to Ihde, Marshall McLuhan and
Harold Innis had already explored the selectivity of media, although their focus had
been primarily on the social "effects" of various media of communication. Innis
argued in The Bias of Communication (1951) that each form of communication
involved a bias in its handling of space and time (see Carey, 1985). And McLuhan,
most famously in books such as The Gutenberg Galaxy (1962) and The Medium is
the Massage (McLuhan & Fiore, 1967) asserts that the use of particular media
massages human "sense ratios" (allusions to which are also found in Innis). More
recently, Neil Postman has reinterpreted McLuhan's aphorism that "the medium is
the message" as meaning that: "embedded in every tool is an ideological bias, a
predisposition to construct the world as one thing rather than another, to value one
thing over another, to amplify one sense or skill or attitude more loudly than
another." (Postman 1993, p. 13) Now this bias is only realised, or even understood in
a design sense by using, this is why many technology implementations produce
unwarranted or unexpected artefacts, or have serious gaps in their operation or
functioning.

Interactions breed emergent properties and phenomena such as expertise, or can
highlight faults in designs (e.g. Petroski, 1994). Faults or breakdowns are where
representation and anticipation (of function) fail to be actualised. This suggests a
view such as that raised by the philosopher of science Karl Popper when he
argues that learning from experience proceeds not by positive but by negative instances.
Popper has it that engineers have learned more about design from failures than from
successes.305 His examples are large physical engineering structures that have
collapsed. Their failure essentially highlights flaws in design and design thinking.
Clearly, before engineering and associated disciplines such as architecture were
taught in universities, trial and error mixed ad hoc with basic scientific
knowledge of physics in the design of structures. Science and past experience –
expertise – helped pre-empt the success or failure of designs. But the sharing of
knowledge of failure and its reasons was by and large limited by a lack of formalisation
and communication.

Similarly, knowledge of why designs succeed or fail in the marketplace tends to remain
unknown unless some form of consumer-user feedback study is done, or producers
can be truly objective and apply rigorous reflection and self-criticism to their
products. Poor interface design can entail customers 'voting with their feet': they
may simply not use the product, they may seek out a competitor's, or they may even
dispense with ever using an entire brand. Products communicate most strongly through
their use and functionality. The power of word of mouth is undeniable in the
diffusion of new media products, and likewise it can act to disfavour a poorly
designed product. The received wisdom from usability and HCI studies is that it is
better to identify and fix problems at the early design stages than to move forward to
manufacturing and distribution, where fixes could be impossible if not very, very
costly.

Manuel Castells (1996) argues that the differentiating feature of the 'networked
society' lies in the unique forms of interaction between knowledge, innovation and use
which are now enabled by the new communications technologies and the thinking they
promote:
"What characterises the current technological revolution is not the centrality
of knowledge and information but the application of such knowledge and
305
This view is also taken up by Petroski

information to knowledge generation and information
processing/communication devices, in a cumulative feedback loop between
innovation and the uses of innovation." (p.32: my italics)

But it is not just the structure of knowledge – the episteme – which will be generated in the
new world where networked and intelligent technologies continue to meld with the
fabric of everyday life. Commentators such as Tatsuno (1993) point to the new
imperative for firms of all types to get 'closer' to – to interact with – those who will consume
and use their products and services. "There is no substitute for interaction with, and
the study of, actual users of a proposed design." (Norman, 1988: p.155) The need for
a 'deep' knowledge of use and users is emphasised when a firm innovates products
which claim to be 'smart', 'intelligent' or even 'networked'. It is also emphasised
when firms claim to be explicitly engaged in product design which aims to "create
new ways of doing business" by 'opening up new lifestyles' for consumers:

" . . . the multimedia revolution and the global information highway open
up new opportunities for totally new business and consumer lifestyles"
Peter Bonfield Chairman and Chief Executive of ICL
speaking on the Cambridge Trial at its launch

This is because these technologies narrow the window that designers have to
speculate regarding both use and users.

Cultures and spaces of production and cultures and spaces of use

What has been put forward so far is that there will always be a complex interplay –
interaction - between visions of use (and the users) of products, and how they come
to be actually realised (or compromised) in manifest products (by 'real' consumer-
users acting in everyday life and work). This process of actualisation contributes to
forming distance between what may be termed as cultures and spaces of production
and cultures and spaces of use.306

306
The collapsing of products and services is related by Kantor (1992: p.9-10) “Producers think that
they are making products. Customers think they are buying services . . . Producers think their
technologies create products. Customers think their desires create products . . . Producers organise for
managerial convenience. Customers want their convenience to come first . . . Producers seek a high
standard of performance. Customers care about a high standard of living.” The distinction between
technology, product and service collapses in the information age. In this book I tend to use the terms
technology, product and service interchangeably.

“…humanists imagine the world as if what they did was, or ought to be, the
most important thing in it; technologists, reciprocally, imagine the world as if
what they did was, or ought to be, the most important thing in it. Both
imagine that the part they work on is the key part, and the other parts can be
thought on the same pattern, and moreover, should be. Humanists imagine
technologists are a bunch of misogynistic apes who need sensitivity training.
To caricature: Technologists think politics is a broken system that can be
fixed with a quick hackathon … Let’s assume, for the moment, that both
sides are right in this. Tech workplaces do have a gender issue; politics is a
broken system. How do we negotiate between the two worldviews? It’s
probably a fool’s errand to look for a grand synthesis. And in any case, from
where would such a synthesis begin? There’s no end to imperial claims from
one side or the other to explain the other better than the other knows
themselves.”307

Interaction effects and emergence

Buchanan et al. (1996: p.3) view that human beings, through the agency of design,
transform not only their own immediate lifeworlds, but also "impose a culture-
sustaining order on the chaos of experience." And because cyberspace is comprised
of bits and electrons, we need to consider this as we impose structures upon its
architectures. Fiske’s (1989: p.1) definition of culture as the: "constant process of
producing meanings of and from our social experience," once again emphasises how
our worlds are ordered, partly due to our conditioning, culture, organisation,
education and outlook, and partly by the built environment and through the mix –
interaction - of each or all of these elements in our thoughts and behaviours. This
includes the technologies, products and services that we may rely upon and use every
day. But our worlds are also ordered by the social milieu of institutions, workplace, family
and friends that forms individual social constituencies. The claim of communications
technologies to pervade (or even invade) the realm of our personal and private
worlds and social networks is the claim that they are becoming 'lifestyle'
technologies – technologies which help us order and augment the way in which we
307
McKenzie Wark, Against Social Determinism In Vol. 1, Iss. 1 — Inaugural Edition (Fall 2013)
source: http://www.publicseminar.org/2013/12/against-social-determinism/#.UymHPaiSyoM

order our lives. In order to do so they must be generic enough to cope with everyone,
their tastes and distinctions, and at the same time be flexible enough to cope with
our idiosyncrasies as consumers, friends, lovers, parents and club members (amongst
other roles).

Culture is often conceived as a multi-level construct, which Edgar Schein defined as:

“... a pattern of basic assumptions, invented, discovered, or developed by a
given group, as it learns to cope with its problems of external adaptation and
internal integration, that has worked well enough to be considered valid and,
therefore is to be taught to new members as the correct way to perceive,
think, and feel in relation to those problems.” (1991: p.111)

In an earlier work Schein (1985) conceptualised and described culture as comprising
three levels or layers. These range from the obvious and concrete to the more subtle
and abstract:
1) Artefacts and creations are manifestations of:
2) Values, which in turn are engendered by:
3) Basic assumptions.

Artefacts and creations comprise the most visible level of culture and include the:
"constructed physical and social environment . . . physical space, technological
output . . . written and spoken language, artistic productions, and . . . overt behavior."
(p.14) They are the concrete and obvious personas of design and planning. They are
open to critique and reconstruction; they can be reverse engineered, whether they
are drugs being copied by an Indian plant, a new PlayStation by a Chinese factory, or a
recipe by a half-decent pastry chef.

Schein's second 'values' level is generally said to possess both conscious and
subconscious facets. Values are distinguished:
" . . . .by goals, ideals, and standards that represent members' preferred means
of resolving everyday problems . . . socially shared rules and norms
applicable to a specific context . . . as well as what 'natives" perceive as
constituting boundaries of acceptable behavior." (Mohan, 1993: p.16)

Of special interest to social and innovation research are verbal artefacts and various
kinds of rhetoric; these include language, stories, and myths, as well as behavioural
artefacts such as rituals and ceremonies (Mohan, 1993: p.16). As Winograd and
Flores put it:
"We create and give meaning to the world we live in and share with others.
To put the point in a more radical form, we design ourselves (and the social
and technical worlds in which our lives have meaning) in language." (p.78)

The social theorist Michel Foucault viewed history not as a linear progression but as a
battle of ideas. For example, the development of the internet was not an unalterable
outcome of the Mosaic browser; rather, it was an idea that won out against other
possibilities. Foucault argues that the ideas that win out over others are made to
appear inevitable to establish their credibility. For example, if Microsoft had started
publicly suggesting alternatives to the internet, or had consciously packaged its own
browser with its operating system to the commercial detriment of a commercial
browser, it would damage that particular idea's value because it would suggest that
other credible, cheaper options exist. According to Foucault, statements are the
building blocks of discourse as they provide context and relate to one another.
Foucault believed that a small number of statements make up most discourses and
are repeatedly referred to. Foucault argues that institutions follow rules and
procedures that provide a particular set of results, or what he calls "games of truth."
Foucault isn't claiming that institutions fabricate research. He's highlighting their use
of specific research methods to provide answers. Foucault calls this production of
knowledge "discursive formation." This knowledge is then used to justify the actions
of social institutions.

The important development that this lends to emergence is the notion that one
designs oneself through the use of language. This invariably suggests a social
dimension, as language is developed through social processes, communication is
a social act, and feedback about oneself is important in improving one's ability to act
and influence in the world.

While these concepts normally prompt certain behaviours, they may remain only
"espoused" (Argyris & Schon, 1978), in which case there is a discrepancy between
what the organisation and/or its members claim to value and how they actually

behave. This suggests that there can be disparity between words and actions. This
idea surfaces in many forms, but a pivotal study by Iuso (1975) in the consumer
research literature indicates that product concept testing, where people are asked to
imagine the desirability of certain conceptual products (i.e. those suggested by words
– descriptions), has been shown not to accurately predict market success.

Schein's most abstract level, that of taken-for-granted reality, includes the group's
basic assumptions, or: " . . . tacit beliefs members hold about themselves, their
relationships to others, and the nature of the organization." (Mohan, 1993: p.15)
These assumptions underlie and determine "meaning systems" in the organisation. It
is upon this "layer" (i.e., these assumptions) that the cultural infrastructure rests
(Deetz and Kersten, 1983).

Such definitions of consumption, culture and design suggest that the environment,
the aims and the contexts of the workplace, the place in which a technology is
planned and developed, will clearly be very far removed from the actual
environments where it will be either sold, appropriated or used (Araya, 1995). More
than simple physical and geographical distance, this distance is cognitive, cultural
and most importantly, experiential. The contexts and logic that give rise to the particular
features and functions that distinguish one technology or service from another may
vary considerably from the logic and contexts which motivate appropriation or shape
the experiential dimensions of use. It most certainly
differs from that of consumption and usage.308

Several questions arise from the dialectic of design/production and consumption/use.
The first concerns technology. To what extent is a technology a continuation of
what came before, and how does this come to impact upon the processes of
innovation and diffusion? To what extent can consumer acceptance be determined in
the case of a radically new technology? Is there such a 'thing' as new uses, or even
new technologies?

308
From the customer’s standpoint a product is nothing more than a tangible means for getting a
service performed. Is baking soda a cake ingredient or an odour eater? The answer may be either or
both since the products derive their meaning and value only from the uses to which customers put
them.

Innovation

The point I wish to raise here is that current frames of reference and comparison
always bind us and make us 'path-dependent' or 'tunnel-visioned' one way or another.
The notion of 'boundary critique' is based on Churchman's (1970) argument "that
what is to be included or excluded for any analysis of a situation is a vital
consideration".309 Frames of reference do this while being vague and difficult to pin down and clearly
identify. This is why, as shall be argued later, neo-positivist and reductionist
research methods do not lend themselves to the study of culture.310 As with most
symbols, concepts or category systems, they help constrain and focus attention.
Manuel De Landa cites perhaps the most recursive question arising within the realms
of human creativity: can we ever really create something new, particularly when
emergence can only arise from evolutionary processes?
"A key issue in the philosophy of technology concerns the most appropriate
way of conceptualizing innovation. One may ask, for instance, whether
human beings can truly create something novel, or if humanity is simply
realizing previously defined technological possibilities. Indeed, the question
of emergence of novelty is central not only when thinking about human-
developed (physical and conceptual) machinery, but more generally, the
machinery of living beings as developed through evolutionary processes. Can
anything truly different emerge in the course of evolution or are evolutionary
processes just the playing out of possible outcomes determined in advance."
(De Landa, 1997: p.31)

The very existence in the public domain of products, services or ways of
doing things marks them as what could be considered 'departure points' for the
ideation of improvements or alternatives – what can or will be in the future. Clearly,
technologies that already populate the high street suggest general trends, and
individual technologies manufactured and released to the general public can be
reverse engineered by competitors seeking to make more efficient and competitive
products.311
309
Churchman, C. W. (1970). Operations research as a profession. Management Science, 17, B37-
53.
310
This issue will be dealt with in more depth later, but it is worth indicating here that conducting
experiments or using statistical methods does have a place in an overall study of culture, especially
where issues of frequency or amount are deemed to be important (such as system logging of
remote-control button presses).
311
This was true in the case study detailed later. Om had several machines, including a Sony
PlayStation, which was opened in order to reverse engineer the logic of robustness. The early STB
designs lacked robustness.

With reference to biological evolution, George Herbert Mead states that "the
organism . . . is in a sense responsible for its environment." (1934: p.130) He adds:
"When there is [a] relation between form and environment, then objects can
appear which would not have been there otherwise; but the animal does not
create . . . food in the sense that he makes an object out of nothing. Rather,
when the form is put into the environment, then there arises such a thing as
food. Wheat becomes food: just as water arises in the relation of hydrogen
and oxygen." (ibid: p.333)

Mead provides here the basis for a temporal and interactive vocabulary to be
developed and one which can encompass mental as well as somatic phenomena in
the instances of innovation, development and emergence. Such a view applied to the
marketplace would view all available products, functions and features as the context,
or ‘textbook’, for new designs. Once a product is in the market place, it becomes an
immediate departure point for improvement, if not by the producer himself or
herself, then by some competitor or another who will improve upon cost, function or
quality.312 David (1975) is one of a number of authors suggesting that continuous
pressures from market forces have long been an incentive for technological change.
In certain cases, particularly if one is trying to establish a standard, a brand, or a
network, this is a desirable state of affairs (Kelly, 1997).313 There are many ideas
regarding what constitutes an innovation. In Capitalism, Socialism and Democracy
Joseph Schumpeter first described his idea of creative destruction – a process whereby:

312
I think here of Om engineers who often opened VCRs and games consoles of other manufacturers
to discover which components and layouts they were using.
313
In many respects this is a common practice of networked-oriented businesses. Mobile phones,
STBs, and other technologies are given away in order to develop networks, and then ‘lock-in’
subscribers. Robert Metcalfe, founder of 3Com Corporation and the designer of the robust Ethernet
protocol for computer networks, observed that new technologies are valuable only if many people use
them. Specifically, the usefulness, or utility, of a network equals the square of the number of users, a
function known as Metcalfe's Law. The more people who use your software, your network, your
standard, your game, or your book, the more valuable it becomes, and the more new users it will
attract, increasing both its utility and the speed of its adoption by still more users. If you and I can call
only each other, to return to the telephone example, a phone is of little value. But if we can call nearly
everyone else in the world, it becomes irresistible (Kelly, 1996). For the phone system, or the power
system, the initial investment in network infrastructure was high, which kept the price of access high.
In the case of railroads and telephones, initial developers failed to appreciate the value of
interconnection (in essence, the power of the Metcalfe curve). Railroads struggled with multiple
gauges of track, which limited connections between systems, until the late 1880s. It didn't even occur
to telephone companies to put a dial on the phone until 1931, even though the high cost of employing
people as switchboard operators limited the reach of the network. In the predigital age, Metcalfe's Law
could take decades to unleash network power.
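As a minimal illustrative sketch of the relationship stated above (network utility taken as proportional to the square of the number of users), the following Python fragment simply tabulates how utility grows with the user base; the constant of proportionality and the sample sizes are arbitrary assumptions for illustration, not figures from the trial:

def network_utility(users: int, k: float = 1.0) -> float:
    # Metcalfe's Law as stated above: utility proportional to users squared.
    return k * users ** 2

for n in (2, 10, 100, 1000):
    print(f"{n:>5} users -> relative utility {network_utility(n):>10.0f}")

# Doubling the user base quadruples the utility, which is one way of reading why
# network-oriented businesses may give hardware away in order to grow a subscriber base.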

“The opening up of new markets, foreign or domestic, and the organizational
development from the craft shop to such concerns as U.S. Steel illustrate the same
process of industrial mutation—if I may use that biological term—that incessantly
revolutionizes the economic structure from within, incessantly destroying the old
one, incessantly creating a new one. This process of Creative Destruction is the
essential fact about capitalism.” (p. 83)

Rogers (1983: p.11) suggests that an innovation is "an idea, practice, or object that is
perceived as new by an individual or other unit of adoption." Rogers offers here an
essentially subjective view of innovation – i.e. its novelty exists in the mind’s eye of
the beholder, in this case the consumer-user. And from a consumer-research
perspective Foxall and Goldsmith (1994: p.531) view innovation as a brand, product,
idea, service, or practice that is "perceived as new in the eyes of the members of a
social system." There is a more objective dimension to innovations if viewed
technologically, when it resides in the product's characteristics, those qualities that
differentiate it from others in their class (Garner, 1978). Shifting from technology
back to the human perspective it can also be ‘objectified’ in terms of the behavioural
changes which are required to use the product (Robertson, 1971; Engel et al., 1990);
or even in terms of the characteristics of the evaluation task that is required on behalf
of consumers (Howard and Sheth, 1969). Hirschman (1981) also introduces the
notion of a 'symbolic' innovation – this is where a different social meaning is created
for an existing product. New wrapping for old technologies is often heralded under
the notion of 're-branding'. Another view would most likely be supportive of the notion of
customer-led innovation. Shifting from micro to macro sociological perspectives,
Cove and Svanfeldt (1992: p.305) have also described the notion of a societal
innovation as the "result of an encounter where the culture and competence of the
firm perfectly match the current status of society." Clearly there are many definitions
of innovation depending upon what one is addressing – social, technical or
organisational phenomena.

Abernathy and Utterback differentiated incremental from radical innovation as early as
1978,314 while Porter in 1986 illustrated a similar concept, that of continuous and
discontinuous technological change.315 Other authors have defined incremental vs.
breakthrough innovations (Tushman and Anderson)316 and conservative vs. radical
innovations (Abernathy and Clark).317 From an industrial perspective innovation is a term which represents all
the activities of bringing a new product, process or service to the market (Clipson,
1991). Such activities usually involve the generation, acceptance, and
implementation of these products, processes and services (Thomson, et al., 1969).
Nyström (1990) shifts to the cognitive arena when he sees it as 'bringing new ideas to
use'. Innovation, then, as addressed in the multitude of definitions, is clearly a
process tightly bound with the concepts of use and utility. However, it does vary
depending upon if one is a producer or products, or a consumer-user. It does not
appear as something intrinsic to a product.

Robertson (1967, 1971) delineates three types of innovations: the continuous, the
dynamically continuous, and the discontinuous. The continuous innovation causes
little disruption in behavioural patterns and involves the introduction of a
modification of an existing product, while a discontinuous innovation is a new
product that requires the establishment of new behaviour patterns. Dynamically
continuous innovations lie somewhere in between. Henderson & Clark (1990) define
a radical innovation as one which changes both the "core concepts" (e.g. components
or basic technologies) and the linkages between these concepts (that is, the
architecture of the product).

In many respects the position adopted here would question whether there are such
things as 'radical' or 'discontinuous' innovations318, preferring instead to advance the
314
Abernathy, W.J. and Utterback, J.M. – Patterns of Innovation in Technology, Technology Review
1978
315
Porter, M.E. (1987) "From Competitive Advantage to Corporate Strategy", Harvard Business
Review, May/June 1987, pp 43-59.
316
Michael L. Tushman; Philip Anderson. Administrative Science Quarterly, Vol. 31, No. 3. (Sep.,
1986), pp. 439-465
317
Abernathy W. J., Clark K. B. (1985) “Innovation: mapping the winds of creative destruction”, in
Research Policy, n. 14, pp. 3-22.
318
Henderson & Clark also define a further category - architectural innovations. If a radical
innovation changes both the "core concepts" (e.g. components or basic technologies) and the linkages
between these concepts (that is, the architecture of the product), the architectural innovation "only"

notion of emergence. Most technologies have qualities that relate them to established
functions and features of other products, and to already established practices. At the
point of use, interaction denotes the juncture where nature or the designer enters the
worldview or environments of the user, the sites of implementation, and the user
reflexively enters the worldview or environments of nature, the designer or the
service provider. A key question in the present study is: how much of this relation
between design and production, and use and consumption, is it possible to discover
through simulacra, representations, trials and prototypes? These are contrived, to
an extent planned and logically assigned and organised. They are not emergent but
planned and strategic, yet nevertheless aimed at eliciting relevant
feedback, relevant data, for improved design iterations. What are the advantages
and limitations of technologically informed reports of consumption, such as those
obtainable through system logging of use and usage?

The technologies and techniques of prototypes, trials, scenarios, are the means by
which firms can merely explore possibilities with users. They, along with means to
capture appropriate user feedback, represent a kind of bridge between these worlds,
the means through which 'cultures of production' can be tentatively melded with
'cultures of use'. The aim is to reduce the risk of commercial failure by developing
the opportunity to iterate upon design flaws and weaknesses before committing to the
considerable costs of manufacturing and production. But the quality of feedback is
only relative to the approaches used to capture users' thoughts, feelings and actions
regarding the product, and of course to how representative the demonstrator prototype
is in mimicking the fully operational system in place, which promotes learning at the
point of most information – access and emergent use.

Knowledge of antecedents

As Philip Abrams (1982: 8) eloquently put it: “it is a matter of treating what people do
in the present as a struggle to create a future out of the past, of seeing the past not

changes the linkages between core concepts, "recycling" the elements. (These two types of
innovations are compared to yet other types, modular innovations and incremental innovations, which
are not expected to have disruptive effects on already existing industry structures).

just as the womb of the present but the only raw material out of which the present
can be constructed.”319 The biologist Geoffrey Scott, writing at the beginning of the
century, suggested that "things are intelligible through knowledge of their
antecedents" (Scott, 1914: p.168), while George Herbert Mead (1936) viewed that
human action takes place in a present that opens on the future. It is in terms of the
emergent present and impending future that the content and meaning of the past are
determined. Human acts are teleological rather than mechanical; thus, as Strauss
(1964) indicates, Mead's teleological evolutionism permits him "to challenge
mechanical conceptions of action and the world and to restate problems of autonomy,
freedom and innovation in evolutionary and social rather than mechanistic and
individualistic terms." (p.xviii)

The process of acknowledging antecedents must apply, to varying degrees, where
designer-producers and consumer-users come to apprehend and anticipate
the value of proposed and actual products. Acknowledging and referencing
antecedents acts as the 'departure point' for comparison. The development of this
knowledge is certainly more than a simple linear process, a straightforward
transmission of value and meaning 'downstream' to awaiting non-discerning and
passive individuals. As previously stated, there are both shared and disputed
meanings across value and distribution chains. They exist from the very earliest
stages of product ideation through to the final disposal of the product, or the
discontinuation of providing or using a service. They arise from a 'Janus-faced'
calculus of what came before, the pre-ordained associated benefits and problems,
contrasted with what is promised in the present and future.

Under the influence of what Steadman (1979) refers to as 'historical determinism',
every level of social strata, from the personal, individuated self to the policy and
regulatory activities of standards committees and governments, identifies opportunities
and makes forecasts based upon historical or experiential reference. It is only by
looking back that we look forward, proceed, and consider what can be changed to
319
Abrams, P. (1982) Historical Sociology (Ithaca, NY: Cornell University Press).

improve matters. Government regulatory bodies often play 'catch-up' in matters
of technological innovation, although this was not the case in the present study, where
the Independent Television Commission (ITC), a regulator and strategic technology
developer for UK commercial TV, was involved early in the project.

While the Janus-faced spirit is infused into the processes of iteration, planning and
design, it crystallises (following Schein's (1985) conceptualisation of culture detailed
earlier) most ostensibly in the more tangible attributes of a technological product.
This means especially those aspects which characterise it in a developer's or
consumer's mind through its innovation and diffusion into markets and homes. As
McLuhan had it, a new medium reveals "the lineaments and assumptions, as it were, of
an old medium." (1960: p.567) However, the features that it is given, and the
functions that it is assigned, are not fixed but are open to misrepresentation,
reinterpretation or even customisation (Pinch and Bijker, 1989; Orlikowski, 1992;
Westrum, 1991). For instance, which of its components, aspects and ingredients have
been truly based upon previously established phenomena, practices, beliefs, ideas
and/or artefacts? How does this compare with those which are recognisably new and
novel? To whom, and under which conditions, does it represent, constitute and
define itself as an innovation?

Pre-history of interactive television

To set some historical context for the core subject of the present study: the three
decades leading up to the Cambridge Interactive Television Trial witnessed a
succession of technical experiments whose main purpose was to make television an
interactive experience which would be valued by the public. These
experiments had a common interest – the creation of technology and business
infrastructures, bound to types of programming which would augment the basic
broadcasting model. The object was to enable new channels for the sale of goods,
services and information, and to test out and realise new markets.

The idea of two-way television may be traced back to the earliest test transmission of
a two-way 'videophone' conversation in the 1920s, or even to Logie Baird when he
transmitted two-way pictures over standard telephone lines between London and
Glasgow in the late 1920s (Wheen, 1985). Carey (1996) and Carey and O'Hara
(1985/1995) trace the notion of interactive television programming back to Winky
Dink – a programme on American children's television shown in the 1950s. A 'low-
tech' solution, children placed a protective screen over the set to draw upon when
requested by the show's host. Ted Nelson is most often noted today for
originating the term hypertext, but much of his 1974 book Computer Lib/Dream
Machines was devoted to the utopian, liberating possibilities of the CRT as a
graphical display for computers. Dream Machines prophesies future applications of
computers with screens as playthings and tools of communication, “to help people
write, think and show.”

“…a book, a TV show and a penny arcade…a vast tapestry of information, all
in plain English (spiced with a few magic tricks on the screen), which the
reader may attack and play for the things he wants, branching and jumping on
the screen, using simple controls as if he were driving a car…A person is
writing to other people, just as before, but on magical paper he can cut up and
tie in knots and fly around on. (p.58)320

In 1972 the FCC in the US dictated that all new cable TV systems should have the
capability to provide two-way communications, and in an early paper on interactive
television Buckelew and Penniman (1974) outline six contemporary i-Tv
'experiments' of the time. It is also interesting to note that their paper promotes
discussion of 'social implications':
"Social, economic and political implications of importance are yet to be
considered. Along with the further speed and ease with which daily life can
be conducted with the help of electronic media looms the possibility of
monitoring the user, with its concomitant danger to privacy . . . But for good

320
Computer Lib/Dream Machines can be a challenging book to cite, as it has two halves with
separate pagination, and it underwent multiple editions. I am citing page numbers parenthetically from
the Dream Machines half of a volume which is both the “first edition” and the “Ninth printing,
September 1983,” which contains some post-1974 pages.

or bad, the technology of interactive television is available today, and will be
used." (p.54)

The Buckelew and Penniman paper highlights the recursive nature of fears regarding
media when it takes on 'new powers'. There is definite interest in its 'societal
effects',321 privacy continues to be a thorny issue with regard to digital networks,322
and the negative influence of mass media on the violent disposition of vulnerable
members of society is a continual source of public concern.323 Indeed, and of some
note, much of what was being trialled in the experiments that they cite has come to
be trialled in subsequent attempts at producing i-Tv services, or indeed within the
Internet.

Non-disclosure agreements and confidentiality clauses [and patents] often stifle
learning between firms, trials and projects, and strategically this is precisely what
they are meant to do. They may do so to incite publicity. In many cases 'what
really happened' within the unfolding of a trial only comes to light some time after a
person leaves a company, or the company folds, changes hands or the contract
becomes null. The result is that learning is hindered at an industry sectoral level, in
the same way that early engineers and craftsmen were hindered by the lack of
professional societies and guilds which could facilitate the sharing of knowledge.
Any accumulation of the knowledge necessary to perpetuate 'path-dependencies',
'critical inflection points', 'paradigm shifts' or 'bifurcations' does not occur. This of
course is more damaging to 'network-based' operations such as the Cambridge trial
than to traditional businesses making 'stand-alone' products. Such operations require a
wide subscription to their technology in order to be successful (Noble, 1982; Kelly,
1997).

Throughout the 1990s there was a proliferation of attempts to make television
interactive. Many of these sought, as did the Cambridge Trial, to exploit recent
321
They cite that the two-way capability of this system would lead to revolutionary new public and
private services including; remote diagnosis and prescriptions; democratic interaction with local state
and federal government; education; pay cable-TV; merchandising; advertising testing; audience
surveys and polls; credit card verification; security systems.
322
See Burke (2000) for a recent overview of these fears with particular regard to i-Tv.
323
See Barker and Petley (1997) for an overview of the ‘video violence debate’, and the institution of
the ‘moral panic’ within society.

advances in digital technology and networks. Digital technology enabled the
transmission of ever more sophisticated forms of multimedia data. A consultancy
report from 1996 cites some 135 trials worldwide of interactive video services.324
Another, cited in the Financial Times publication Multimedia Business Analyst, also
suggests the wide diffusion of various kinds of interactive television services:
"Interactive television is expected to enter the commercial phase of its development
by mid-1996."325 This forecast for commercial i-Tv was that near video on demand
(NVOD) would lead home shopping, educational i-Tv shows and online games, and
home banking, in that order, as they 'rolled out' over 1997. It predicted that full
Video on Demand (VOD) would not appear until early 1999.326

The various trials involved various kinds of technologies, including
combinations of telephone, PC, and broadcast communications infrastructures. They
included services which in appearance looked more akin to what we would recognise
in the UK as Teletext and Ceefax, through various levels of technical and
presentational sophistication, to systems providing VOD or NVOD. All have
implications regarding bandwidth – the amount of data that can be communicated
through an infrastructure. Teletext services demand the least bandwidth,
and full video-on-demand (such as was used on the Cambridge Trial)
requires the most.327
324
“Interactive Services-realistic Expectations, An Analysis of Video on Demand, Video Dialtone,
The Internet, On-Line Services and their Applications” Dittberner Associates, Inc. Bethesda, MD.
1996
325
Interactive TV “to reach 40% of European homes in 10 years.” Multimedia Business Analyst Vol.1.
No.12 , 5th April 1995
326
This was the industry consensus on when selected services will reach at least 5% of European
households.
327
Historically the term bandwidth was used by radiocommunications engineers to refer to the amount
of radiocommunications spectrum available or necessary for carrying an (often analogue) signal for a
particular purpose. For example, a telephone call normally uses of the order of 4 kHz of bandwidth;
a television signal requires 7 MHz of bandwidth. With the advent of digital communications systems,
and in particular the Internet, the term bandwidth has become capable of different meanings. Here it has
been used more generally to refer to the measure of throughput capacity of a given communications
network link or transmission protocol. In relation to digital transmission of data, the amount of
bandwidth between sender and recipient determines how much data can be transmitted per unit of
time. It is measured in bits per second (bits/s), or Kbits/s, Mbits/s and so on. A typical residential
modem, for example, may transmit in the range of 28.8 Kbits/s through to 56 Kbits/s. Assuming there
were no other impediments, this would determine the rate of flow for the data being sent. In the case of
larger businesses, data connections might operate at 2 Mbits/s, 10 Mbits/s or higher transmission
rates. ISDN stands for Integrated Services Digital Network, and provides two 64 kbit/s digital
communication channels and one 16 kbit/s channel for signalling. ADSL (Asymmetric Digital Subscriber Line)
provides downloads of around 2 Mbit/s over regular copper telephone cable.
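As a back-of-the-envelope illustration of how these figures translate into delivery times, the following minimal sketch (in Python; the 5 MB clip size and the particular selection of link rates are illustrative assumptions, not measurements from the trial) computes the ideal transfer time of a multimedia clip over the kinds of links mentioned above:

# Illustrative only: how the bandwidth figures above translate into transfer times.
# The example file size (a 5 MB clip) and the link rates are round numbers chosen
# for illustration, not measurements taken from the Cambridge trial.

LINK_RATES_BITS_PER_SEC = {
    "28.8 kbit/s modem": 28_800,
    "56 kbit/s modem": 56_000,
    "ISDN (2 x 64 kbit/s)": 128_000,
    "2 Mbit/s leased line / ADSL": 2_000_000,
    "10 Mbit/s business link": 10_000_000,
}

def transfer_time_seconds(size_bytes, rate_bits_per_sec):
    """Ideal transfer time, ignoring protocol overhead and congestion."""
    return (size_bytes * 8) / rate_bits_per_sec

if __name__ == "__main__":
    clip_size_bytes = 5 * 1024 * 1024  # a 5 MB multimedia clip
    for name, rate in LINK_RATES_BITS_PER_SEC.items():
        print(f"{name:30s} {transfer_time_seconds(clip_size_bytes, rate):8.1f} s")

The same clip that takes roughly half an hour over a 28.8 kbit/s modem arrives in a few seconds over a multi-megabit link, which is why the choice of infrastructure constrains which interactive services are feasible.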

The technology of interactive television

If one considers the purely functional dimensions of recent media technologies there
are several basic components. Pavlík (1996) suggests a way to map emerging
forms of media by their primary technical functions of production, distribution,
display and storage. In the case of i-Tv these can be broken down into the
technologies that produce content material and interfaces (authorware, production
equipment, films, documentaries, etc.). Distribution is enabled by the
communications infrastructure, which enables content to be delivered to homes
(switches, cables, satellites, etc.) and audiences to feed back data (the
'backchannel'). Connectivity is a determining factor in the success of interactive
services. Display is enabled by the technologies that present content (the STB,
decoders, electronic programme guide, interface, etc.), and storage by the servers.
Broadband interactive television systems, such as the one used in Cambridge,
are most often associated with the following technologies:
Set top box (STB) - these can be intelligent (i.e. capable of processing
information themselves) or dumb (i.e. only decoding signals and not
processing any information themselves). An addressable communications box
is needed to decode the signals as they arrive at the television; depending on
the system used it may also need to perform functions such as the
decompression of the digital signal, or the handling of the return path.
Remote control and navigation system - users need a friendly interface to find
their way through all the services offered and communicate their
requirements to the central Control System. Om provided the STB for the
Cambridge Trial.

Communications infrastructure - this is the means by which signals are
exchanged with the Set Top Box and includes the phone line (plain old
telephone system - POTS), the integrated services digital network (ISDN, a
digital means of transferring information over phone lines), ADSL - Asymmetric Digital
Subscriber Line, cable (with or without supporting technologies such as ATM
- Asynchronous Transfer Mode, and/or cable modems), and satellite (both
digital and analogue). Transmission system - high-speed links are required to
deliver the vast amounts of information in a timely manner. For the return
path in a fully interactive system there needs to be a signal going from the
user to the Control System carrying the user's requests. Cambridge Cable
provided the cable infrastructure for the trial; Advanced Telecommunications
Modules Ltd. (ATML) and SJ Research provided the switching technology.

Head-end technology - this comprises the servers and gateways that lie at
the service provider's end of the communication network. It can include
system architectures which connect to other remote servers (e.g. those of
service providers, banks, warehouses, etc.), or to satellite reception
equipment. Storage hierarchy and control systems - even
compressed videos require enormous amounts of storage space, and the control
system must be able to service all the requests coming in; these technologies
allow for the sorting of requests for data. On the Cambridge Trial ICL provided
the servers.

Content - by this is meant any form of source material: movies, games, news,
images, sounds, etc. which will appear on the user's television or PC screen.
This content may be provided by those in charge of the head-end technology
or by a number of other companies, which may rent space
on the head-end server, or may have user-consumer requests forwarded to a
host server of their own at a specified geographical location. Compression -
the capabilities required by most of the services can only be achieved effectively
by using digital technology; systems are required to convert the analogue signals
to digital and store them in a highly compressed format. The authoring software
for the Cambridge Trial was provided by Acorn. The content was provided by
a range of media companies including ITN and Anglia Television. In addition,
interactive advertisements were to be provided by BMP DDB Needham;
online surveys by NOP; online banking by NatWest; groceries by Tesco; and
a range of other services by other firms.

In addition to the above basic technology there is a wide range of other technologies
which support the system's operation and function. These include digital
encoding/decoding technologies (for creating content and interfaces), compression
technologies (to reduce the bandwidth used in the transmission of bandwidth
intensive video and so forth), operating system technology and subscriber
management (sophisticated systems for administration, billing and encryption will be
required to ensure that the users pay for the services they use and that copyrights are
preserved).

More specifically, some of the important issues that have to be addressed when one
is designing a multimedia system are:

 Storage organization and management.


 Available physical bandwidth in the delivery path to the users

 Quality-of-service (QoS) management (real-time delivery and adaptability to
the environment – the term QoS is general; it implies all of the properties that
are required for a flawless end-to-end service, i.e., delivery of information
isochronously, so that the end user experiences a continuous flow of audio
and/or video without interruption or noticeable problems of degraded
quality – see the sketch after this list).
 Information management (indexing and retrieval).
 User satisfaction.
 Security (especially management of content rights).
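To make the idea of isochronous delivery concrete, the following minimal sketch (in Python, purely illustrative: the frame rate, buffer sizes and the probability of a frame arriving late are invented numbers, not parameters of the Cambridge system) shows how a small playout buffer at the receiving end can absorb network jitter so that the viewer sees an uninterrupted stream:

# Illustration of one QoS idea from the list above: isochronous playout.
# Each display interval one frame is consumed; on average one frame also arrives,
# but sometimes an arrival slips into the following interval (jitter).  A small
# playout buffer absorbs this; if it ever empties, the viewer sees a glitch.
import random

def simulate_playout(n_frames=1000, start_buffer=5, late_probability=0.2, seed=1):
    """Count display intervals where no frame is available (visible glitches)."""
    random.seed(seed)
    buffered = start_buffer
    pending_late = 0
    glitches = 0
    for _ in range(n_frames):
        arrivals = pending_late          # frames that slipped from the last interval
        pending_late = 0
        if random.random() < late_probability:
            pending_late = 1             # this interval's frame arrives late
        else:
            arrivals += 1
        buffered += arrivals
        if buffered:
            buffered -= 1                # the display consumes one frame
        else:
            glitches += 1                # the viewer would see a stall
    return glitches

if __name__ == "__main__":
    for buf in (0, 1, 5):
        print(f"start buffer {buf}: glitches =", simulate_playout(start_buffer=buf))

The point of the sketch is simply that a few buffered frames trade a little start-up delay for continuity of playback, which is the essence of QoS management for streamed audio and video.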

A chart of the Cambridge system is detailed below.

Fig. 1.4 A basic schematic map of the Cambridge trial (from Om literature)

There are systems which claim to offer interactive television but which do not use any of
the above technologies. OKTV, which ran an interactive service on Scottish
television, utilised a much simpler system incorporating a remote control
device capable of sending coded tones down an ordinary telephone line. This was
used in conjunction with teletext, and with this system they offered games and
competitions, as well as additional television programme information.

Digital interaction

Returning to the key theme of emergence in behaviour and technology - 'interaction'
has a more limited, and more obvious, definition when considered within the boundaries of
the (hard) designed or built environment. This contrasts with interaction with (the
soft) cognitive or social structures. The architect James Marston Fitch comments that:
". . . every time the architect or urban designer erects a wall or paves a street,
he intervenes in the behavioural modes of the population of that space. The
consequence of his intervention may be major or minor, benign or malignant;
they will always be real." (Fitch, 1972: p.163)

What Fitch is saying applies as much to the field of digital communications and
technological design as it does to architecture. For instance, Steuer (1992) proposes
'interaction' as the degree to which users of a medium can influence the form or
content of the mediated environment. Rogers (1995) defined interactivity as the
degree to which participants in the communication process can exchange roles and
have control over their mutual discourse. And Jensen (1999: p.26) notes that
'interactivity' has recently become something of a buzzword, a term that sexes up
interaction with digital devices and networks. But regardless of the 'myth' of
interaction, applied to digital networks and services it does imply re-configuration
not only of the technical resources needed to support it, but also of the way in which
businesses organise to relate programming, services and products to audiences,
subscribers and customers.

Rafaeli (1988) offers a definition of interactivity that recognises three pertinent levels: two-way
(non-interactive) communication, reactive (or quasi-interactive) communication, and
fully interactive communication. Bordewijk and van Kaam (1986: p.19) outline four
possible combinations of information traffic patterns – transmission, registration,
consultation and conversation (see below). These largely straddle the gap that exists
between agency and structure with respect to the possibilities of interactive media.

Table 1.2 Combinations of information traffic patterns (after Bordewijk and van Kaam 1986)

                                   Information issued by            Information issued by
                                   broadcaster/service operator     user/consumer

Content controlled by
broadcaster/service operator       Transmission                     Registration

Content controlled by
user/consumer                      Consultation                     Conversation

The existing broadcast model typifies transmission. This is where the choice of
material, ownership of the material and the times at which it is broadcast are within
the jurisdiction of the broadcaster or service operator. All consumers receive the same
information, and they receive it synchronously. Conversation is best represented
technologically by two-way radio or the telephone. This is where two consumers decide,
by consensus, the control over content and the times and duration of communication.
These two modes have been well served by traditional studies of the media, as they
represent the established media institutions of broadcasting and telecoms. Consultation
is best illustrated when the broadcaster/service provider owns the material, and delivers
it 'on demand' at the request of the consumer. The final communication channel –
registration – covers those ways of researching and sampling the views and
behaviours of the user/consumer where the broadcaster/service provider presents a
questionnaire survey or vignette online with a request for feedback. Alternatively,
the consumers' interactions with the system may be tracked, logged and analysed to
infer various preferences and so on.
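Read as a two-dimensional classification, the matrix can be encoded as a simple lookup. The sketch below (Python, purely illustrative; the example services in the comments are my own glosses on Table 1.2) pairs who issues the information with who controls it to yield the traffic pattern:

# The Bordewijk and van Kaam matrix as a lookup: who issues the information and
# who controls it jointly determine the traffic pattern.  Labels follow Table 1.2.

PATTERNS = {
    # (information issued by, content controlled by): pattern
    ("broadcaster", "broadcaster"): "transmission",   # e.g. scheduled broadcast TV
    ("user", "broadcaster"):        "registration",   # e.g. surveys, usage logging
    ("broadcaster", "user"):        "consultation",   # e.g. video-on-demand, teletext
    ("user", "user"):               "conversation",   # e.g. telephone, online chat
}

def classify(issued_by, controlled_by):
    return PATTERNS[(issued_by, controlled_by)]

if __name__ == "__main__":
    print(classify("broadcaster", "user"))   # -> consultation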

Fig. 1.5 The possible variations of media casting

It is obvious that interaction is dependent upon whatever there is available to interact
with. From an interactive design perspective this poses a problem: how does one
optimise interactive design to cater for users who may expect the same myriad of
interaction possibilities that they have in the real world? For example, an early
interactive movie-game which used multiple laser disks, Dragon's Lair
(1983), needed a total of 27 minutes of animation to provide an interactive experience
that lasted for a maximum of 6 minutes (Hunter), and even then the gameplay
consisted of little more than making a handful of left-or-right decisions about which
direction the protagonist should move. Games like Dragon's Lair can be described as
high data intensity, low process intensity games: they shuffle around a lot of data (the
animated video clips that the player triggers through making choices), but do not
have very complex procedures of play or rulesets (the only 'rules' involved are those
that determine which video clips are played when the player moves the joystick).
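The distinction can be made concrete with a toy sketch (Python; the scene names and clip titles are invented for illustration and are not taken from the actual game): the entire 'ruleset' of such a game reduces to a lookup from the current scene and joystick move to the next video clip, while the clips themselves account for almost all of the data.

# A toy model of a "high data intensity, low process intensity" game in the style
# of Dragon's Lair.  Scene names and clip titles are invented for illustration.
# The whole ruleset is one lookup table; the bulk of the product is the video data.

RULES = {
    ("drawbridge", "left"):  ("moat", "clip_fall_in_moat.mpg"),
    ("drawbridge", "right"): ("gatehouse", "clip_cross_bridge.mpg"),
    ("gatehouse", "left"):   ("dungeon", "clip_descend_stairs.mpg"),
    ("gatehouse", "right"):  ("moat", "clip_trapdoor.mpg"),
}

def play(scene, moves):
    for move in moves:
        # Any move not covered by the table plays a generic failure clip.
        scene, clip = RULES.get((scene, move), (scene, "clip_death.mpg"))
        print("play", clip, "-> now at:", scene)

if __name__ == "__main__":
    play("drawbridge", ["right", "left"])

The "process" is nothing more than this lookup; the interactive experience is carried almost entirely by the stored clips.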

This brings us again back to the realms of the user, and the need for designers and
other kinds of planners and forecasters to understand not only users (fixed
representations) but use implications (dynamic and emergent processes):
"In analyzing and designing systems and software we need better means to
talk about how they may transform and/or be constrained by the contexts of
user activity: this is the only way we can hope to attain control over the
design of useful and usable systems." (Carroll, 1994: p.29)

This would certainly reduce to two major concerns: how interaction is represented
to the user prior to any form of interaction, and how the interactive experience is
channelled and guided by the interaction possibilities, the navigation and the
semantic structure of the design.

While the interactive user-audience are intended to continue to make meaning from
the types of text that have distinguished traditional content material (graphics, video,
type - the media text), they are further being asked (through the new technology and
functionality) to accommodate new forms of media vocabularies. These are the new
ways in which media information may now be encoded - enabled through the
addition of interactive design and information architecture. From the use perspective
this includes the new ways in which they decode these vocabularies and texts. This is
embodied in use practices such as navigation and accessing. In interactive media the
assimilation of 'messages' (the media texts themselves) is not only set by what is
presented (i.e. shown on the television screen or newspaper), but also by new
gestalts of interaction and experience.

The production (or encoding) of these comprises several major avenues of design
activity and thinking. Mok (1996) has classified these new media design dimensions.
They include information design (which relates very much to the 'vocabulary' of
how content is put across - i.e. the use of a graph rather than a table); interactive
design (i.e. how it operates, or what happens when you press a button or link on a
screen); and information architecture (the paths by which the user can move
between pages and items of information). What may be termed the interactive
choreography, the way in which these elements work together, provides the
consumer-user's experience of interactivity through using the system, and may be
understood as the experiential innovation aspect of new media.328

The notion of information and interactive design is important. In many ways it
augments and complements technical prowess in a way that has previously been
unrealised. It can make or break content or a service. While multimedia screens often
show aesthetic similarities to televisual graphics, the addition of interactivity
suggests new needs to order information in ways which will not upset navigation
(between screens and menus), the 'flow' of reading or viewing content material, or
its interpretability (Williams, 1974). Good information design can economise upon

328
The notion of the ‘experiential innovation’ was hinted at in chapter one in the discussion of what
differentiated the telephone as a new or radical innovation. There it was argued that it permitted the
communication of human speech and discussion over distances. In terms of use it was an innovation
which was essentially more usable than existing methods of communication such as the telegraph, which
had already been used domestically but required training (in Morse code) to transmit and receive
messages. The telephone's usefulness was a social innovation which unfolded over a considerably
longer period than its innovation as a technology. An interesting point is that the aspect of new media
which constitutes a radical innovation, or departure, in the way it operates is the unique way in which it
displays and presents information, as well as the way the user-consumer must access it - namely, the
interactive elements.

digital storage and processing power, and makes the major contribution to a satisfying (or
perplexing) user experience (Mok, 1996).

The debate regarding the primacy of action or structure in sociology is reflected in the
contrast between non-interactive and interactive media. The former is pre-occupied with the
individual, considering them as purposeful historical actors, individualist and self-
directing, who select from a series of possible interaction (or consumption)
experiences because of their 'meaning' to them (i.e. favourite television
programmes). The latter places more emphasis on the environmental structure as
laying down the conditions of, and environment within which, choice, or even reality, is
absolutely pre-figured and pre-constructed. The reality, though, is that every
encounter with an 'open' system is interactive; everything is experienced anew, but
within schemata and frames of thinking, rather like McLuhan's 'forward through the
rear-view mirror' concept. McLuhan held that humans are still evolving relative to
their use of tools, and their exploitation of the world of other people, ideas, and
environmental control (Benedetti and DeHart, 1996).

The television audience - from 'viewers' to 'consumer-users'

The television 'audience' is a homogenisation of a group that tends to be very diffuse
not only in terms of its sociology, but also in terms of its range of action – that
is, put simply, what people are really doing or thinking about when content material
is broadcast to them (Moores, 1993). Such homogenisations simultaneously produce a
problematic restatement of the underpinning binaries of text/audience and
producer/receiver. These binaries are inadequate because ICT networks increasingly
involve actors who do not 'use' as earlier audiences used to 'watch'. The institution of
commercial research into the audience – that which produces programme ratings - tends
to obfuscate, or stop short of, any deep understanding of basic questions such as who is
watching what? Why? To what kind of effect, result or possible end? The emphasis is on
providing schedulers and controllers with some form of index of how 'likeable' or 'watched'
their programmes were in relation to others. It has been said that the 'audience' is

reification (programme makers, audience research organisations, advertising
agencies, media buyers etc.).
" . . . the institutional organization of the industry seems designed not to enter
into active relations with audiences as already constituted trading partners,
but on the contrary to produce audiences - to invent them in its own image for
its own purposes." (Hartley, 1987: p.134)

Media researchers, especially those coming from reception research, media
ethnography, and media and cultural studies, have for some years now rejected the
perception of 'texts' as passive experiences. They have emphasised that the acts of
'reading' texts are not passive activities but rather processes of active interpretation.
For instance, Halloran (1970: p.20) asserts that:
" . . .we think in terms of interaction or exchange between medium and
audience, and it is recognised that the viewer approaches every viewing
situation with a complicated piece of filtering equipment."

But some commentators feel that this level of activity (or interactivity) may differ
between individuals, and that a notion of a universally active audience may be
erroneous.
"Activity depends, to a large extent, on the social context and potential for
interaction. Elements such a mobility and loneliness are important. Reduced
mobility and greater loneliness, for example, result in habitualized media
orientations and greater reliance on the media. Attitudinal dispositions such
as affinity and perceived realism are also important. Attitudes filter media
and message selection and use. . . These attitudes, which result from past
experiences with a medium and produce expectations for further gratification-
seeking behaviour, affect meaning." (Rubin, 1994: p.427)

Most significant to the present study was that the Cambridge system represented new
ways in which service and content providers, including advertisers, could research
and understand the television 'audience'. The very notion of the 'audience' as a fixed
phenomenon of study is something which has been brought into contention by
commentators such as John Hartley (1987) and Ien Ang (1991). These writers and
others have stressed that it is an overly convenient classification for what is in reality
a very heterogeneous and complex phenomenon. This forms part of the challenge to the
notion of the 'mass' as applied to society in general, and more specifically to
media and audiences.329

It denies the richness of what researchers such as Morley (1980, 1986, 1992) view as
constituting either the act of, or socio-cognitive contexts of, the television viewing
experience. The notion of 'audience' appears as a misnomer when applied to the
realm of interactive media. Interaction by its denotation suggests action. Some of this
will be more akin to typical 'reading' styles of media texts – i.e. when one is
interpreting instructions for use, or a piece of written information, or in watching a
digitised movie clip or animation – and some of it will be the pressing of buttons and
the registering of actions.

A direct artefact of anyone using digital networks - whether this is manifested as key
presses, the ordering of certain goods, menu choices of particular content, or simply
navigation through menu options and pages - is that each action is always registered,
and may be recorded, by the system. This form of system-logging of user activity
ensures that the very existence of people within such networks displaces and alters its
constituency in one sense or another at every given moment in time of its use. Each
action is telematic, extending the individual user through time and space, initiating
further actions and interactions throughout the technical, social and commercial
system. Each input registers as further data, extending beyond the goal-orientated
action promoted by the interface to create effects and outputs, or otherwise change
the state of the system, in some pre-figured and pre-determined way.330

Appropriate software can convert such registered data into information about the
online behaviour of particular households or even individual users. If this is
combined with information derived from other sources, registration details, online

329
Daniel Bell (1960) recognised the fuzzy nature of the category ‘mass society’: “What strikes me
about these uses of the concept of mass society is how little they reflect or relate to the social relations
of the real world.” (p.25) The notion of ‘mass’ was a post-second world war construct (i.e. Blumer,
1939). In contrast with other social formations such as the group, and the public, the ‘mass’ was
composed of members who were acted upon, lacking self-identity and self-awareness, and incapable
of acting together in any organised way (compared with the other social categories of the group, or the
public). However, as Raymond Williams (1961: p.289) has it “there are no masses, only ways of
seeing people as masses.”
330
Of course this excludes conversational aspects of system usage where one may have a one-to-
one chat with someone online.

surveys, face to face interviews, usability tests and so on, the result can be a
relatively rich picture of individual users, even with respect to the relation between
online and offline activity. Ideally such research methods can inform business
strategy, logistics, or making other kinds of inferences which can feed into iterations
on the technical system design, suggest extension of the network, and drive
innovation of content and services.

In contrast to traditional modes of viewing televisual material, i-Tv audiences are
able to feed back, or engage at some level in online dialogues with, broadcasters,
system operators, advertisers, retailers, service providers and other users of the
system. As suggested above, interactive media systems can provide for many
dimensions of communication - both implicit (through the monitoring of their
'button-pressings') and explicit (answering questionnaire requests online). Viewers
for instance can purchase goods and services online, select from assorted menus of
stored programme material, answer online surveys or interactively play games with
one another.

The position adopted throughout this book is that such technologies do constitute a
particular realm of problem for design and designers. But they also pose particular
problems for a new emergent breed of proactive television 'viewer' – the consumer-
user.

Chapter discussion

Interaction is cited as the defining characteristic, attribute, feature and function of the
new era of business and digital communications. But interaction is an ancient human
endeavour closely linked with exploration, craft, design and purpose, as much as
control, constraint and management. While we may 'interact' with what is made
available to our sensory systems at every moment in time, digitally enabled and
mediated forms of interaction have a more limited definition, their possibilities
relative to pre-existing forms of doing or accessing things, and to that which is made
available, accessible or even known to the consumer-user.

Making television interactive is suggested as giving one 'more choice' and 'more
control' over what is presented on, and beyond, the screen. In a letter to the
Journal of Design and Design Theory (Vol.2, No.1), the interactive media
designer Heiner Jacob is, unlike many technology developers and pundits, quite
critical regarding the state of many so-called 'interactive' products and technologies:
"Most so-called "interactive" media products only pretend to be interactive.
They are at best "multiple choice" machines and therefore as interactive as a
cigarette vending machine or a TV's remote control." (Jacob, 1997: p.155)

In many respects Jacob echoes Baudrillard when he notes that personalisation is only
relative to what is made available to consumers, giving them the sense of free choice.

"No object is proposed to the consumer as a single variety. We may not be


granted the material means to buy it, but what our industrial society always
offers us a priori as a kind of collective grace and as the mark of formal
freedom, is choice. This availability of the object is the foundation of
'personalization': only if the buyer is offered a whole range of choices can he
transcend the strict necessity of his purchase and commit himself personally
to something beyond it. Indeed, we no longer even have the option of not
choosing, of buying an object on the sole grounds of its utility, for no object
these days is offered for sale on a 'zero-level' basis. Our freedom to choose
causes us to participate in a cultural system willy-nilly. It follows that the
choice in question is a specious one: to experience it as freedom is simply to
be less sensible of the fact that it is imposed upon us as such, and that through
it society as a whole is likewise imposed on us. Choosing . . . may personalize
a choice, but the most important thing about the fact of choosing is that it
assigns you a place in the overall economic order."(1968: p.141)

Baudrillard speaks here of 'choosing' as personalisation, a keyword now used
alongside others (such as 'customisation') in interactive media and modern
manufacturing techniques. Today, we certainly have more choices than ever – in the
marketplace for goods and services, as well as in the way in which we can
communicate with one another and with those offering goods and services. Referring
to what he sees as fundamental defining features, Jacob (op. cit.) lays out five
functions that taxonomise true interactive functioning:
 First – access to contents in the user's own time and in the sequence they
wish to have them in; adaptation to their abilities; presenting them with the
depth they want.

 Second – to enable the creation of personalised supplementations, links
and arrangements and tailor the given material to needs.

 Third – it must be smart, i.e. adjust to preferences and correspondingly
change in line with the progress the user makes in learning.

 Fourth – reaction to growing familiarity with the tools and material such
that it sets increasingly tough challenges and therefore stimulates the user
anew each time. The architecture must not be so transparent that users
lose interest – there must be scope for random, spontaneous events that
cannot be planned.

 Finally – For the user to always expect to find out something new about
themselves (and others) in such interaction. All of this is technically
feasible.

In short, i-Tv as a means of media content delivery, as well as the kinds of services it
offered, represented a very distinct departure from the traditional mass media models
of production, broadcasting, watching and using television. But it fell short of
fulfilling the criteria for 'good' interactivity as outlined above. Nevertheless, the
Cambridge system, rather like what is only now being realised by firms
implementing Internet-based operations, represented an immensely potent
technology, a harbinger of social and business change - but only if this change is
made in conjunction with the use and development of technology.

Configuring a system to perform as a functional whole implicitly suggests that such
projects must be to some extent 'technologically determined' at their inception. Each
component part of the system has its own trajectories and precedents. Each part
must fit, and interoperate, with the others. "An organization that is structured to learn
quickly and effectively about new component technology may be ineffective in
learning about changes in product architecture." (Henderson and Clark, 1990:
p.28)331 But at all times, even at this nascent stage, it must be acknowledged that
there must be continual appraisal of 'imagined uses and users' in the implied and
anticipated social spheres of usage. Champions of usability engineering have stressed

331
Henderson, Rebecca M., and Kim B. Clark. "Architectural Innovation: The Reconfiguration of
Existing Product Technologies and The Failure of Established Firms." Administrative Science
Quarterly 35, no. 1 (March 1990)

since the early 1980s the importance of an early focus on users and their
requirements during the early specification stages of product development (i.e.
Gould, 1988; Gould and Lewis, 1983; and Whiteside et al., 1988). This is a process
that must continue as the system diffuses into the sites and situations of use and its
associated marketing is developed.

"One of the basic issues in developing an artifact is the choice of mapping
between the representing world and the represented world (or between the
surface representation and the task domain being supported by the artifact). In
the mapping between the represented world and the representing world of the
artifact, the choice of representation determines how faithfully the match is
met." Norman, Donald A. (1991)332

In terms of services and content, interactive television as it was envisaged in the
Cambridge trial provided a number of services and content options which relate very
strongly to the functionality of existing TV-centric technologies. Many of the trials
of cable-based broadband i-Tv have offered video-on-demand, shopping options,
banking, interactive forms of advertising, and so on; all of which had been tested
many years before in the trials outlined by Buckelew and Penniman (1974), and yet
again in the late 1970s and early 1980s in the QUBE trial (see appendix 2). Many of
these relate to long-established everyday practices, chores and routines - i.e. going to
the video shop, choosing and watching a video. Others bear relation to other kinds of
practice - online shopping relating to mail order shopping or shopping over the
telephone; online banking relating to phone banking; online education relating to
Open University styles of education, and so on. Many of these services already have
their digital counterpart available in some form or another on the Internet.

The digital age has brought with it an almost endemic wave of broad claims regarding
the potency of the new technology to change people and society. As Knights and
Murray (1994: p.41) claim: "A market exists only so much as people believe that it
exists and act accordingly. Similarly, a technological opportunity or constraint only
exists in so much as people believe it to exist." The mass market of television, with
332
Norman, Donald A. (1991) 'Cognitive Artifacts', in John M. Carroll (ed.), Designing Interaction,
Cambridge: Cambridge University Press.

its vast revenue potential, represented a considerable carrot in the face of those
whose business is technology development.

Following Castells' notion that the real distinguishing character of the information
age is the reflexivity inherent between 'interaction', 'experience' and 'learning', these
were realised explicitly, by at least one senior manager, as key prospects within the
development of the Cambridge trial. The trial was understood to offer a unique
prospect of learning. And this learning was to focus upon emergent use and usage of
the new medium and the interaction styles of users, and to provide hints regarding
the new kinds of organisation and business practice which were required to enable
this. The senior manager responsible for content and service development, Marcus
Penny333, saw that there was an opportunity to get users involved with the design of
content and services at an early stage:
". . . the potential here is actually we could get towards a situation where we
can get interaction feedback at a very early stage . . . it becomes possible to
put out a test service and get users involved very early and get them shaping
and tuning the nature of service . . . having them on a continuous basis
interacting with the services and feeding back information that will alter the
way the services is provided . . . the sort of thing that one of the advertising
guys mentioned is that one of their clients were saying 'you guys go out and
research this for a couple of years and come back and tell what we're to
do' . . . that actually won't work as this whole medium is changing the nature
of the way the business works. It allows for the first time interaction with
users right the way back fundamentally into every process of the business . . .
all this becomes possible, actually interacting and then you get some complex
dynamic relationships which have just not been possible."

He considered that i-Tv opened up the potential for instant feedback from users – a
much shorter loop than the ways in which media and services had been researched in
the past. This would provide a radically innovative marketing or product/service
development department with the opportunity to test ideas out on user-consumers,
and the feedback would dictate the adoption of the new product, service or process:
" . . . most businesses are producer businesses somebody sits there in a room
cerebrating, creating something and there is a very, very long chain down to
pushing it out, and the feedback back from users back to here is very, very
imperfect . . . an individual programme producer can create something test it
333
Names have been changed to protect anonymity.

out and get some instant feedback . . . what will that do for the nature of
television?"

Considering a system which is highly dynamic, constantly reactive, and ever
changing - as the Cambridge system indeed was in both a social and technical sense -
it could be said that there would be little opportunity for things to remain stable
enough to make inferences or formulate and ask relevant questions. Penny viewed
that this was symptomatic of much wider cultural change happening across industry
sectors: " . . .we're coming to be in a reflexive world . . . what happens is that you run
the reflexivity and I think it gets to the point of stability, how it emerges is a question
of managing through to that."

A very clear picture emerges of the innovation of i-Tv not being driven by
engineering vision alone. Penny stressed the definite need for feedback, a symbiosis
of developing services and content with inputs derived from the user-consumer's
tastes, interaction styles and choices. He saw that one of the most important elements of
reorientation in this new way of doing business and producing media is that one must
take intimate account of feedback and run one's business ultimately on the basis
of interaction and feedback. Again, while this has characterised many of the claims
of Internet and new economy 'personalisation and customisation' pundits (i.e.
Peppers and Rogers, 1997; McKenna, 1997), it represented the thinking of the time.

But most importantly, it is not the enabling and the conscious manufacturing of
possibilities for 'feedback loops' that will drive the mass consumer markets for i-Tv
and other forms of interactive media. Nor will they alone legitimise the levels of
investment necessary for developing a new form of domestic media. It is only a
relevant technology, bound to a comprehensive set of consumer-attractive
content and services, which will come to augment television's functioning and place
in people's lives. Only when a robust technology and delivery architecture combines
with a comprehensive range of good programming, will the value of 'feedback loops'
emerge as a realistic proposition. They provide the 'fuel' - the real purpose for
interaction. An entertainment system devoid of a robust technology, and devoid of
attractive programming, is like the building of a motorway where there are no cars,

and which has no entry points.334 So it is only with all its facets in place that it will
carry the promise of mass consumer markets - a significant commercial proposition
which legitimises considerable investment. Legitimacy has been defined as a
generalised perception or assumption that the actions of an entity are desirable,
proper, or appropriate within some socially constructed system of norms, values,
beliefs, and definitions (Suchman, 1995: p.574). Part of what formed this particular
construct was the rhetoric of 'easily recognisable' benefits which i-Tv offered over
orthodox broadcast television. But the trial, as symbol and in practice, was also an
attempt to confront new ways of figuring and refiguring the user and consumer in
processes of design iteration. This was on multiple levels:

 In changes to the communications infrastructure, leading to the communications
architecture becoming robust and reliable, capable of consistent operation in line
with its purpose. Trialists would report on breakdowns and any other inadequacies
in operation.

 In changes to content and services. Trialists would report on the quality, depth
and breadth of content and services – was the selection satisfactory? Were there
any obvious gaps in service provision? How did they rate the quality of the
programming offered?

 To refine, explore and develop ways in which firms involved in creating the
system, its services and contents could derive learning from the consumer-user.
An artefact of digital system use is that all activities and communication through
it may be registered. This opens, along with new forms of online questionnaire
surveys and vignettes new kinds of ways of deriving knowledge from the user.
One dimension of the trial was to explore this.

 Finally, trialists would provide an essential index on the commercial value of the
system and its contents. How much would they be willing to pay, for which
content and services? This would help revise business plans and the economic
dimensions of the trial and subsequent commercial roll-out.

Over the latter half of the last century the technology of television drove the
development of the massive and complementary broadcasting industry. It also
accelerated the growth of the advertising industry. As Smythe (1981) argues, with the
rise of the television institutions and networks, their business became the delivery of
audiences to advertisers. In the 1970s, in the wake of television's success, came the
334
There is already good evidence of this commercially on the web where expensive, highly designed
web sites draw no traffic.

first augmenting technologies – the so-called TV-centric technologies of video games
machines and video recorders, which built the networks of sales (and hire) for their
software of pre-recorded video and CD-ROM. At the start of the 1980s relatively
rare and quite radical innovations, they are now commonplace and 'at home with'
televisions. Here they have played very significant roles in driving the rise in
multiple television set use in homes, as well as in forming new kinds of domestic
practice, ritual and social habits.

Conclusion

Without doubt, mass media - and in particular television - has primacy in shaping
and representing human affairs (i.e. Stevenson, 1995; and the quotes offered earlier
by Silverstone, Baudrillard and Heath), and such primacy hints at the potential for
the trial to be a very significant social and cultural event. If successful, the technical
and service prospects it represented would have had the potency to elicit change over
a wide spectrum of industrial and personal practices. TV-centric technologies
implicitly capitalise upon, and reinforce, television's unprecedented success as a consumer
product. Consider a contemporary advertisement for Netgem, one of a
number of platforms that allow access to the web through the television set:
"1.16 billion households with Internet access without a computer. Your
online markets are opening up . . . Netgem brings the web to its new frontier,
liberating it from the computer. With the Netbox, you have access to every
single TV household, worldwide. Bring your customers online now and
become their favourite portal. Will you let the second e-commerce revolution
pass you by? . . . TV sets with potential Internet access (1.16 billion
worldwide) . . . Computer with Internet access (150 million worldwide)"335

Now Internet set top boxes and other TV-centric network technologies seek to
become next generation consumer electronics that augment television's function and
share its privileged space in the living room. The implicit object is to convert
'viewers' into 'consumer-users'.

Discussions of mass consumer markets and the delight of users mix liberally with
other forms of technologically determinist rhetoric to spur not only investment and

335
Financial Times, Friday 7th April 2000, p.7. http://www.netgem.com

development, but also to create realities and myths regarding 'where things are at'
and 'where they are going'. As Sharrock and Anderson (1994) see it:
" . . .being able to couch one's proposals in terms of user considerations is a
powerful way of ensuring their acceptability." (p.16)

Elsewhere (in Nicoll, 1999), and with respect to the Cambridge Trial, I have
suggested that amongst other things 'users' may operate within design processes as
'rhetorical devices' by which design champions and managers may win support and
favour for projects. The promise of satisfied consumers, and many of them (i.e. mass
audiences), suggests certainty and reduced risk, and focuses concentration upon the more
immediate requirements of expertise, resources and technical problems for
development.336

But part of this risk management problem is undermined by the emergent properties
of new technologies. Frank Webster (1995) has pointed out that the ability to
properly forecast IT trends is often obfuscated by an overemphasis on the
transformative abilities of the technology, or the way in which it breaks with what
came before. The radical alternative, the novel innovation, will always make for the
more interesting press release, and so symbolically at least may be seen to dominate.
The 1990s began with the promise that we would all be existing within 'virtual
realities' within a few years.337 Such claims, bound to the tendency for editorial
sensation, tend to simplify discussion of impacts, denying the 'messiness'
or 'greyness' of either the technical or the sociocultural complexity which
typifies much of the development and deployment of technical products and systems,
especially large-scale ones and those which rely on very sophisticated and new
components.

Indeed, many of those involved with the case outlined in the present study felt that
they were at the vanguard of a brave new movement, one which was truly going

336
Rowe, for instance, writes of the "interior situational logic [of] the decision-making
processes" (P. Rowe, Design Thinking, Cambridge: MIT Press, 1987, p.2).
337
Indeed, through the advent of magazines such as Mondo 2000 and Wired, technology in
itself was becoming something of a central cultural phenomenon. There was an almost utopian feel to
the tone of these publications: "The rush is on! Colonizing Cyberspace." (Front cover of Mondo 2000,
Issue two, summer 1990)

to revolutionise the world. This belief and conviction energised and motivated the
development, giving it a 'buzz'. It coloured expectations, both within and outside
the firm, regarding what they were developing. No one was shy about
broadcasting claims of how it would change the way things are done across many
dimensions. From an industry perspective, 'making television interactive' would
entail nothing less than a radical reorganisation and overhaul of the entire
broadcasting, logistical, commercial and advertising sectors. From an individual
perspective, it would evoke wide-ranging changes to people's relations with
institutions and things 'outside the home,' 'outside' the immediacy of their personal
lives. It would offer benefits ranging from timesaving - more convenient
management of one's everyday affairs - by providing services such as home banking
and shopping - to offering entirely new genres of entertainment such as online
gaming.

The development of a robust communications network and technology would attract
a wide range of companies that would use the system as a delivery mechanism for
the sale and advertising of their products and services. They would come through
their foresight in learning of the new way to do business through the new channel, or
they would come from fear that their competitors would beat them to it.

Such propositions seem commonplace today, embedded as they are within the well-
publicised 'power of the Internet' to revolutionise the way we access and do things,
but in 1994 there was considerable debate on whether this change would be brought
about by either i-Tv or the Internet. By the mid-1990s there were questions regarding
whether the new media model would indeed be 'broadband' interactive television or
the Internet - which was only beginning to dawn as a focus of serious commercial
interest. Both platforms had technical and service advantages and disadvantages.

The creation and offer of these new services, and the belief that they would
subsequently be taken up by consumers, helped drive the notion of i-Tv as 'a lifestyle
technology'. To help the technology and service partners more fully understand and
realise this potential, a third, distinctive social group would have to be enrolled within
the trial. These were the trialists – those who would act as surrogate consumer-users
of the system. They were expected to feed back explicitly on their experiences of
the i-Tv service and system via a series of questionnaires, specially arranged
meetings and interviews. They would also feed back implicitly through the system's
logging of their interactions.

The need for user-feedback or to otherwise ‘learn’ from users characterises a general
industry trend that became popular across industry sectors in the 1980s-1990s. For
the large part it is an attempt to realise the extent of, and remedy, presumptions that
can jeopardise the success of products and services before and after they diffuse into
the marketplace. The key problem for futurists or planners (or even researchers) is
that old challenges and fashions become new challenges and fashions in
combinations that yesterday's futurists could never have imagined. Indeed, Martin
(1991) argues that it was not the functional and technical attributes of the telephone
alone, but much wider social, economic, institutional and socio-political influences,
that shaped its innovation and diffusion. From the vantage point
or horizon of old mindsets and frames, continuity is broken by unexpected,
unimagined or unanticipated influences coming from what was previously
considered some unrelated source. When enough of these instances happen, changes
occur, are suggested or even demanded. In the new world of the 'knowledge
economy' a small start-up company can undermine a huge incumbent (Kelly, 1997).
Some icons of British industry and retailing, such as Rover and British Home Stores,
suffered badly in the late 1990s, certainly compared with 'dot.com' start-ups which
received massively inflated stock market valuations. Many of these firms did not
manufacture any tangible goods, nor did many hold stock or inventory, nor in many
cases did they handle customer care and fulfilment. Yet key players, mostly
investors and venture capitalists, believed them to hold the power to undermine huge
incumbents.

Two main strands of development occurred in the thinking that focussed this study,
and they have been introduced in this chapter. The first was reconceptualising the
notion of 'usability' in relation to 'television'. This would account for television in
its wider phenomenological context, as well as its place as a usable technology within
people's lives. The second, while linked to the first, concerns much more the
organisational dimension involved in creating usable ‘user knowledge’ or
‘knowledge of the user’ which may, or may not, inform the design of everyday
products such as television. This second strand came to impinge reflexively upon the
original direction of the research project itself, calling for a refocus of the study away
from [end-] users and towards the producers. Both strands draw attention to the
nature of context in studies of products and services. They draw attention also to the
complex socio-cognitive dimensions that constitute the notions of 'the familiar', 'the
everyday', and 'the home'. Television and other media technologies clearly constitute
a special case for usability. Cooper and Press (1995) for instance illustrate this
graphically when they suggest metaphorically that: " . . . the Booker prize is not
awarded to Jeffery Archer or Jackie Collins although their work is "usable" to more
people than that of Salman Rushdie." (p.18) Usability, as either research objective, or
as a distinguishing quality, seems to straddle the worlds of design, production, and
use. In some cases the uses and users of the product turn out quite different from
those the producers and designers imagined (i.e. Grint & Woolgar, 1997). In that way
the design does not necessarily determine the actual use of the product. However, once
the product is materialised, it defines some limits to its use and users. Ruth Schwartz
Cowan (1983) says that we can use tools in many ways, but not in an infinite number of
ways. The possibilities for the user to reshape the artefact itself are of course limited,
or at least constrained, by the design work on the artefact. However, this certainly does
not eliminate possibilities for change. Non-intended consequences and use
outcomes are often picked up by the designers and incorporated in the development of
the next product release. Another possibility for reshaping the artefact is that savvy
individual users tweak and modify it to use it for their own (different) intended
task. User innovation is beginning to be understood as a real source of
innovation (Von Hippel, 1996).

There have been a variety of quite distinct technologies whose aim is to augment the
basic functionality of the television receiver and the roof-top antenna. Some of these
have been very successful. Within this category lie video games consoles, video
recording machines (VCRs), and teletext and videotext systems, as well as satellite,
cable and digital decoders. Each of these, and others such as the videophone,
comprised totally new combinations of features at the time of their introduction
(Ortt and Schoormans, 1993). Even the humble television 'phone-in' - where people
are requested to register their opinion, take part in a discussion, or provide the
answer to a question - represents a concerted effort on behalf of broadcasters to bridge
the physical, cultural and symbolic gap between them and their audiences, for the
purpose of interaction. However humble from a purely technological perspective,
these types of programmes, and the wide public exposure they receive via broadcast
television, lay the foundations for an awareness of interaction possibilities among the
television audience. These technologies and practices have played a distinct role in
shaping attitudes and perceptions regarding what television 'is', 'is not', or what 'it can'
or 'cannot be'. For instance, an existing core function of VCRs is to shuttle back and
forth through pre-recorded material. This function required nothing short of a kind of
re-invention on the Cambridge Trial system, where fast-forwarding and rewinding
digitised MPEG338-encoded films needed considerable development effort. More than a
superficial relation, the very notion of 'video-on-demand' is directly inspired by the time-
shifting properties of traditional video machines. In other words, these technologies
have shaped perceptions regarding the value and purpose of what the box in our
living rooms can do.339 Moreover, TV-centric technologies and practices have altered
perceptions of how television may be used. Not only do they permit new capabilities
for broadcasters to interact with their audiences, they provide new means through

338
Moving Picture Experts Group: a committee composed of technical professionals from different
industries dedicated to forming open standards for the transmission of digitised video and audio over
computer and television networks. More usually, the term refers to the internationally agreed standards
for video compression which allow full-motion video to be played on digital equipment. MPEG-1,
approved by the International Standards Organisation, targets transfer rates of around 1.5 Mbit/s,
compressed typically at a 40:1 ratio; it is often used for lower-quality applications such as video
transmitted across the Internet or played from a multimedia CD-ROM. MPEG-2 (what was used on the
Om Cambridge Trial) is generally compressed at around 30:1, giving a quality similar to VHS, but takes
more bandwidth (6-8 Mbit/s), and is used for European digital broadcasting and DVD (qv). MPEG-3 was
intended to address the needs of High-Definition Television (HDTV). MPEG-4 is used for video conferencing.
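The bit rates above translate directly into storage and delivery requirements. A rough calculation (Python; the two-hour film length is an arbitrary example, and the link rates repeat those given in the earlier footnote on bandwidth):

# Rough arithmetic on the MPEG-2 figures above: what a two-hour film needs in
# storage, and whether typical links of the period could deliver it in real time.
# The film length is an arbitrary example.

MPEG2_RATE_BITS_PER_SEC = 6_000_000      # lower end of the 6-8 Mbit/s range
FILM_SECONDS = 2 * 60 * 60               # a two-hour feature

size_bytes = MPEG2_RATE_BITS_PER_SEC * FILM_SECONDS / 8
print(f"storage for one film: {size_bytes / 1e9:.1f} GB")

for name, rate in [("56 kbit/s modem", 56_000),
                   ("2 Mbit/s ADSL", 2_000_000),
                   ("10 Mbit/s cable link", 10_000_000)]:
    verdict = "can" if rate >= MPEG2_RATE_BITS_PER_SEC else "cannot"
    print(f"{name}: {verdict} sustain real-time MPEG-2 playback")

Several gigabytes per title, multiplied across a film library, is why the head-end storage hierarchy and high-speed distribution links described earlier were central to the system design.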
339
The inclusion of networked games on the Cambridge system was in direct response to the
contemporary popularity of id software’s Doom. This game, originally released as shareware, was a
landmark in networked computer gaming, allowing players to enter into the same virtual space in
order to fight each other or collaborate against a plethora of ‘enemies.’ The intention was to adapt an
earlier id game - Wolfenstein 3-D - to work over the system.

which retailers (and wholesalers, advertisers and market researchers etc.) may
interact with their consumers. However, the use of the telephone is most
distinguished from the telegraph in terms of its usability:
"The telephone continues the verbal tradition because it operates with the
human voice and requires no special codes, training or skills as did the
telegraph. Phoning is as easy and natural as talking . . . The psychological
importance of human speech is not always appreciated . . . every time we
hold a conversation we relate to another person, not a thing; their replies to
our remarks reinforce the sense of our existence . . . the telephone exists as an
extension of nature in this way." (Cherry, 1977: p.123)

There was no need for users of the telephone to learn a 'meta' language such as
semaphore or Morse code in order to communicate over distances. The telephone
made communication more natural and immediate, and therefore more usable.
Such accessibility of use enhanced existing ways of exchanging ideas, social
networking, co-ordinating, and doing business between people (de Sola Pool, 1977). The
development of these antecedent technologies and practices – bound to their
acceptance [or non-acceptance] in the marketplace - suggests something of what may,
or may not, be successful in a mass domestic network. They may also provide hints
at the initial functional aspects that an i-Tv service can deliver to the consumer. For
the large part this is what inspired firms involved with i-Tv trials with respect to
content provision.

Chapter 3 – The social and the technical

"The sky above the port was the color of television turned to a dead channel
William Gibson's, Neuromancer

Culture is what is left over after you have forgotten all you have definitely set out to
learn…One always feels that a merely educated man holds his philosophical views
as if they were so many pennies in his pocket. They are separate from his life.
Whereas with a cultured man there is no gap or lacuna between his opinions and his
life. Both are dominated by the same organic, inevitable fatality. They are what he
is. John Cowper Powys in The Meaning of Culture (1929)

People who fly have a different view of the world than those who spend their lives
on the ground. A very wise man once wrote a poem while he was flying, and he
called this poem “The God’s Eye View,” and he said that this view was entirely
different than the view he always had on the ground, which he called “The Bug’s
Eye View.”

Out there, somewhere, in the air we fly through, exists an old Persian legend much
like this poem about a bug who spent his entire life in the world’s most beautifully
designed Persian rug. All the bug ever saw in his lifetime were his problems. They
stood up all around him. He couldn’t see over the top of them, and he had to fight
his way through these tufts of wool in the rug to find the crumbs that people had
spilled on the rug. And the tragedy of the story of the bug in the rug was this: that he
lived and he died in the world’s most beautifully designed rug, but he never once
knew that he spent his life inside something which had a pattern. Even if he, this
bug, had even once gotten above the rug so that he could have seen all of it, he
would have discovered something – that the very things he called his problems were
a part of the pattern.

Have you ever felt like that bug in the rug? That you are so surrounded by your
problems that you can’t see any pattern to the world in which you live? Have you
heard anybody say lately that the world is a total mess? That, my friends, is the
Bug’s Eye View, and seeing only a little of the world, we might be inclined to think
that this is true. Sampled monologue on the tracks Harmonic, Cold Krush Cuts, by
Cold Crush, Ninja Tune, 1997

Introduction

In 1879, Sir William Preece, the then chief engineer of the Post Office, was guarded
regarding the potentials of telephony to change existing practices:
"I fancy descriptions we get of its use in America are a little exaggerated,
though there are conditions in America which necessitate the use of such
instruments more than here. Here we have a superabundance of messengers,
errand boys and things of that kind . . . the absence of servants have
compelled Americans to adopt communication systems for domestic
purposes. Few have worked at the telephone much more than I have. I have a
telephone in my office, but more for show. If I want to send a message - I use
a sounder or employ a boy to take it." (Preece, quoted in Dilts, 1941: p.11)

Compare this with a vision of Alexander Graham Bell from roughly the same period:
"At the present time we have a perfect network of gas pipes and water pipes
throughout our large cities. We have main pipes laid under the streets
communicating by side pipes with various dwellings, enabling members to
draw their supplies of gas and water from a common source. In a similar
manner, it is conceivable that cables of telephone wires could be laid
underground, or suspended over head, communicating by branch wires with
private dwellings, country houses, shops, manufactories, etc, etc. uniting
them through the main cable with central office where the wires could be
connected as desired, establishing direct communication between any two
places in the city. Such a plan as this, though impractical at the present
moment, I firmly believe, would be the outcome of the introduction of the
telephone to the public. Not only so, but I believe, in the future, wires will
unite the head offices of the Telephone company in different cities, and a man
in one part of the country may communicate by word of mouth with another
in a different place." (Quoted in Winston, 1986: p.338)

Iuso (1975) in his critique of product concept testing indicates that visions,
projections, perceptions, attitudes, beliefs – mental phenomena – have never by
themselves guaranteed the success or failure of products. But what we think, and how we think, about technologies will make a difference to whether they are immediately purchased, adopted, kept in view, put on the shelf, placed in the filing cabinet, thrown away or put on a 'must have' list. It also depends on who we are politically – whether in a position of authority, or a recipient or mere consumer of what is made available.
As the saying goes, "there are many ways to skin a cat", and this applies to many human endeavours including the provision of mediated information. Both Preece and Bell are referring to the same technology, but many things distinguish Bell's vision from Preece's evaluation and value judgment. Beyond the class and status distinctions inherent in Preece's view, he performs a kind of substitution and calculus of function – i.e. one's ability and status to call upon a 'superabundance' of human servants negates any need for technology.340 Preece seems to ignore the message that was all around him at the time – mechanised production had supplanted human labour, so why should the same not be true of communications? He was already in charge of a network organisation, the postal service. Preece provides a classic example of the need for boundary critique (BC), the concept in critical systems thinking which, according to Ulrich (2002), states that "both the meaning and the validity of
340
Picking up from the discussion in the previous chapter, here is an explicit example of the ‘use’ or
‘utility’ of human beings placed against the ‘use’ or ‘utility’ of technology.

professional propositions always depend on boundary judgments as to what 'facts'
(observation) and 'norms' (valuation standards) are to be considered relevant" or not.
341

He also misses what is perhaps the most striking aspect of Bell's idea, which brings forth the most prevalent feature of the telephone as a real innovation – the ability to communicate in real time – one to one, by word of mouth – over considerable distance.342 Thus there is a distinctive focus upon a new kind of network – connecting different spaces such as home, work, factories, etc. – with connection as the real power feature of the telephone.

Imagine a city ten or twenty years in the future, with parks and flowers and
lakes, where the air is crystal clear and most cars are kept in large parking
lots on the outskirts. The high-rise buildings are not too close, so they all
have good views, and everyone living in the city can walk through the
gardens or rain-free pedestrian malls to shops, restaurants, or pubs. The city
has cabling under the streets and new forms of radio that provide all manner
of communication facilities. The television sets, which can pick up many
more channels than today's television, can also be used in conjunction with
small keyboards to provide a multitude of communication services. The more
affluent citizens have 7-foot television screens, or even larger. There is less
need for physical travel than in an earlier era. Banking can be done from
home, and so can as much shopping as is desired. There is good delivery
service. Working at home is encouraged and is made easy for some by the
videophones that transmit pictures and documents as well as speech.
Meetings and symposia can be held with the participants in distant locations.
Some homes have machines that receive transmitted documents. With these
machines one can obtain business paperwork, news items selected to match
one's interests, financial or stock market reports, mail, bank statements,
airline schedules, and so on. Many of these items, however, are best viewed
on the home screens rather than in printed form. There is almost no street
robbery, because most persons carry little cash. Restaurants and stores all
accept bank cards, which are read by machines and can be used only by their
owners. When these cards are used to make payments, funds may be
automatically transferred between the requisite bank accounts by
telecommunications. Citizens can wear radio devices for automatically
calling police or ambulances if they wish. Homes have burglar and fire
alarms connected to the police and fire stations. Industry is to a major extent

341
Ulrich, W. (2002). "Boundary Critique". in: The Informed Student Guide to Management Science,
ed. by H.G. Daellenbach and Robert L. Flood, London: Thomson Learning, p. 41f.
342
This compared with the use of either humans, or the telegraph. In the first case messengers could
only convey verbal or written messages. In the second case the intermediate symbolic technology of
Morse code had to be known to both the receiver and the transmitter in order to convey the message.

run by machines. Automated production lines and industrial robots carry out
much of the physical work, and data processing systems carry out much of
the administrative work. . . . Above all there is superlative education. History
can be learned with programs as gripping and informative as Alistair Cooke's
America. University courses modeled on England's Open University use
television and remote computers; degrees can be obtained via television.
Computer-assisted instruction, which was usually crude and unappealing in
its early days, has now become highly effective.343

Today, many forms of labour seem quite distant from providing for the lower
biological echelons of Abraham Maslow's (1954) hierarchy of needs. But it is a fact
that in the design of many products, if not all products, a wide complex of
perceptions and anticipations, as well as needs and requirements, inform shape,
purpose and function. These visions can jostle, align and compete in projects which define and create boundaries circumscribing activities; this process is ongoing, with constantly shifting interests and problems emerging as time proceeds. Alignments of interests and perceptions direct changes both in supporting organisational structures and in the overall shape and features of a given technology. In a similar vein, problems in technology or service development can impact upon supporting organisational structures and the need for expertise.

The rhetoric that sells and advocates a product within the firm to senior management; the 'pitch' that sells and advocates it to outside agencies such as venture capitalists and other funding agencies; the 'look' of a product, or of the person in the product advertisement, and the way the advertisement creates and fosters impressions of use and value to the market - each of these is an example of communication above and beyond any simple, direct development or meeting of basic human needs (Winograd). More than this, rhetoric also aligns or dis-aligns common interests between parties (Molina).

Although reality always exists in a present, the telos of this reality is to be found in the future; the future is a factor, perhaps the main factor, in directing our conduct. It is the nature of intelligent conduct to be future-directed (Mead, 1936). By our very nature we are strategic; fundamentally, human action is always action directed toward the future. The past does not determine (although it does condition) human

343
Martin, J. (1978) The Wired Society, pp. 8-9.

conduct; it is human conduct which determines the past. Visions, associations and
perceptions - mental phenomena - motivate or mitigate, help or hinder innovation.
Unlike technology, which can be considered ‘crystallised’ (Woolgar), it seems that
human thoughts, beliefs and perceptions are more fleeting and mutable, more
malleable and open to constant change. But this is only partially true. Structures of
thought and behaviour are conditioned culturally, and most definitely, socially (as in
the construction of law). Preece's appraisal of the usefulness of the telephone, bound
to his position, his status, his influence, had considerable societal and innovation
impact. It led to a hiatus in the technology being implemented in the UK.

Bell's visions of the telephone came to be realised: over the last century, along with plumbing and electric mains, telephone networks have increasingly pervaded, linked and informated businesses and homes. Now the talk is of the 'local loop' and of Local Area Networks (LANs) for the home: each room wired with fibre optic cable or served by a wireless LAN (Wi-Fi), enabling entire suites of intelligent and networked devices to work in concert, and to link their functioning in relevant ways to the outside world (such as telematic control over domestic functioning, or fault self-reporting of electronic products).

Weick (1990) points out that most technologies are usually in fact technological
systems; combinations or suites of technologies used together. They are not things in
themselves so much as groups of interconnected things. Technologies can be said to
contain hardware parts - the machines and tools - and software parts - the
knowledge that built them, and that expertise which is required to use them (Rogers,
1995). The knowledge involved in technology may be formal – that is, laid down in heuristics – or tacit – in the sense that action is governed more by intuitions developed through constant exposure to particular problems (Dosi, 1988).

As indicated, knowledge, visions, beliefs etc. are not only mental phenomena but
also essentially social phenomena, socially constituted, the result being a complex of
social and technical aspects, operating at many levels, culminating and interrelating
in the creation and operation of useful technical systems. Knowledge and to some

extent, individual powers of interpretation, planning and design are social
phenomena, socially shaped and socially constructed. Following the discussion in the
first chapter on the subtle subliminal subconscious aspects of television viewing,
Polanyi (1962, 1966) argues that tacit knowledge, say the knowledge of how to ride a
bicycle without losing balance or tying one’s shoe laces, belongs to the personal
domain, but is still embodied in the meeting, the interaction, between the individual
and the culture he belongs to. There is here a contrast with the Russian psychologist
Vygotsky (1978, 1986) who strongly points out that all knowledge is social in some
way or the other, and thus contingent on social structures pre-existing in social
systems. Thus to Vygotsky knowledge exists in the collective structure existing in
social systems. Herbert Simon (1987) argued in favour of the view that tacit
knowledge can be made explicit by 'unfreezing social habits'. Simon focuses on
organisations, while Vygotsky focuses on social structures, and Polanyi has his
attention directed towards the meeting, the juncture between individual and culture.
Regardless of which view is adopted the essence of interaction between the
individual and other human beings and designed objects remains uncontested. With
respect to television we have already suggested the reflexive relationship between the double articulation of television-as-technology and television-as-medium, and between television-as-indicative-of-culture and television-as-disseminator-of-cultural-knowledge.

My object in this chapter is to outline something of the recent thinking concerning how to conceive of the whole in the processes of technological and service development. 'Conceiving of the whole' has been the prerogative of general systems
theory (GST) and its related disciplines. With respect to technological development
this means accounting for cognitive, social, symbolic and technical elements that
combine or mesh to produce usable, manageable technical systems and technologies
– like television-the-technology and broadcasting-as-performance. It examines the merits and shortcomings of two particular approaches to this problem – actor-network theory (ANT) and sociotechnical constituencies – and considers the
applicability of these approaches to mapping the 'big picture' of the case study
presented later.

The present study came to accommodate a view that both traditional broadcast and
interactive forms of television demanded a modified or expanded view of usability as
it was being treated in a contemporary sense. This I will return to and address more
specifically in the succeeding chapter. However, a second very distinctive strand of
development within the present study was a shifting of focus from a pre-occupation
with users (i.e. in a wash of fervent claims about privileging their reporting on their
experiential aspects of a technical system), to one which considered how ‘user
research’ as a design and development knowledge generating practice was situated
within the wider social constituencies of the trial. More specifically, this concerned
the organisational and knowledge generating aspects of technology and marketing
trials in general. Little direct work has been done on this subject, and the literature
drawn upon to help with this aspect of the study is relatively new, although it most likely has roots in systems theory; these approaches form the main impetus of this chapter.

The Cambridge trial represented a formidable social and technical undertaking, especially when viewed from an innovation or organisational perspective. Indeed,
considerable effort was put into engineering the social aspects of the trial.344 While
the trial may have been driven by a range of opportunities, fuelled by a succession of
strong and often flamboyant beliefs and visions, the governance and organisation of
the trial evolved subject to the succession of unforeseen internal and external stresses
and confluences – commercial, technical, organisational etc. – which in the business
world continually beleaguer intentions, plans and anticipations.

Since the work of Lucy Suchman in the 1980s (i.e. Suchman, 1983, 1988, 1991,
1992), the design, innovation and diffusion of many technological systems have
come under increasing scrutiny as subjects which are sensitive to the situations

344
Indeed, upon completing the advanced chipset design at an early stage in the inception of the
project, the Chief Scientist at Acorn turned her attention to designing the layout and characteristics of
the Om facility. This brings to mind Donald MacKenzie’s remark in Inventing Accuracy “People had
to be engineered, too - persuaded to suspend their doubts, induced to provide resources, trained and
motivated to play their parts in a production process unprecedented in its demands. Successfully
inventing the technology turned out to be heterogeneous engineering, the engineering of the social as
well as the physical world.” (p. 28)

and conditions of use – their use contexts. "Situated action is an emergent property of
moment-by-moment interactions between actors and between actors and the
environments of their actions." (1988: p.179). She also stressed the contrast and
disparity between rationalist plans and situated action, as well as the role of
language in constituting human interpretation of situations (which she shares with
Berger and Luckmann, 1966; Winograd and Flores, 1988, and others who take a more interpretivist and constructivist slant on the inquiry into social realities).

To effect proper design, many systems simply cannot ignore in-depth understanding, on the part of developers, of local knowledge and experience, and of social contingency, which demands a closer collaborative process between those who produce products and those who use and consume them.345 A service or product's value, and the proper establishment of its consumption and use, can only be properly realised, indeed comprehended, by its deployment into the environments and conditions of its operation, consumption and appropriation. This includes the home as well as the workspace or factory floor [Fleck, and Von Hippel] and of course, with the rise of mobile communications, 'anyplace'. In many senses this is an approach which came to reflexively shape the study itself, as it was shaped by the action of conducting the research (in a 'grounded theory' style of approach, i.e. Glaser and Strauss, 1967; and see chapters 4 and 7 below).

The trial's evolution influenced and shaped the present study, most obviously with
respect to access to the trialists. Who could speak to them, what could or would be
asked, as well as where and when they would be asked, were each originally and
exclusively the sole prerogative of Om. However, it eventually came under the
jurisdiction of a working group responsible for all marketing and user research issues
within the trial – part of a value-laden package which came with access to the
technical system. The working group comprised representatives from the
consortium of firms who were producing content and services. Each belonged to
quite distinctive, and in many ways, complementary, industry sectors. Each also had

345
Von Hippel (1990) also points out that some problems are hard to separate from the context and
condition from which they arise, and Fleck (1994) suggests that if configurational technologies are to be successful they “demand substantial user input and effort and such inputs can provide the raw
material for significant innovation.” (pp.637-638)

very specific interests in both the processes and outcomes of the user research. They
also manifested interests that sometimes conflicted, often due to differences in
corporate cultures or ways in which the firms perceived customers, viewers, subjects
etc. This was an obvious disjuncture or misalignment which would prove very
difficult to resolve and negotiate.

As a vivid example, each had quite different perceptions of the 'user' or 'consumer' as
subject and this led to competing or conflicting interests in what should be
investigated, and how it should be investigated. Just as a computer system has a
‘user’, a bank has ‘customers’; a marketing company has ‘responders’, broadcasters
have ‘viewers’ and ‘listeners’, game ‘players’ and so on. Even within these
categories were nuanced differences, say between a bank’s ‘customers’ and a
supermarket’s ‘customers’. As a matter of approaching and managing information
garnered from trialists on the system there emerged different requirements regarding,
say, the confidentiality of material gathered, or the occasions when one could
legitimately contact them for the purposes of questioning. In a very tangible sense
these were not merely semantic differences, but highlighted very significant differences between firms and institutions in the way they traditionally approach people in society, and how the attempt to use a single channel for interaction with members of society [the trialists in this case] was flagging precisely how this medium was capable of placing pressure on 'business as usual' – that is, existing practices and ways of doing things. In essence, this pressure to review, and possibly reinvent and innovate upon, existing practice was precisely why those firms and institutions were involved in the trial (and others which were ongoing concurrently) in the first place. Ironically, and perhaps because of their own internal organisation, some were quick to change tack and adapt to change (say, the advertising firm), whereas others were much slower. Again, this inertia may not have been caused by the nature of the business itself or even its sector – some industry sectors can be considered more innovative or open to change than others, i.e. the automotive industry compared to biotechnology – nor by company culture, but in fact by external factors impacting the industry such as regulatory regimes or legal frameworks.

With respect to the present study, other factors, apart from social considerations, prompted continual iteration and revision of the originally proposed research, which was to investigate human factors with regard to the interface design. Chief amongst these was the status of the technological system in terms of its development as a fully functioning system, capable of meeting the espoused visions of engineers, managers and the press. The functionality of the system, and the content material, varied over time; in some cases the content material did not vary, but the interfaces were varied. This had a direct impact upon user (and partner) perceptions of the system. As the early-stage trialists were employees of the technical partners on the project, and the later, phase-two, trialists a broader mix of content-partner employees, there were also considerable problems with trialist recruitment. Most of these participants knew that the service was handicapped by functional and content problems from the start.

Taken together, all these events or processes constituted a major influence hindering or demanding change to research proposals. After all, what was the trial evaluating? What was I as a researcher evaluating – was I, for instance, engaged in a meta-evaluation of the trial itself? If users were only able to view half a film due to technical problems, what was the value in exploring the usability of the remote control interface? Problems with trialist recruitment were also cited as a result of 'rumours' regarding the inability of the system to convey meaningful programming. At the very outset of the study such determining hurdles came to demand agility and flexibility in approach. They also constantly drew attention to the fact that the trial must be considered as a whole.

Coping with these developments, as and when they arose, became a necessary
condition of the study. The diversity and dynamism of the project laid the
foundation for a shift in emphasis within the study to include observation of the
social and organisational process of the trial. This expanded view of the trial and its
processes included consideration of how user research was negotiated and managed,
how knowledge was not only created but also how it flowed and interacted with emerging
issues, within complex organisational environments.

General systems theory and holism

With origins in the 1940s, and widely associated with the biologist Ludwig von Bertalanffy, general system theory (GST) proposes that real systems are open to, and interact with, their environments (i.e. Von Bertalanffy, 1949). Von Bertalanffy first introduced the idea of a systems approach as part of his General System Theory, which he used to explore the relationships between organisms and their environment. He also wanted to identify the connection between business organizations and their environment, and to do this he explored the relationships between employees, customers, and company output. His theories can best be illustrated by this quote from his compilation of articles titled General System Theory: Foundations, Development, Applications: “We may state as characteristic of modern science that this scheme of isolable units acting in one-way causality has proved to be insufficient. Hence the appearance, in all fields of science, of notions like wholeness, holistic, organismic, gestalt, etc., which all signify that, in the last resort, we must think in terms of systems of elements in mutual interaction.” (Bertalanffy 1968, p. 45) What he means is that each element must be looked at individually; additionally, the result of the interactions must be explored, as well as the whole that is created as a result of those interactions. The systems approach was taken up and developed by Stafford Beer and Kenneth Boulding, both management scientists. Born in 1926, Beer was involved in British operational research and the study of complex social systems. He was also instrumental in the idea of combining cybernetics with management systems; his first book, Cybernetics and Management, was published in 1959. Boulding was an economist who ventured into many disciplines outside his original area of study. A spiritual man, his
beliefs led him to identify three types of social systems:

• exchange systems: activity is organized through the marketing function, and this system is driven by self-interest;
• threat systems: outcomes are based on the threat of loss, and this system is driven by fear; and
• integrative, or love systems: there is an integration of utility functions, which results in the situation of "what you want, I want", and this system is driven by love, where we express our passions and compassion for others.

These systems, he proposed, were essential for society to thrive. They can acquire
qualitatively new properties through emergence, resulting in continual evolution.
GST is described as a series of related definitions, assumptions, and postulates about
all levels of systems from atomic particles through atoms, molecules, crystals,
viruses, cells, organs, individuals, small groups, societies, planets, solar systems and
galaxies (Miller, 1978). The Systems Approach also looks at the interaction between
the social aspect of business and technological changes. In the 1940s, studies
conducted at the Tavistock Institute of Human Relations in London looked at how
technological changes in coalmining impacted the workers. With the development of
new methods to extract coal, workers’ roles and skills changed. As a result, the
workers experienced more stress, called in sick more often, and found that their
social work structure had been altered. These changes in this sub-system caused a
ripple effect that impacted the entire system. Once technology and the social needs of
employees formed a new equilibrium, productivity was restored. The Systems
Approach, therefore, relies on all components, or sub-systems, to work in harmony
and coordination in order to ensure the success of the larger system. It inherently
suggests the linking of human, social and non-human entities, micro and macro
linkages, local and global effects.
"Systems are bounded regions in space-time, involving energy interchange
between their parts, which are associated in functional relationships and with
their environments . . . All behaviour can be conceived of as energy exchange
within an open system or from one system to another . . . and that all living
systems tend . . . to maintain steady states of many variables . . . in an orderly
balance but within a certain range of stability." (Miller, 1956: p.32)

Such thinking proves useful in framing why some social phenomena, institutions and
technologies can appear to remain in a steady state for some time, while others
dissipate through entropy or become unwieldy, unusable, or redundant. It shows how
some needs, habits and practices persist and others do not, and can provide some idea
on how aspects and features of technologies are kept while others are not.

Rather than reducing an entity (e.g. the human body) to the properties of its parts or
elements (e.g. organs or cells), GST focuses on the arrangement of and relations
between the parts that connect them into a whole. Holism refers to a perception of the
relatedness of things in approaching reality or a problem. Ramstrom (1974)
encourages an increased emphasis on systems thinking to comprehend the increased
interdependencies between the system and its environment, and between the various
parts of the system.

If a conceptualisation may be made regarding a useful network of these interdependencies, one has in fact defined a system (Heylighen, 1992). Scott (1961:
p.23) argued that "the only meaningful way to study organisation is to study it as a
system" and had observed that the distinctive feature of modern organisation theory
was in its conceptualisation of an organisation as an open system. A similar view
could be made of technologies and technical systems. Quade (1975) sees that the
systems approach is a way of dealing with complexity in the analysis of the
organisation of things. One abstracts from reality and forms an image of the
interdependencies that appear to exist among diverse elements i.e. roads, gasoline
stations and cars, technology developers, content developers and users. This is
distinct from the fragmented or piecemeal approach; such as often marks the pursuit
of reductionist science – such as conducting an exhausting study into the interface
effectiveness of a product which is already deemed commercially redundant of the
network components, or a lack of interest by those who exploit the technology
commercially. Such abstraction, however, denotes that one must have a clear picture
of the potency of each component part, and must then propose a ‘snapshot’ of the
system at any given time. Such analysis often denies proper address of the dynamic

nature of living, evolving and emerging systems. What is important, however, is how such analyses can provide proper means of understanding relationships, which can often be dynamic and shifting, just as human minds are, even when focused within the boundaries of technical projects.

A number of fields are closely associated with the systems approach, and of these
perhaps most notable are cybernetics, complexity theory and chaos.

Cybernetics, Complexity theory and Chaos

Cybernetics – from the Greek 'kubernetes', meaning 'steersmanship' – is a term recently given a new lease of life through association with virtual reality (VR) and the Internet (e.g. 'cyberspace' as coined by Gibson, 1984). Its origins derive from Wiener (1948/1961), who defined it as "the science of communication and control in the animal and the machine." In a similar fashion to GST, this is a domain that now encompasses many of the traditional disciplines: mathematics, technology and biology, but also philosophy and sociology. In GST the study is of the structure of systems and models; however, comprehension of a system cannot be achieved without a constant study of the forces that impinge upon it, and this invariably means communication or control in one sense or another (Katz and Kahn, 1966). Cybernetics studies communication and control within a system and with other systems – i.e. how the systems function.346 While the ideas and principles of
cybernetics and systems science are applicable to various phenomena, they are
usually applied to the study of complex systems such as organisms, ecologies,
societies and machines. These are regarded as multidimensional networks of

346
Aspects that characterise cybernetic systems are:
1. They are very complex, with many interacting components.
2. These components interact in such ways that they create multiple simultaneous
interactions among the subsystems.
3. The simultaneous interactions lead to subsystems participating in multiple processes,
thus requiring multiple levels of analysis.
4. Cybernetic systems usually grow in an opportunistic manner, instead of being designed
in an optimal manner.
5. Cybernetic systems increase in size and complexity, developing new traits while still
historically bound to previous states.
6. Cybernetic systems are rich in positive and negative, internal and external feedback. The ultimate cybernetic system is one of self-reference, self-modelling, self-production and self-reproduction.

information systems. Cybernetics is a key term involved with the purposeful
interaction of people with systems.
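The 'control' half of Wiener's definition can be made concrete with a minimal sketch of a negative feedback loop – a thermostat that senses a variable, compares it with a goal and acts to reduce the difference. The scenario and all of its numbers are invented purely for illustration and are not drawn from Wiener or from the trial.

```python
# Minimal negative-feedback (cybernetic) control loop: a heater switched on or
# off according to the gap between a sensed temperature and a set-point.
# All constants are arbitrary, chosen only to show the loop settling.

set_point = 20.0      # desired temperature (degrees C)
temperature = 14.0    # sensed temperature

for minute in range(30):
    error = set_point - temperature      # feedback: compare goal with state
    heater_on = error > 0.5              # act so as to reduce the error
    if heater_on:
        temperature += 0.8               # heating raises the temperature
    temperature -= 0.2                   # heat is continually lost outside
    print(f"t={minute:2d}  temp={temperature:5.2f}  heater={'on' if heater_on else 'off'}")
```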

Complexity theory and the notion of self-adaptive systems are two further concepts closely linked to GST and cybernetics, as is chaos theory. Santosus (1998) presents
a vivid but simple account of complexity theory:
"What is complexity theory? One way to understand it is to look skyward to
the avian manoeuvrings of birds. A lone bird follows simple rules of
behavior, such as when and what to eat. However, a group of birds flying
together exhibit complex, unpredictable, creative behaviors that emerge
naturally from the interactions of individual birds. For example, a flock in v-
formation is able to fly farther and faster than an individual bird. The flock
that is formed when autonomous agents - birds - interact is known as a
complex adaptive system. To fly in a flock, a bird need follow only three
simple rules: Don't bump into anything, keep up and stay in close proximity.
Yet following these rules leads to a cohesive, seemingly complicated group
of birds flying with the speed and precision of the Blue Angels." (p.6)
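Santosus's three rules are close to Craig Reynolds' well-known 'boids' model, and the point that collective pattern needs no central coordinator can be made with a toy sketch. The version below is a deliberately crude caricature rather than a faithful boids implementation, and all its parameters are arbitrary.

```python
# Toy flocking sketch: each 'bird' follows only local rules (separation,
# alignment, cohesion); any flock-level pattern emerges without a leader.
import random

class Bird:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(birds):
    for b in birds:
        others = [o for o in birds if o is not b]
        # Rule 1: don't bump into anything - steer away from very close birds.
        for o in others:
            if abs(o.x - b.x) < 2 and abs(o.y - b.y) < 2:
                b.vx += (b.x - o.x) * 0.05
                b.vy += (b.y - o.y) * 0.05
        # Rule 2: keep up - drift towards the average velocity of the others.
        b.vx += (sum(o.vx for o in others) / len(others) - b.vx) * 0.05
        b.vy += (sum(o.vy for o in others) / len(others) - b.vy) * 0.05
        # Rule 3: stay in close proximity - drift towards the flock's centre.
        b.vx += (sum(o.x for o in others) / len(others) - b.x) * 0.01
        b.vy += (sum(o.y for o in others) / len(others) - b.y) * 0.01
    for b in birds:
        b.x += b.vx
        b.y += b.vy

flock = [Bird() for _ in range(20)]
for _ in range(100):
    step(flock)
xs = [b.x for b in flock]
print("horizontal spread of the flock after 100 steps:", round(max(xs) - min(xs), 1))
```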

Complexity has been described as "at the edge of chaos" (McMaster, 1996: p.13). In this state, patterns can be seen and even understood, but the rich interplay of individual elements cannot be reduced – as it would be in GST or cybernetics – to easily identifiable units and relations. Such systems are more like the cluster of birds in flight: patterns can be discerned, with one bird leading, but this is emergent rather than agreed by committee, or based purely upon physical prowess or other determining factors. Complexity theory looks at these systems in ways that are organic, non-linear and holistic. In an article that appeared in the New Scientist in 1987, Paul
Davies speaks of non-linear complex systems:
"The behaviour of nonlinear systems is enormously rich and diverse. When
driven away from equilibrium, they are liable to leap abruptly and
spontaneously into new, more complex or highly organised states.
Alternatively, they may become chaotic. Often there are certain "singular
points" where predictability breaks down, the system becoming enormously
sensitised to minute fluctuations. It is as if the system had a "free will" to
choose between different paths of evolution, to explore new possibilities."
(Davies, 1987: p.43)

According to Mitchel Resnick complexity is arising across many phenomena and mental processes partly as a result of a process of decentralization:

"Ideas about decentralization and self-organisation are spreading through the
culture like a virus, infecting almost all domains of life. Increasingly, people
are choosing decentralized models for the organisations and technologies they
construct in the world – and for the theories that they construct about the
world." (Resnick, 1994: p.4)

Living systems are non-linear dynamical systems which, although comprising physical material to which all the laws of classical and quantum physics apply, show
emergent characteristics like self-organisation. Examples include complex metabolic
self-regulation in cellular and organismic structures, and various manifestations of
consciousness and cultural emergence. Emergences such as consciousness show
well-defined laws of dynamic behaviour when looked at from certain perspectives
such as cybernetics and psychology and psychotherapy, but can often appear chaotic
to the untrained eye.

From a user perspective, a digital information system such as the trial, or the provision of water or electricity from the mains, appears as a self-organising system. The self-organising character of such systems is of course due not only to system functioning – i.e. all the components and their connections working in concert, doing what they were intended to do – but also to the maintenance of the system and its components by human actors. Only proper administration of these elements creates the impression, or experience, of transparency at the user end. Only when service is denied, or breakdown is experienced, does the user become involved, at first cognitively and only later behaviourally, in ascertaining what is happening: is it a local problem (i.e. the fuse box) or a general problem (i.e. a power cut) when the lights go out? The same goes for devices: has the fuse blown at the socket, or is the problem internal to the device, requiring specialist expertise?
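The 'is it the device or the system?' reasoning sketched above can be caricatured as a simple decision procedure. The observations and conclusions below are entirely hypothetical illustrations of that outward-working logic, not a description of any fault handling in the trial.

```python
# Caricature of a user's fault-finding when a service fails: work outwards
# from the device, to the local circuit, to the wider supply. Illustrative only.

def diagnose(device_works_elsewhere, other_devices_work, neighbours_have_power):
    if not device_works_elsewhere:
        return "fault internal to the device - specialist expertise needed"
    if other_devices_work:
        return "local socket or fuse at fault"
    if neighbours_have_power:
        return "household supply problem (e.g. the fuse box)"
    return "general problem (e.g. a power cut) - wait for the provider"

print(diagnose(device_works_elsewhere=True,
               other_devices_work=False,
               neighbours_have_power=False))   # -> general problem
```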

The social and the technical

The historian of technology Thomas Hughes (1987) observes that one of the defining aspects of 'system technologies' is that they contain messy, complex, problem-solving components. A notion of chaos can also be applied to the rich social and physical environments of the home. The networks and infrastructures of electric

mains and the plumbing that sustain homes have long raised idiosyncratic problems for builders. Returning momentarily to the theme of domestication raised in chapter one, the networks which comprise daily life and ritual also illustrate clearly the interconnected ways in which the built environment and the technologies of the home connect with human activity and ritual.

In her book Geography of Home (1999) Akiko Busch presents a lucid account of the
modern habitat and its habitation by people. Analysing the domestic spaces that
compose the modern American home, she offers fascinating insight into the changing
conditions and circumstances of our habitats. She outlines and draws attention to her
realisation of the multi-dimensional contexts and non-linear processes that comprise
everyday life and its environments. Many of these dimensions, normally invisible to,
or sublimated within our consciousness, nevertheless have real impact in smoothing
the 'chaos of experience'. Establishing the reasons for an uncharacteristic electricity
bill sent her on an unprecedented exploration of the electrical functioning of her home, which in turn led to her 'seeing' her home as "a network of social and cultural
currents, those habits, beliefs, and values that also make it function." (Busch, 1999:
p.163) She points to the fact that the most banal, tacit, and familiar activities and
phenomena can be highlighted as problems of a complex nature if reframed and
brought to conscious awareness in a particular way.

Technical artefacts have many functions that can only find proper explanation in
terms of their wider social and symbolic contexts (perceptual, economic, personal,
legal, symbolic, aesthetic, etc.) This is often because they are explicitly designed by
engineers to have or manifest these functions and contexts; there is always an
argument that this erodes the distinction between technical artefacts and social
'artefacts'.347 Perhaps this approach is useful so as to leverage accounts which do not
347
The notion of a function in the wider sense intended here can partly be analysed in terms of the
general notion of social function, though admittedly much conceptual work is to be done here. The
notion of a social function has been used to put forward explanations for many social phenomena, as
part of a research program called functionalism. Functionalists, such as Malinowski (1944) and
Merton (1957), held that at least a large class of social phenomena can be explained in terms of their
beneficial consequences for society. Since not all of these consequences are intended, Merton has
introduced the central idea of ‘latent function’. While functionalism has been severely criticised (e.g. by Jon Elster, 1994), it has recently regained some of its former popularity. Kincaid (1994) defends
functionalism against the penetrating criticism levelled at it by Elster.

privilege the social over the technical or vice-versa, or consequentially and similarly
privilege cognitive accounts over social accounts.

Social systems

Of course we do not only interact with 'things' but also with each other, and this can be direct – for instance, in person, 'face-to-face' – or indirect, via or mediated by technology such as the phone, semaphore, Morse code, newspaper, radio, TV, letter and e-mail. For us to understand each other and communicate in Morse we need to know and share the ability to interpret and encode the messages, just as communicating in a foreign language requires knowledge of that language. One cannot write or open an email if one does not know how to use a computer and the email software.
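The point about a shared 'meta' language can be made concrete: unless both sender and receiver hold the same code table, a Morse signal carries no meaning. The fragment below uses a small subset of the standard International Morse alphabet purely as an illustration.

```python
# Morse communication presupposes a code table shared by sender and receiver;
# without the table, the dots and dashes are just noise. Subset for illustration.
MORSE = {'S': '...', 'O': '---', 'E': '.', 'T': '-', 'A': '.-', 'N': '-.'}
DECODE = {code: letter for letter, code in MORSE.items()}

def encode(text):
    return ' '.join(MORSE[ch] for ch in text.upper())

def decode(signal):
    return ''.join(DECODE[sym] for sym in signal.split())

message = encode("SOS")             # '... --- ...'
print(message, "->", decode(message))
```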

Interacting indirectly through technology, via the design, operation and public display of features and functions, means that an intermediary – in this case a designer, or a set of designers and engineers – has fashioned a system of functionality which enables and constrains communication with another. The ability of a mobile phone to function, for instance, lies in its ability to convey messages from one person to another. This can be hindered by several well-known problems beyond the ability of the users to communicate using a language they both share and understand. Running out of battery power, being in geographical locations where there is poor signal, running out of pre-paid airtime, subscribing to a service which privileges some uses (i.e. the same network) over others (i.e. other mobile networks), calling from noisy environments, and speaking while doing other activities like driving all represent various conditions which input 'noise' into the system and can prevent clear communication. These are technical problems or conditions of use – where the technical interferes at the juncture with the social.

A well-hackneyed polemic in the human and social sciences concerns the primacy of cognitive over social influences, or vice versa, in the shaping of human affairs. Clearly, what one thinks and how one thinks is based upon a neurological
system which is genetically programmed (nature), and forms while one is in the
womb and subject to sounds and rhythms of the mother’s body and from beyond, the

world of the social (nurture). As individuals develop, their cognitive and motor abilities improve, and an increasing myriad of possible interactions becomes available to them. But as indicated these are constrained and made finite by
cultural considerations, individual differences, and physical abilities and so forth.

The focus of the fields of phenomenology, symbolic interactionism and ethnomethodology is upon action and its meaning (these shall be more fully
explained later). It has also been already suggested that 'interaction' in its
fundamental, essential sense, can be taken as acts of interpretation and reaction that
infuse nearly every moment of existence. More sophisticated forms of communication and knowledge acquisition take place, ranging across the levels depicted in Maslow’s Hierarchy, which moves through successive phases from a desire to satisfy ‘basic needs’ for food and water to self-actualisation – a psycho-emotional state where one feels as if one is where one ought to be, and doing what one ought to be doing. Lack of appropriate social stimulus, as in the extreme cases
of neglected and feral children, can leave them stranded without language and
without any means of communicating in any abstract manner. This has behavioural
implications in that such children are left at the level of searching for water, food and
shelter as was depicted in the famous case of the ‘wolf boy’ who had been reared by
wild animals and behaved more or less as one. What is notable is that if such children
do not learn language by age 12 then apparently it is unlikely they will ever do so. In
contrast, zoologists have been partially successful in teaching primates basic literacy.

At the macro level of society, civilisation and culture inherently denote both ways of seeing and ways of doing. A person's acceptance into a given civilisation and culture
often depends upon abilities to conform to social norms, and while in some senses,
this acts as a kind of tunnelling or focusing of vision towards what is culturally
deemed as important, it can also mark the development of skills that lead to a sense
of belonging. Here the precedence is upon social interactions with parents, siblings
and others as one grows and develops as an individual. These interactions shape the
person, their behaviours and attitudes, and moreover, their worldview.

"The activities of a domain are enframed by its culture. Their meaning and
purpose are socially constructed through negotiations among present and past
members. Activities thus cohere in a way that is, in theory, if not always in
practice, accessible to members who move within this social framework.
These coherent, meaningful, and purposeful activities are authentic . . .
Authentic activities, then, are most simply defined as the ordinary practices of
the culture." (Brown et al., 1989: p.25)

Another, somewhat related, polemic is the relation of 'social action' to 'social structure' (Giddens, 1979). Rising beyond the level of neurological phenomena, we do not perceive pure forms, unrelated objects, or things as such; rather, our 'reaction' to external stimulus relates to the meanings that things have for us (i.e. Neisser, 1976). Toulmin et al. (1984) suggest that:

" . . . reasoning is less a way of hitting on new ideas for that we have to use
our imaginations - than it is a way of testing and sifting ideas critically. It is
concerned with how people share their ideas and thoughts in situations that
raise the question of whether those ideas are worth sharing. It is a collective
and continuing human transaction." (p.10) william19

We are aware of, and attach meaning to, people, symbols and structures in the world, but as Toulmin sees it, "Our ideas are our own, but our concepts are shared".348 Although we might support the concept of free will, this doesn’t mean I could just choose to speak Chinese any time I like. I’m free to make up a brand new language, or several, but if I want to be understood, if I want to relate to someone else, I have to speak a known language – one known to me, and to those with whom I wish to communicate. The belief is that there is not one objective reality open to all to share, but indeed various or multitudinal realities that individuals create and weave together in social circumstances. Our idea and perception of ‘red’ may differ, but as long as it means ‘danger’ or ‘off’ or ‘stop’ in a given circumstance or context, and this concept – that this stimulus signifies this behaviour of a system – is shared, then we have the prospect of a cooperative or working reality. This is the basis of the notion

348
Of course even at the neurological level the notion of ‘passive receptivity’ to things in the world is
challenged:

“ . . . the nervous system does not collect information from the environment. It produces a
world by specifying what environmental patterns are perturbations, and what change or
alterations have caused these perturbations in the organism.” (Maturana and Varela, 1987: p.167)

of socially constructed realities (i.e. Berger & Luckmann, 1966). Their thesis suggests yet another idea introduced earlier – that new and potentially innovative meanings are derived through communication and negotiation with others. It also states that the realities individuals create (and may take for granted) influence their patterns of interaction, and it emphasises the formation of a relational reality via conversations, actions and reactions, and sequences of events (Von Foerster, 1984; Gergen, 1985).
Taken to extremes, social constructivists would have it that all meanings, all interpretative processes of the phenomena we perceive, originate beyond our selves and lie in the realm of social negotiation, the domain of custom, culture, ritual, habit, orthodoxy, dogma and convention. Consider how there are few bulimics and anorexics, those with eating disorders, in societies where there is hardly enough food to go round. Rather, such disorders mark those celebrity-obsessed and image-conscious societies where food on the table is not an issue. What people do (action) and what they can do, what their options are, which direction they can move, what their relation is to the other/s (structure) has been of increasing focus in critical theory and so-called continental philosophy. It is very relevant here, as the popular image of the designer and engineer is of a heroic figure, one who has complete command of his or her options when applying their creativity to solving problems and developing worthy innovations. However, this is a romantic image; the reality is more complex and even messy. Designers and engineers must cope with multiple inputs and demands coming from various clients, managers and colleagues. In addition to this, the technology and its components or constituent parts may not entirely work as intended, or fuse systemically, throwing up entirely new problems and dilemmas which also need to be prioritised against the demands of clients, managers and colleagues, and so on. This is more than ‘learning by doing’; it is more like ‘learning by struggling’. They must consolidate the design brief with what is possible, what must happen first, and what must be accommodated or compromised, bearing in mind demanding and expectant clients and managers. Such interactions and negotiations promote important learning on the job. Interactivity in learning is "a necessary and fundamental mechanism for knowledge acquisition and the development of both cognitive and physical skills." (Barker, 1994: p.1).

Theories of action and theories of structure have formed a broad debate in the field of the social and human sciences (Giddens, 1979). Whereas theories of action tend to be micro-social, focused upon individuals' views, theories of structure tend towards the level of organisations at various scales. Theories of structure in sociology derive from the work of Karl Marx, Emile Durkheim and more recent perspectives derived from structural linguistics. These commentators are more interested in questions of determination, reproduction and power relations – what you can and can't do, what you are allowed and not allowed to do. This also raises relevant questions regarding the relation of the social self to the cognitive, inner self. In many respects such processes constitute the subject as a product of their structures – we are what we are allowed to do, or be, or think, or act like. However, important to the argument of emergence is that both these approaches reflexively shape one another in further incidences of negotiation, interaction and co-shaping:
"Crucial to the idea of structuration, is the theorem of the duality of structure .
. . agents and structures are not two independently given sets of phenomena, a
dualism, but represent a duality . . . the structural properties of social systems
are both medium and outcome of the practices they recursively organise."
(Giddens, 1984: p.25)

Herbert Simon, for instance, describes a scene that truly captures the determining relations of structure and environment. An ant is walking on a beach. Simon notes that the ant's path might be quite complex. But the complexity of the path, says Simon, is not necessarily a reflection of the complexity of the ant. Rather, it might reflect the complexity of the beach. The point of Simon's scenario is the suggestion that the environment plays an often-underestimated role in influencing and constraining behaviours. Following Simon, Resnick (1994: p.142) suggests that people think of the environment as something to be acted upon, not something to be interacted with. Here is a defining aspect of environment, structure, or context viewed as an open rather than a closed system. This is a critical concept underpinning the notion of 'contextual usability' – that is, usability defined and realised as contingent upon and with other experiential elements of use. It also links to the notion of domestic practice and action within the context and environs of ‘home’.
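Simon's scene lends itself to a toy simulation: an agent with a trivially simple rule traces a path whose twists come from the terrain it crosses, not from the agent. The grid, obstacles and movement rule below are invented solely to illustrate that point.

```python
# Simon's ant as a toy: a simple agent (head for the goal, sidestep when
# blocked, avoid re-treading ground) on a cluttered 'beach'. The complexity
# of the resulting path reflects the terrain, not the rule. Illustrative only.
import random

random.seed(1)
SIZE = 20
obstacles = {(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(60)}
goal = (SIZE - 1, SIZE - 1)
obstacles.discard((0, 0))
obstacles.discard(goal)

x, y = 0, 0
path = [(x, y)]
for _ in range(500):
    if (x, y) == goal:
        break
    moves = [(1, 0), (0, 1), (-1, 0), (0, -1)]
    # Prefer unvisited cells that reduce the distance to the goal.
    moves.sort(key=lambda m: (path.count((x + m[0], y + m[1])),
                              abs(goal[0] - (x + m[0])) + abs(goal[1] - (y + m[1]))))
    for dx, dy in moves:
        nx, ny = x + dx, y + dy
        if 0 <= nx < SIZE and 0 <= ny < SIZE and (nx, ny) not in obstacles:
            x, y = nx, ny
            break
    path.append((x, y))

print(f"steps taken: {len(path) - 1} across a {SIZE}x{SIZE} beach "
      f"with {len(obstacles)} obstacles")
```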

Talcott Parsons (1971) discusses the notion of social systems, defined by the interaction of two or more persons in which each actor attempts to account for the action of other actors, or in which there are common goals of interaction. This is similar to Rogers (1995), who defines them as "a set of interrelated units that are engaged in joint problem solving to accomplish a common goal." (p. 24) In essence these definitions are speaking of processes of aligning ideas and actions. However, within the Cambridge trial, each group, each individual firm on the trial, came from very different sectoral and operational backgrounds; they performed quite distinctive roles in society, and within the world of commerce and public policy, with each often seeking different aims and objectives with respect to their participation. Such influences can lead to mis-alignment in an entire system oriented towards shared and, again, negotiated goals (Molina).

Parsons derived four possibilities for the formation of systems: firstly, adaptive systems (combining external reference and future orientation, e.g. the economy); secondly, systems which are specialised in goal-attainment (internal orientation, future, e.g. the polity); thirdly, systems focused on the integration of system elements (internal orientation, present time, e.g. society conceived as a community); and fourthly, systems which are responsible for the maintenance of long-term patterns (external reference, present time, e.g. cultural institutions in society).

There is one further aspect which Parsons adds to this elementary distinction of four types of systems. Among these four system types he distinguishes those in which the transfer of information has primacy (all cultural institutions and systems) from those which are focused on transfers of energy (e.g. the adaptive economic system). Information-rich systems control energetic systems; the latter, on the other hand, are thought of as conditioning factors which limit the scope of information-rich systems. This argument was taken from Norbert Wiener, and from it Parsons derived a bidirectional hierarchy of conditions and control which interrelated all types of systems.

On the basis of these elementary distinctions Parsons worked for a further three
decades on a social theory which identified in any concrete social system these four
universal functional aspects (adaptation, goal-attainment, integration, pattern
maintenance), which often constitute autonomous subsystems of the respective
system. In an analogy to economics he then added input/output analysis. Systems
and subsystems are interrelated via the input and output of resources which are either
the result or the precondition of ongoing system processes. Among these resources
are the cognitive and motivational resources of participants, and the rights and values
which are attributed to them. These different types of resources are transferred in
exchange processes between systems. For analysing these exchange processes,
without which systems would never be able to procure the resources they need for
their functioning, Parsons created a theory of media of exchange, again starting from
an analogy with economics. He postulated that there is, first of all, money in its
economic function as a medium of exchange, well known to economists. He then
added power, arguing that it is best understood when analysed as analogous to
money, as an exchange medium which mediates the transfer of resources (decisions,
support, responsibility etc.) important in political processes. And after having
developed theories of power and money, Parsons added further media of exchange for
input/output processes between systems, among which influence and value
commitments play an especially prominent role at the level of societal exchanges.
Katz and Kahn (1978)349 elaborate Parsons's scheme and describe organisations as
having five subsystems:

1. Production: (throughput, transformation of materials)
2. Supportive: (garner input resources, deal with output, gain legitimacy)
3. Maintenance: (recruitment, socialisation, training, preserving the system,
rewards)
4. Adaptive: (sense environmental changes and determine their meaning for the
organisation; strategy - product research, market research, long-range planning, etc.)
5. Managerial: (control, coordinate and direct subsystems; develop policies;
use both regulatory mechanisms (feedback) and authority structures for decision
making and implementation).

349
Katz, D., & Kahn, R.L. (1978) The social psychology of organizations (2nd ed.), New York: Wiley
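The input/output idea can be illustrated schematically. The sketch below is not drawn from Parsons or Katz and Kahn but is a minimal, hypothetical rendering of the notion that subsystems are interrelated through the resources they take in and pass on, and that any input not produced within the system must be procured from the environment; all subsystem names and resources are invented purely for illustration:

```python
# A purely illustrative sketch (not from Parsons or Katz and Kahn) of input/output
# analysis: subsystems are linked by the resources they exchange, and inputs that
# no subsystem produces must come from the environment (an open system).

subsystems = {
    "production":  {"inputs": {"materials", "labour"}, "outputs": {"products"}},
    "supportive":  {"inputs": {"products"},            "outputs": {"materials", "legitimacy"}},
    "maintenance": {"inputs": {"legitimacy"},          "outputs": {"labour"}},
    "managerial":  {"inputs": {"information"},         "outputs": {"policies"}},
}

# Everything produced somewhere inside the system.
all_outputs = set().union(*(s["outputs"] for s in subsystems.values()))

# Inputs not produced internally must be procured from the environment.
for name, sub in subsystems.items():
    external = sub["inputs"] - all_outputs
    if external:
        print(f"{name}: depends on the environment for {sorted(external)}")
```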

In the Cambridge trial, as an example, there were three distinctive social groups,
constituencies or networks. The trial itself was founded through the efforts of
two main consortia of companies: one group responsible for the technology and
communications infrastructure, and a second group responsible for providing content
and services. Members of both these groups had a relatively well-defined, easily
understood and common goal - to test the technology and market for i-Tv.

However, the third social group - the trialists - were those who would be participants
on the trial. From the perspective of the first two groups, the trialists were intended to
act out roles as surrogate consumer-users - that is, they would provide feedback
regarding the efficiency of the system and their overall satisfaction with using it.
There would also be some opportunity for tailored, specific questions pertaining to a
service, its look, feel and function. However, for a participating member of the trial,
direct access to users to ask them a whole gamut of specific queries regarding one's
service offering (say, online banking) was limited by Om as manager of the trial and
its various interest groups.

Intrinsically, many trialists had different motivations for getting involved with the
trial. In the early stages it was Om engineers themselves who were the participants.
They were basically 'taking their work home'. It is important to stress that they had
quite different objectives or goals in being involved with the trial (cf. the earlier
discussion of the 'logic of use' in comparison with the 'logic of design'). The
trialists wanted to explore the potential of the new system to deliver entertainment,
although Phase One was more basically concerned with exploring whether the system
was able to convey anything at all. Later, curiosity and novelty were the most likely
motivations drawing people to participate. The trialists were to be central to the
resolution of visions. They would provide data and feedback that would first ground
the potential of the technology to perform its basic functional purpose. They would
also come to comment upon the viability and value of the system of content and
services. Finally they would provide some benchmark of the overall system's
business and market potentials – i.e. what people would be willing to pay for, and
how much.

More than this, their interests, aims and objectives did not remain static. Rather like
the technology that developed and metamorphosed functionally during the trial,
interests, aims and objectives were dynamic features. The interplay of these
elements, their interaction, co-shaping and co-evaluation, created a chaos and an
indeterminacy confounding the realisation of anticipated outcomes. Social systems are
definitely not self-maintaining, because they do not directly generate the components
which realise themselves (their participants in fact generate the new components)
(Hejl, 1984). The applicability of self-maintenance is further complicated by the fact
that these components may participate in multiple social systems at any time, and
they have the ability to withdraw from participation entirely (i.e. trialists were not
people whose sole activity was using the system). This is particularly true in the
artificial use environments or ‘controlled conditions’ (see later chapter on
methodology) of trials, tests and experiments of any kind. Intervening, unwanted,
unexpected elements are always waiting in the wings to contaminate naturalistic
situations of use.

In Parsons's Action Systems and Social Systems (1971) the most distinctive features
of his social theory are illustrated. First, he understands the social system to be a
distinct entity, different from but interdependent with three other action systems:
culture, personality, and the behavioural organism. Second, Parsons makes explicit
reference to Emile Durkheim in his view that social systems are sui generis things in
which values serve to maintain the patterned integrity of the system. Social systems,
like technological systems beyond their ostensive interfaces and controls, are not
always visible except through observation of the various groups, their interactions and
perceptions. These are defined as 'soft' systems, as they are to an extent more mutable
and flexible than technology, which has been described in terms of being 'hard'.
Organisation and reorganisation at the human level are often, but not always, easier
to achieve than the revision of technical systems. Humans can change their mind
more easily than the wiring in their house or television. And thus, especially in
certain sectors, there remains a need for human craftsmanship and agency, even in an
age of automation and mechanisation.

Another notable contribution to social theory relevant to technology is social
network analysis. This approach focuses on patterns of relations among people,
organisations, states, etc. and so is roughly equivalent to cybernetics (Berkowitz,
1982; Wellman, 1988; Wasserman & Faust, 1994). Of recent popularity in studies of
computer-mediated communications (CMC), social network analysts seek to describe
networks of relations as fully as possible, tease out the prominent patterns in such
networks, trace the flow of information (and other resources) through them, and aim
to discover what effects these relations and networks have on people and
organisations. It reflects a shift from the individualism common in the social sciences
towards a structural analysis. This method suggests a redefinition of the fundamental
units of analysis and the development of new analytic methods. The unit is [now] the
relation, e.g., kinship relations among persons, communication links among officers
of an organisation, friendship structure within a small group. The interesting feature
of a relation is its pattern: it has neither age, sex, religion, income, nor attitudes;
although these may be attributes of the individuals among whom the relation exists:
"A structuralist may ask whether and to what degree friendship is transitive. He [sic]
may examine the logical consistency of a set of kin rules, the circularity of hierarchy,
or the cliquishness of friendship." (Levine & Mullins, 1978: p. 17)
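The structuralist question quoted above – whether and to what degree friendship is transitive – can be made concrete with a small sketch. The following is a hypothetical illustration (the names and ties are invented) of treating the relation, rather than the individual, as the unit of analysis:

```python
# A minimal sketch of the relation as the unit of analysis: given a set of
# friendship ties, how far is friendship transitive (if A-B and B-C, is A-C a tie)?
# The data are invented for illustration only.

friendships = {
    ("ann", "bob"), ("bob", "cat"), ("ann", "cat"),
    ("cat", "dan"), ("dan", "eve"),
}

# Treat friendship as symmetric: store each tie in both directions.
ties = set()
for a, b in friendships:
    ties.add((a, b))
    ties.add((b, a))

people = {p for pair in ties for p in pair}

closed, open_ = 0, 0
for a in people:
    for b in people:
        for c in people:
            if len({a, b, c}) == 3 and (a, b) in ties and (b, c) in ties:
                if (a, c) in ties:
                    closed += 1   # the 'friend of a friend' is also a friend
                else:
                    open_ += 1    # the two-path is not closed

transitivity = closed / (closed + open_)
print(f"Transitivity of the friendship relation: {transitivity:.2f}")
```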

Researching complexity

Complexity theory, in its application to business, technology studies and society, is
viewed as an alternative to the mechanistic, linear thinking that typified much
organisational and managerial thinking until recently. Such thinking is often
characterised by simple cause and effect and associated predictable outcomes.
Complexity theory
applied to organisation suggests that most phenomena and processes in the real world
do not reflect such linear thinking. (Wheatly, 1992; Santosus, 1998; McMaster,
1996)

The open systems approach has been chosen to study complexity because it has been
commended for its potential usefulness in "synthesizing and analyzing complexity"
(Simon, 1969) in "live" organisations. Leavitt, Pinfield and Webb (1974) also
recommended an open-systems approach for studying contemporary organisations
that would come to exist in a fast-changing and turbulent environment such as the
information technology sector.

Complexity is an area of great significance to researching phenomena, as it is
indeed the area of most information (McMaster, 1996). One need only think of the
occasion of landing in a strange airport, in a country where one does not speak the
language. Debilitating at first, the chaos is something one becomes accustomed to.
Patterns are
seen to emerge, to become recognisable and familiar, and one is returned to a sense
of what Giddens refers to as ontological security. Szent-Gyorgyi (1971) speaks of
something similar with respect to the practice and process of research:
"If I go out into nature, into the unknown, to the fringes of knowledge,
everything seems mixed up and contradictory, illogical and incoherent. This
is what research does; it smoothes out contradiction and makes things simple,
logical and coherent." (pp.1-5)

The implication here is that research provides the necessary medium or lens through
which the chaos of the natural world can be 'smoothed' in order to show underlying
patterns, which can make sense to the investigator, or to other trained and interested
'others' – managers, politicians, regulators, users etc. Practice, and learning by doing,
or learning by using, also provides a means through which technologies are tamed,
and made to perform in domestication and design and innovation contexts. It is how
expertise, know-how and intuitive reasoning are developed and sharpened. It is how
mere users become 'power' users. Patternlessness, randomness and noise are the
nemesis of decision-makers, long-range forecasters, scientists, economists, stock-
market dealers, policy makers and others who wish to reduce and simplify the causal
effects giving rise to phenomena in order to make predictions. But these are not
chaos – chaos only appears at first to lack pattern; beyond the apparent random
noise are complex interwoven patterns which need to be identified and picked out.
Simplification is crucial for the purpose of representation and generalisation at
management levels, or in the identification of trends.

A senior manager on the Cambridge Trial made very strong allusions to chaos,
complexity and feedback with particular respect to the kinds of new management
philosophies and styles of governance that he saw were needed as a result of new
technological potentials and their need for governance. He took the view that what
they were creating was an 'infinitely reflexive' style of management and company, not
only the apotheosis of total quality management and other customer-centred
marketing philosophies, which were popular at the time, but even an entire
generation of knowledge development beyond this:
" . . . given that the whole process is interactive . . . we're actually building in
quality . . . it’s an inherent process . . . and you don't need to bring it in from
outside as a separate process." (Marcus Penny - senior manager for content
and services)

Indeed he viewed such a system as inherently non-linear, and one of the
problems that he was wrestling with at the time was the expectation, particularly in
the process of management, that issues in interactive systems could be reduced to a
linear process. When this was not deemed possible, it was seen as a problem. This
manager viewed this not just as applicable to the Cambridge Trial but as an industry-
wide perspective - that there is a linear management process at the core, and
everything else is chaos. Indeed, Westrum (1991) has pointed out that:
"What is not understood cannot be controlled. If we do not unpack the way in
which technology is shaped by society, we cannot change the shaping . . .
before we can expand the area of intelligent human choice, we must
understand how technology and society interact." (p.79)

Following Henri Fayol's notion of basic management roles, which over a century ago
launched the whole field of scientific approaches to management as an activity,
monitoring and control are keystones of what managers do. Rycroft and Kash
(1999) assert that leaders of both companies and countries must be: "continuously in
search of new concepts, rules, and models that will be useful in dealing with ever-
changing reality." (p.17) Indeed Shackle (1963: p.13) argues that the entire field of
economics resulted from: " . . . the study of how men seek to cope with two of the
great basic, inescapable conditions of life: scarcity or lack of means; and uncertainty
or lack of knowledge." Management, or getting work done through the agency of
others, requires good quality reporting and information. Similarly, design and
technical problem solving also requires easy access to up-to-date specialised
information and knowledge in order to find the best performing components which
can get the job done. This includes knowledge of interoperability of components,
their specifications and likely fit with the intended system. While the state of the
overall economy may not be knowledge which pertains to certain development tasks
in projects, having an idea of the cost of certain components indexed to performance
and 'fit' may be. Being told of users' or clients' reactions to certain system features
and functionalities may serve to steer development towards certain areas rather than
others. Certain knowledge of the functioning of other related or even competing
technologies can lend benchmarks regarding system response and users' and clients'
expectations. This leads to the idea of ‘managerially-relevant’ knowledge and what
may be termed ‘designerly-relevant’ knowledge. The former is concerned with
knowledge aimed at aiding monitoring, planning, leading and coordinating
development. The latter is concerned with optimising the design and engineering
elements of a given product and system.

Providing scope to understand, contain, control and channel complex knowledge
flows represents the endeavour, and challenge, of recent social scientific theories
aimed at distinguishing the products of, and influences upon, social and
technological action. Such ideas can provide the overviews necessary for large scale
projects.

Techniques of understanding and mapping sociotechnical systems

An immediate question arises concerning how best to map relations within an overall
system that comprises social and technical elements. Both can manifest considerable
levels of complexity within themselves, let alone when considered together as a
'meshed' whole or emergent entity. Influenced directly or indirectly by GST and its
complementary theories, the second half of the last century witnessed a number of
attempts to chart and illustrate the way in which cognitive, social and physical
elements interact in processes of technical innovation and development.

In addition to GST, some of these approaches owe a legacy to the notion of 'scientific
management' (Taylor, 1911) or even Weber (1947) when he refers to the
'bureaucratic phenomena'. While these ideas came to carry negative associations,
they inspired mechanistic models of the organisation. Kling (1987) considers that the
trend towards mechanistic models of technological systems, while often doing justice
to the economic, physical and information-processing aspects of developments, often
ignored the context of complex social actions in which technologies are developed
and deployed. Many information systems professionals, for instance, are still locked
into a mechanistic viewpoint of organisations that tends to neglect the socio-political
and socio-cultural elements of information systems.

Wheatly (1992) asserts that: "Many management strategy theorists either were
engineers, or admired that profession . . . There has been a close connection . . .
between their scientific training and their attempts to create a systematic, rational
approach to business strategy." (p.27) Systemic views of organisation (such as that
of Simon, 1996) suggest that functions can be broken down into component parts for
the purpose of analysis. Frameworks coming from academia
move in quite a different direction. Actor-network theory goes so far as to
deliberately blur any distinction between human and technical components
contributing to a system and its development and use.

What many of these frameworks have in common is a description of the space or
environment where the social or the cognitive encounters the technical. This space or
environment has been called variously the 'system', 'network', 'constituency', 'web' or
'matrix' amongst other terms. Each conveys a strong sense of connection and
interrelatedness between various elements, and so bears direct relation to the core
principles of GST. This interrelatedness can manifest in purely hardware terms (as in
a cable communications network), or in terms of software (such as the flows of
electrons in a public transport system). It may also manifest in the passage of more
symbolic collateral, such as information and knowledge exchanges and flows that
come to inform and shape the perceptions of multiple interest groups.

Approaches, frameworks, tools

There are tensions between the notion of 'tool' and that of 'framework', as there are
between 'practice' and 'theory'.

Some frameworks are aimed towards practitioners – i.e. studies of a 'managerially
relevant' or policy-making nature, or cognitive tools for use within industrial
settings and projects that can guide activity and organisation. Other approaches are of
a more scholarly nature, aimed at providing frameworks of analysis for academic
reporting and publication. They are similar in certain cases; industrial and academic
approaches are not mutually exclusive.350

Many of the industrial 'tools' have roots in public and private research projects. For
instance, the sociotechnical approach - used to optimise organisation - arose from
research conducted at the Tavistock Institute (whose first major project was an
investigation in British coalmines by Trist and Bamforth, 1951), as did socio-technical
systems theory. Socio-technical systems theory rests on two principles:

1 'In any purposive organisation in which men are required to perform the
organisation's activities, there is a joint system operating, a socio-technical
system' - a social and technological system requiring joint optimization of
both.
2 Socio-technical systems are embedded in an environment influenced by
culture - an environment influenced by generally accepted practices and one
which permits certain roles for the organisms in it (Trist, n.d., p. 15). Trist
saw the problem as 'neither that of simply "adjusting people to technology
nor technology to people" but organising the interface so that the best match
could be obtained between both' (Trist, 1973, p. 103).
350
Praxis, a term used by Aristotle, is the art of acting upon the conditions one faces in order
to change them. It deals with the disciplines and activities predominant in the ethical and political
lives of people. Aristotle contrasted this with Theoria – those sciences and activities that are
concerned with knowing for its own sake. Both are equally needed he thought. That knowledge is
derived from practice, and practice informed by knowledge, in an ongoing process, is a cornerstone of
action research.

This ideal can be traced to John Henry Newman, who gave the title 'The Idea of a University Defined
and Illustrated' to a series of lectures originally given at Dublin in the 1850s. Newman thought that
knowledge should be pursued 'for its own sake'. But by this he did not mean pure research. For him
the search for truth was part of an educational ideal which shaped the personality of the cultivated
man, and was inseparable from moral and religious education. Newman thought that the personal gifts
needed for research and teaching were quite different, and that research was best conducted outside
universities. He also described the university as a place of 'universal knowledge', in which specialized
training, though valid in itself, was subordinate to the pursuit of a broader liberal education.

'By autonomy is meant that the structure and organisation of jobs has meant that
individuals or groups performing the jobs can plan, regulate and control their worlds'
(Trist, n.d., p. 23)

Another stream of research into information systems was inspired by contingency
theory, as introduced by Woodward (1965), which has come to inspire the
majority of the earlier literature on identifying user requirements (e.g. Ives et al.,
1983; Bailey & Pearson, 1983; Davis & Olson, 1984; Baroudi et al., 1986). This work
explores the relationship between organisational structures and technical systems.
Woodward revealed that organisational effectiveness was the consequence of a match
between a situation and a structure – part and whole.

There are also tools aimed at optimising technology with respect to addressing
consumer-user needs and requirements. This group includes Quality Function
Deployment (QFD), developed in the 1960s out of work conducted by Japanese
academics into research and development, design and manufacturing processes.351 It
is also worth noting that ergonomic and usability testing of products also emerges
from the fields of physiological research, experimental psychology, and cognitive and
computer science.
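The core QFD calculation can be sketched in a few lines. The example below is a toy illustration, with invented requirements, importance weights and relationship strengths, of how weighted customer requirements are related to design characteristics, which are then ranked by how strongly they serve those requirements:

```python
# A toy sketch of the Quality Function Deployment idea: customer requirements,
# weighted by importance, are related to design characteristics, which are then
# ranked by weighted strength. All data are hypothetical.

requirements = {                 # requirement -> importance (1-5)
    "easy to hold remote": 5,
    "fast channel change": 4,
    "robust casing": 3,
}

relationships = {                # requirement -> {design characteristic: strength}
    "easy to hold remote": {"button layout": 9, "case material": 3},
    "fast channel change": {"tuner response time": 9, "button layout": 3},
    "robust casing": {"case material": 9},
}

scores = {}
for req, weight in requirements.items():
    for characteristic, strength in relationships[req].items():
        scores[characteristic] = scores.get(characteristic, 0) + weight * strength

for characteristic, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{characteristic:20s} {score}")
```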

A more scholarly and sociological style of analysis can be seen in the work arising
from the Science and Technology Studies (STS) school. Chief amongst this group are
perhaps the 'social shaping' (i.e. MacKenzie & Wajcman, 1985) or 'social

351
Most notably Professors Shigeru Mizuno and Yoji Akao. Their purpose was to develop a quality
assurance method that would design customer satisfaction into a product before it was manufactured.
The first large scale application was presented by Kiyotaka Oshiumi of Bridgestone Tire, which used
a process assurance items fishbone diagram to identify each customer requirement (effect) and to
identify the design substitute quality characteristics and process factors (causes) needed to control and
measure it. Earlier quality control methods were primarily aimed at fixing a problem during or after
manufacturing.

constructivist' (i.e. Bijker & Law, 1992) perspectives. Both approaches are similar in
that they seek to explicate the relation of technology to society and to the social
interests and groups operating within society, and have presented this as a dynamic two-way
process of co-evolution and co-influence. Co-evolution is more than symbiosis,
where two or more species align in order to increase their own fitness. Species are
coevolutionary when they learn from one another across time, to the point that they
begin to fluidly exchange behaviors in order to increase the combined fitness of the
collective. One example of this can be seen in the relationship between lions and
hyenas in the African savannah. The constructivist way of approaching the analysis
of technology development and the processes of innovation shows how they may be
interpreted as matters of wider social explanation, in every way as relevant to the
success or failure of projects as purely physical and functional matters.

Coming from the field of cultural studies, Johnson (1986) viewed the diffusion of
technologies as a process of semiotics, meaning-making and representation. Others,
such as du Gay et al. (1996), have elaborated on this work, establishing the notion of a
'cultural circuit'. This is viewed as perpetuating the development of, and subsequent
diffusion of, both ideas and actual products between various institutions and actors.352

[Figure: du Gay et al.'s 'circuit of culture', linking Representation, Identity, Production, Consumption and Regulation.]

352
However, the du Gay et al. model is a social and culturally-based perspective on products and their
diffusion. It neglects somewhat the concept of individual and pragmatic instances of use. In this
formulation use is somehow sublimated into the concept of consumption. Nevertheless, the elements,
institutions and forces which comprise the model each play a distinctive role in shaping one another
and driving what has been termed either 'evolution' or 'revolution' in technology and business
practice, and ultimately culture and the wider society.

Fig. 2.1 Du Gay et al.'s 'circuit of culture' (after du Gay et al., 1997)

Clifford Geertz's (1973: p.5) definition of culture as "webs of significance sown by
man," suggests the cultural circuit as the conduit between discourse and knowledge
of a product and technology. Grant McCracken also subscribes to a 'current' of
meaning which is " . . . constantly flowing to and from several locations in the social
world, aided by the collective and individual efforts of designers, producers,
advertisers and consumers." (p.71)

These authors convey a very live, dynamic process where 'meaning' is constantly
circulating, picking-up, laying the foundations for, and developing new forms of
interpretation and re-interpretation. Such processes can operate on micro (i.e. small
group) as well as macro (i.e. societal, cultural) levels. Of course it is true that
designers and engineers, although privy to exacting and specific knowledge
pertaining to their fields of expertise, are also members of wider society as fathers,
club members, supermarket shoppers and so on. In conducting these roles they
are exposed, as we all are, to wider contextual knowledge which they can relate to
their work projects and to their own personal expectations of technical and social
outcomes.

Wiebe Bijker (1995) draws attention to the design specifications of fluorescent lights
catering to the needs of different interest groups rather than for performance or
economy. The 'dominant' design was a product of the Nela Park Conference - where
representatives of Mazda met with representatives of the electricity utilities - two
highly prominent and relevant social groups. Bijker suggests that the now familiar
lamp, its settled or crystallised major design feature, was indeed designed by the
managers around the conference table, rather than by engineers and designers in a
workshop:
"At that meeting a third fluorescent lamp was designed - not on the drawing
board or the laboratory bench, but at the conference table." (p.238)

What is important to stress from this observation is that managers inevitably bring
more to the table than discussions of merely technical propensities or standards. They
may employ business acumen or negotiate from a position of company mission. It is
as Rosalind Williams puts it: “Much of modern technological activity takes place in
the social context of meetings.” (Williams, 2000, p.

Design as a subject of analysis is at its richest when viewed as the interaction between
the various elements – social, individual and technical – at different periods in
development (or even depending on where the researcher casts, or is given
permission to cast, their gaze). Pinch and Bijker acknowledge this through their notion
of sociotechnical ensembles. While maintaining a distinct prejudice for the social
dimensions in their accounts, this acknowledges that social and technical
elements constrain and shape each other in processes of development.

These approaches emphasise a contextualist approach that attempts to show: "the
internal design of specific technologies as dynamically interacting with a complex of
economic, political, and cultural factors." (Staudenmaier, 1985: p. 11) Current trends
in the history of technology tend to favour contextualist history. Such approaches
emphasise the particularities of the social and historical conditions in which different
technologies have developed. In so doing, they have avoided the excessively
deterministic implications of so many histories which focus on the technology as an
intrinsic phenomenon in itself. Contextualist history builds on an earlier
consciousness of technical differences as illustrated in more traditional historical
accounts, but also reflects a concomitant awareness of how social factors influence
design and development.

Perhaps the most notable approaches in the category of more academically orientated
work include actor-network theory - ANT (i.e. Callon, 1980, 1986; Bijker and Law,
1992; Latour, 1987, 1996) and the lesser known sociotechnical constituencies (i.e.
Molina, 1989, 1990). These approaches represent genuine attempts to straddle the
realms of the technological and the social for the purpose of creating more holistic
attitudes within the area of strategy or policy making. Both approaches warrant some
discussion of their major theoretical elements.

Actor-network theory

ANT is most prominently associated with the French sociologists of science Bruno
Latour and Michel Callon. It represents an attempt to describe how heterogeneous
human and non-human, social and material entities are related to one another within
networks, built and maintained in order to achieve a particular goal, for example the
development of a product.

At its most radical, it puts forward a stance that privileges neither humans nor non-
humans. In its analysis, both are viewed as equal actors and are treated largely in a
semiotic sense. However, in some cases, such as robotics and artificial intelligence,
and in terms of prosthetics and other mechanical, genetic or chemical enhancements
or alterations to human bodies, there are clear cases where the one is substituted for
the other.

Beyond a kind of anthropomorphism with respect to physical objects, such a view
has been adopted by a number of commentators, serving as a useful
reconceptualisation of the relations between people and 'things'. In a companion
piece to Latour's essay, "The De-Scription of Technical Objects," Madeleine
Akrich goes even further toward granting machines an agency coequal to that of
humans.353 For instance, she argues that “technical objects participate in building
heterogeneous networks that bring together actants of all types and sizes, whether
human or nonhuman” (206). In this theoretical move, Akrich ascribes agency to
technological objects through their design, or their “script,” combined with the
designer’s “inscription” and the user’s “de-scription” (207-209). In the process of
inscription, “Designers . . . define actors with specific tastes, competences, motives,
aspirations, political prejudices, and the rest, and they assume that morality,
technology, science, and economy will evolve in particular ways. A large part of the
work of innovators is that of ‘inscribing’ this vision of (or prediction about) the
world in the technical content of the new object” (208). If users apprehend a
machine and act according to the “script,” they have performed a first order scan by
acting in conformity with designers’ wishes. Hence the designer has a primary role
in attempting to define the use of a product, but the designer never anticipates the
entirety of a product’s usefulness. Instead of adhering to a deterministic script “we
have to go back and forth continually between the designer and the user, between the
designer’s projected use and the real user, between the world inscribed in the object
and the world described by its displacement” (pp.208-209). This process of “de-
scription" is "the inventory and analysis of the mechanisms that allow the relation
between a form and a meaning constituted by and constitutive . . ."

Igor Kopytoff (in Kopytoff and Appadurai, 1986), for instance, has drawn attention to
the 'biography of the thing'. In this he suggests that an object, artefact, or technology,
much like an
individual human being, can possess a biography. Throughout their 'lives' there is
change and transformation all around them, and they also change; in registering
these changes they "...can reveal the changing qualities of the shaping environments
through which they pass" (Silverstone et al., 1992: p.17). Stewart Brand's (1994)
How Buildings Learn is an excellent example of a semantic switch between the
primacy of human affairs and that of 'things'. Brand illustrates how the transformations
of buildings over time, and the changes that successive occupants make to them,
represent that building's 'learning'. Such animist perspectives can highlight previously tacit,

353
Akrich, Madeleine. "The De-Scription of Technical Objects." Shaping
Technology/Building Society: Studies in Sociotechnical Change. Edited by Wiebe E.
Bijker and John Law. Cambridge, MA: MIT Press, 1992. 205-224.

unseen relationships which are nevertheless important for understanding and
evaluating human relationships to the designed and built environment, and of course
the objects, messages and symbols that populate, denote and embellish it (cf.
Silverstone’s description of the market as a kind of jungle and the processes of
domestication as ‘taming’ objects).

The view of actor-networks has it that social, physical, cognitive and economic
elements fuse in a 'seamless web' where human action is constrained as much by
technological functions as by social elements, such as the rules and regulations which
may govern use, health and safety, or what can be shown.
approach owes much to systems theory, as another of its distinguishing aspects is its
stress upon the complete dependency of one 'actor' on every other 'actor' in the
network to maintain cohesion. Each part of the network is at the same time
representing several different smaller parts of a whole, while being minimised into a
small part of an even larger whole.

This is hardly a new idea as it relates to GST and to what Herbert Simon (1996)
suggested when he spoke of the 'sciences of the artificial'. This describes objects and
phenomena – artefacts – that result from human intervention in the natural world.
Material artefacts, the knowledge to build and use them, as well as the outcomes of
their operation and use – from climate-controlled air through to cars and the laws that
govern their operation – are each examples of such intervention. Each is
equally as worthy as any other part in being considered in the operation and integrity
of an overall system.354

This is a very similar notion to another ANT theme – that of 'hybrids', or 'quasi-
objects', which can be simultaneously real, social, and discursive - ANT highlights
the significance of each of these in its analysis (see, e.g. Latour 1993: p.91). The key

354
As Norman (1993) says, "Without someone to interpret them, cognitive artifacts have no function.
That means that if they are to work properly, they must be designed with consideration of the
workings of human cognition." The distributed cognition approach (i.e. Hutchins, 1995) is concerned
with a wide spectrum of cognitive phenomena; from analysing the properties of processes of a system
of actors interacting with each other and an array of technological artefacts to perform some activity
(e.g. flying a plane) to analysing the properties and processes of a brain activity (e.g. perceiving
depth).

term ‘actor’, for example, is not used as in conventional sociology where actors are
usually defined as "discrete individual, corporate, or collective social units"
(Wasserman, 1985: p.17); rather, actors' identities and qualities are defined during
negotiations between representatives of human and non-human 'actants'. In this
perspective, representation is understood in its political dimension, as a process of
delegation. The most important of these negotiations is translation - a multifaceted
interaction in which actors:

Construct common definitions and meanings;
Define representativities; and
Co-opt each other in the pursuit of individual and collective objectives.

In the actor-networks, both actors and actants share the scene in the reconstruction of
the network of interactions leading to the stabilisation of the system. But the crucial
difference between them is that only actors are able to put actants – the non-human
elements - in circulation in the system, such as when a local government decides to
implement a new means of public transport. The development of actor-networks is a
concatenation of translations - effort by actors in the network to move other actors to
other positions, thereby translating these actors as well (i.e. the local government
bringing in engineers to implement the work). This concept of translation is the crux
of the ANT approach.

In essence, networks allow actors to translate their objectives, be they conscious
human choice or the prescription of an object, into other actors, adding the other
actors' power to their own. "By translation we understand all the negotiations,
intrigues, calculations, acts of persuasion and violence thanks to which an actor or
force takes, or causes to be conferred to itself, authority to speak or act on behalf of
another actor or force." (Callon and Latour, 1981: p.279) Networks emerge and are
shaped by aligning more and more actors. In this way a network and an individual
actor can develop and grow. The importance of an actor depends therefore on the
number of other actors within his/her/its networks which he/she/it can employ to a
particular purpose.
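The idea that an actor's importance grows with the number of other actors it can enrol and mobilise, directly or through intermediaries, can be sketched as follows. The actors and enrolment ties below are hypothetical and purely illustrative:

```python
# An illustrative sketch (invented data) of actor 'importance' as the number of
# other actors that can be mobilised through chains of enrolment (translation).

from collections import deque

enrols = {                       # actor -> actors it has directly enrolled
    "local government": ["engineering firm", "transport operator"],
    "engineering firm": ["component supplier"],
    "transport operator": [],
    "component supplier": [],
}

def reach(actor):
    """Count all actors reachable through chains of enrolment."""
    seen, queue = set(), deque(enrols.get(actor, []))
    while queue:
        other = queue.popleft()
        if other not in seen:
            seen.add(other)
            queue.extend(enrols.get(other, []))
    return len(seen)

for actor in enrols:
    print(f"{actor:20s} can mobilise {reach(actor)} other actor(s)")
```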

The size and shape of actor-networks are not given a priori but are the result of a long
development. There is no fundamental difference between a large structure and a
small structure; the only difference is in the number of actors that can be employed.
It is a mistake to take differences in size of a network for differences in level,
because networks always connect at the same time what conventional sociology
differentiates into micro and macro levels. This interconnection renders such a
distinction less significant, because "that which is large is that which has successfully
translated others and has therefore grown. Since size is nothing more than the end-
product of translation, the need for two analytical vocabularies is thus avoided."
(Callon, Law, Rip, 1986: p.228)

Networks are made up of network-actors that are always localised, yet these networks
can extend around the globe. Networks can be so large and stable that they appear to
be independent from the actors (such as technical standards). This, however, is a
misconception. While they can (and do) seriously constrain the range of action for
certain actors, they always need actors. Any given actor might be replaceable, but
only by another actor. There is, therefore, no gap between the individual and the
structure that is made up of individuals, which are made up of structure, which is
made up of individuals, and so on, endlessly. For Bruno Latour (1993; p.122); " . . .
the two extremes, local and global, are much less interesting than the intermediary
arrangements that we are calling networks."

The construction of a common sense of purpose, utility, and benefit of a system or
artefact is one aspect of the sociotechnical negotiation. Pinch and Bijker (1989: p.28;
see also Orlikowski, 1992: p.403) discuss the interpretive flexibility of artefacts. All
artefacts are open to various readings over their development. Social groups with
various interests and resources attach meanings to artefacts. These groups shape and
reshape artefacts through the construction of problems posed and solutions offered
by those artefacts. Eventually both artefact and meaning are stabilised through social
negotiation. Once developed, technological systems and artefacts become reified and
institutionalised, and lose the connections with human agents that gave them
meaning and sense (Orlikowski, 1992: p. 404). The notion of interpretive flexibility
brings social constructivist views of technology closer to media, cultural and
consumer studies. At this time these fields were also coming to study less what media
'does to people' and more how consumers/people actively appropriate, and thereby
change, products and services through their integration into daily life and routines
(Silverstone and Hirsch, 1992; Morley, 1992; MacKay and Gillespie, 1992).

One of the failings of ANT, and other constructivist approaches, lies with the fact
that its openness as a system of analysis suggests no real way of mapping or showing
the weight of nodes within the network. What one is left with is rather a 'fog' of
images and perceptions regarding a technology, because in an analysis founded on
semiotics motivations and other inertias in technology development are relegated in
favour of outcomes or effects.
"Actor network theory is a ruthless application of semiotics. It tells that
entities take their form and acquire their attributes as a result of their relations
with other entities. In this scheme of things entities have no inherent qualities:
essentialist divisions are thrown on the bonfire of the dualisms. Truth and
falsehood. Large and small. Agency and structure. Human and non-human.
Before and after. Knowledge and power. Context and content. Materiality
and sociality. Activity and passivity. In one way or another all of these
divides have been rubbished . . . it is not, in this semiotic world-view, that
there are no divisions. It is rather that such divisions or distinctions are
understood as effects or outcomes. They are not given in the order of things."
(Law, 1997, emphasis in original)355

The problem with such a relativist view from the perspective of practically
researching networks is that the notion of defining dualisms from the actors'
perspectives places enhanced stress upon the accurate rendering of actors' 'outcomes'.
In essence this is what is emergent, over and above technologies and the construction
of supporting institutions and organisations. This is basically a problem of
interpretation, which is always problematic, especially if we follow Ricoeur (1971)
when he writes that there may be a:
" . . . specific pluriocity belonging to the meaning of human action . . . human
action too, is a limited field of possible contradictions . . . It is always
possible to argue for or against an interpretation, to confront interpretations,
to arbitrate between them and to seek for an agreement, even if this
agreement remains beyond our reach. In the final analysis differences in
interpretation can only be arbitrated by applying socially accepted modes of
355
http://tina.lancs.ac.uk/sociology/stslaw3.html

justification; i.e., what will count as a convincing argument." (p.331)

As will be described later in the chapter on methodology, one of the problems of
phenomenological investigation is the researcher 'bracketing', or consciously
acknowledging prejudices and subjective interpretations in the research process.

One set of questions that remains vague in ANT is how to limit the analysis; where
does one network end and the next one begin? The question of boundaries to
contextual studies is cited by Morley (1992: p.187) when he considers the pragmatic
aspect of ethnographic studies of television viewing contexts. " . . .which elements of
the (potentially infinite) realm of 'context' is going to be relevant to the particular
research at hand." The opposite is in a sense also true for actor-networks: in effect, is
the network which comes to be realised a 'fair' representation? Bijker and Law are
not absolutely convinced: "in effect it (actor-networks) rests on a bet that for certain
purposes some phenomena are more important than others. It simplifies down to
what it takes to be essential." (1992: p.7)

Sociotechnical constituencies

Sociotechnical constituencies (Molina, 1989, 1990, 1993, 1994, 1995), as a theory
and framework, shares much with ANT and the other social constructivist analyses
of social and technical relations in that it considers in its analysis the plethora of
social and technical elements constituting a development or project. But, unlike
ANT, it strives to distinguish, delineate and taxonomise social and technical
elements. Molina, following other critics of actor-network theory (Clark et al., 1988),
argues that ANT's methodological stance dissolves the understanding of technology
which most people employ in assessing its importance for work and organisations. To
do this is to render it nonsensical.

Constituencies take the brave step of structuring and pre-figuring contexts and
hierarchies. This approach uses a priori theoretical considerations to group criteria
into sets and determine the order of entry of sets, entering variables in a logical order
with those considered exogenous first. In the case of
constituencies this is often technology. It recognises that at a macro level social
institutions may evolve at a slower pace than technology development - an institution
understood as a persistent structure of human relationships (e.g., Powell and
DiMaggio 1991), but alliances, partnerships and deals can be easily broken in a
single meeting with severe consequences to a business. Technologies on the other
hand always exert certain continuities or path-dependencies. Molina sees the blurring
of the distinctiveness of the technological as a hindrance to advancing social
studies of technology towards any aim of informing strategy and policy decisions.
Molina's (1995: p.23) own definition of sociotechnical constituencies is:
" . . . dynamic ensembles of technical constituents (tools, machines etc.) and
social constituents (people and their values, interest groups, etc.) which
interact and shape each other in the course of the creation, production and
diffusion (including implementation) of specific technologies."

Constituencies are built by particular social actors – constituency builders –
champions who manifest particular traits and behaviours and who wish to directly
exploit and develop the particular features and unique value propositions of certain
technologies.

Overall, constituencies provide a comparatively more systemic or structural approach
to understanding how the various elements comprising the constituency of social and
technical elements can interrelate. The main benefit of this approach over actor-
networks is that the common areas of 'constituency' (or network or system) analysis
are denoted and largely pre-figured. This helps format, target and focus research
activity according to particular purposes, and most importantly across cases.

The formulation of typologies is a familiar activity in social science research. A more
formal definition is that a typology partitions events into types that share specified
combinations of factors (Stinchcombe, 1968: pp.43-45). The power of typological
theories and systems is that they comprise a number of contingent generalisations,
allowing the researcher to move across cases with ease and make realistic
comparisons. Classification systems operate very much like typological theories.
Kwasnik (1992) states that:

"Classifications are really very much like theories. Like theories,
classification schemes can provide an explanatory shell for looking at the
world from a contextually determined perspective. Classification schemes not
only reflect knowledge by being based on theory and displaying it in a useful
way...but also classifications in themselves function as theories do and serve
a similar role in inquiry." (p. 63)
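Stinchcombe's point that a typology partitions events into types defined by combinations of factors can be illustrated with a small sketch. The cases and factors below are invented; in the Cambridge case they might correspond to whether the goals of different actors were met:

```python
# A hypothetical sketch of a typology: cases are partitioned into types by the
# combination of two factors. All cases and factor values are invented.

cases = [
    {"name": "VoD service",    "developer_goal_met": True,  "trialist_goal_met": True},
    {"name": "online banking", "developer_goal_met": True,  "trialist_goal_met": False},
    {"name": "games service",  "developer_goal_met": False, "trialist_goal_met": False},
]

def classify(case):
    key = (case["developer_goal_met"], case["trialist_goal_met"])
    return {
        (True, True):   "success for all actors",
        (True, False):  "producer-defined success only",
        (False, True):  "user-defined success only",
        (False, False): "failure for all actors",
    }[key]

for case in cases:
    print(f"{case['name']:15s} -> {classify(case)}")
```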

However, the main shortcoming of this approach is one it shares with many research
approaches and frameworks that offer prescriptive classification typology systems.
Classifications have structural properties that lend themselves to representing
knowledge in a given format or pre-ordained way. As such they may militate against
the observation or proper identification of anomalies or phenomena that defy a
'pigeon-holing' style of classification of socially defined forces. Bennett and George (1997)
see that in the early stages of reflection and research on a complex problem, the
investigator will be reluctant to begin comparative study by attempting to build a
research design and select cases based on a full, logically complete typology, such as
that suggested by sociotechnical constituencies. The use of case studies for the
development of typological theories, and the use of these theories in turn to design
case study research and select cases, are iterative processes that involve both inductive
study and deductive theorising.

An a priori, "logical" approach to typologising outcomes of efforts to achieve
deterrence is likely to settle for a simple distinction between 'success' and 'failure'.
But such outcomes may nevertheless be important, defining, unique or even
crucial features of socio-technical processes. As opposed to actor-networks, which tend
to be internalist approaches to studying socio-technical phenomena, sociotechnical
constituencies is an externalist, often contextualist framework which is constantly
identifying a particular element's or actor's place within an industrial or even national-
level perspective. The contextualist approach is closely connected to the conduct of
case studies (Mjøset, 2009, p. 46).356 It can be handled at the outset of the research in
a more open-ended way, allowing the development of a typology and associated
theory. That is, new cases that are studied may lead to the identification of new types
of 'success' or 'failure' viewed from the goals of the various actors involved (in the
356
Mjøset, L. (2009). The contextualist approach to social science methodology. In Byrne, D. &
Ragin, C. C. The SAGE handbook of case-based methods (pp. 39-68). London: SAGE

Cambridge case, managers, designers and trialists).

Constituencies theory attempts to express the multitude of forces which culminate in
the success or failure of a product vision throughout its production and, to a lesser
extent, its diffusion. It attempts to illustrate the dynamic in which both social and
technical constituents combine and mesh to shape each other in the process of
creating, producing, and diffusing specific products and services.

Technical constituents of technological systems such as the Cambridge Interactive
Television Trial rely strongly on technologies that are themselves the products of
other sociotechnical constituencies. PCs, for instance, are ensembles of technologies
(microprocessors, motherboard, RAM, hard drive etc.) often produced by different
manufacturers, which, on being compatible and obeying standards, work in concert
to produce a working machine. An overview of a much generalised constituency is
shown below.

[Figure: a generalised constituency with the technology 'T' at the centre, surrounded by interacting technologies; industrial trends and standards; players' perceptions of technology, use, value and goals; consumers; the industrial nature of the organisation; the organisation's governance; inter-organisational governance; and collaborative/competitive interaction at the intra-organisational and industrial/market levels.]

Fig. 2.2 Sociotechnical constituency with technology at the centre (after Molina, 1989)

The individual components in the above diagram each play key roles in defining and
shaping a technology ('T' in the centre) from its inception (i.e. industrial trends and
standards, market and consumer trends), through its development and production
(drawing upon more immediate influences – i.e. the organisation's governance,
availability of 'off-the-shelf' components and so on).

As mentioned earlier, technical components are often developed on an individual
basis by multiple producers which have themselves evolved through the development
of their own constituencies - not least the social construction of standards which
ensure their fit with other PC components. This suggests the generative aspects of
constituencies, which truly define them as different in structure from networks or
even systems, which suggest fixed, persistent, identifiable nodes and relations. It
recognises the socio-historical forces shaping technological development, trends and
innovation, and this takes precedence over issues and identification of instances of
translation. It accepts prima facie the existence and influence of pre-existing
sociotechnical structures in defining and inspiring the new. It also recognises that
there is a dynamic shifting in the influence of each of the elements over time, due
largely to the waxing and waning of the various 'sub' constituencies which contribute
to the constituency under analysis (for instance the constituency of interactive
television set top box technology may be dramatically influenced by a change in
either video card technology, or in regulatory policy or by a take-over of the
company). A useful analogy may be to compare constituencies to wave theory as
opposed to the more rigid network structures characteristic of ANT.

[Figure: five meshed constituencies around 'T' – Constituency 1, the interactive television STB; Constituency 2, video cards, whose development enables and inspires the STB constituency and is itself inspired by new chip developments and video standards such as MPEG; Constituency 3, video chips, developed through breakthroughs in material science and machining; Constituency 4, PC market development and the rise of the Internet and e-commerce; Constituency 5, the development of video standards such as MPEG.]

Fig 2.3 Meshing of constituencies as they develop, creating 'ripples' of cause and effect,
inspiration and shaping.

By considering the temporal element in the biography of a technology's development,
one can map how particular elements comprising the constituency wax and wane
in influence (Kopytoff and Appadurai, 1986). Rather than proceeding as if individual
perceptions and actions can be held constant for the purpose of analysis or that
individuals are motivated by a single goal (e.g., profit maximization), analysis
focuses upon how and why perceptions and action are continuously changing
through time and space.
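Such a temporal mapping can be sketched very simply. The constituents, phases and influence scores below are hypothetical, intended only to illustrate how one might trace the waxing and waning of influence across the phases of a development:

```python
# A hypothetical sketch of mapping the changing influence of constituents over
# time. The constituents, phases and scores are invented for illustration; in
# practice they would be derived from the empirical material.

influence = {
    #                          phase 1  phase 2  phase 3
    "STB engineers":          [5,       3,       2],
    "content providers":      [1,       4,       5],
    "trialists":              [1,       2,       4],
    "video standards (MPEG)": [3,       3,       3],
}

phases = ["phase 1", "phase 2", "phase 3"]

for i, phase in enumerate(phases):
    ranked = sorted(influence.items(), key=lambda kv: -kv[1][i])
    dominant, score = ranked[0][0], ranked[0][1][i]
    print(f"{phase}: most influential constituent = {dominant} (score {score})")
```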

Molina (1997: p.2) suggests that in ANT the juncture where the social encounters
the technical "needs to be discovered every time as results of the perceptions and
opinions of other actors." Actor-networks deny the possibility of any imputed
characteristics being applied to either social or technical actors or actants – the non-
human elements - for the purposes of analysis.

Indeed, if one considers the claims of chapter one - that we are not surrounded by an
infinite variety of media technologies, but really a delimited number of variations
which exhibit certain common features and functions – to start from scratch in
investigating their function and how it comes to be encountered as a phenomenon
seems a very long-winded approach to studying technology. Beyond wide screen,
stereo sound, teletext and colour, the television-as-technology has changed little over
the years in terms of basic function, certainly at least, from the user perspective.

This is an important aspect in Molina's sociotechnical model. Technologies,


especially complex technologies, do maintain certain stability, or at least some
generic elements by which they may be differentiated according to how they may
function and perform. This also to a large extent dictates how they will 'fit' with
perceived and anticipated use. For instance they must be robust enough to handle
their perceived day-to-day use and usage. A critical issue with the Om STB and
remote control technology was that it be robust for the mass market. 357 There was also
found to be a lack of 'fit' between remote control functioning and some of the content
elements. The button layout and ergonomics worked fine for VoD use, but were
unwieldy and practically unusable for games use. For Molina, the observable
network, that is, the architecture of relations at each moment in time, contains an
expectation of its future operation. This is often lost in ANT style studies, and so
there is an argument that sociotechnical constituencies theory offers an extended and
more systematised account of the way in which actors' perceptions and visions meld
as social and technical elements combine. There is also criticism of the notion of
'actants' [the non-human 'actors'] in ANT. Molina sees them as only realised as the
result of the perception and opinions of other [human] actors, and not truly valid as

357
On one ‘long haul’ trip to New Zealand, Om had to send a technical specialist accompanying the sales staff equipped with a demonstrator prototype STB. This was early in the trial when technical staff were at a premium and largely indispensable. He was viewed as insurance against any breakdown, and indeed, on arrival there was a serious problem that required the engineer’s expertise to fix before the sales team were able to successfully show the prototype.

having some kind of equivalent status to that of sentient human beings. He also sees this as a reductionism or marginalisation of the role of the technical in the network, as it is reduced to the level of a ‘black box’. 358 This classification does not, however, extend beyond networks because:
" . . . people (who constitute networks) are different, then we are constantly
back to square one with every new process of technology development
confronting the analyst. In short, a recipe for blindness regarding 'the
technical terrain' and for irrelevance regarding this particular dimension of
technology strategy." (Molina, 1997: p.4)

If one considers Fleck (1988) it becomes clearer exactly what Molina is suggesting
here. Fleck defines four generic forms that are consistent with different theories of
innovation and technology. These are discrete technologies, component technologies,
generic system technologies, and configurational technologies. Discrete technologies
function as self-contained packages and require no further interfacing with other
elements to make them functionally relevant. A mature and simple technology such
as a tin opener performs a straightforward function that needs little in the way of
user-led innovation. Its use can be easily pre-figured and anticipated. Component
technologies function as part of a relatively stable system, and as such, the system
characteristics and requirements guide and help denote aspects and specifications.

Generic system technologies refer to "complexes of elements or component technologies which condition or constrain one another, so that the whole complex
works together." The main difference here is that changes to specific component
parts may require change in the overall system. And configurational
technologies are more mutable kinds of systems, in an early stage of evolution,
which will be more fully developed and defined through their situating in intended
working environments. The agency of users can be viewed relative to the
358
The term ‘black box’ is interesting in the context of this chapter. The term is used in electronics to
describe a unit "whose circuitry need not be known to understand its function." (source: Collins
dictionary). In the computer industry it refers to a device designed by another company for general
industry use, for example an audio card. Metaphorically, it is used for a specific kind of abstraction,
where none of the internal workings of something are visible, and one can only observe output as a reaction to some specific input. It also hints that the workings of the box are not of research interest,
either being already well understood and proven, or perhaps, not considered at all necessary or of
consequence to understand.
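To make the metaphor concrete, a minimal illustrative sketch (in Python; invented for this note rather than drawn from the case material) of what treating a unit as a 'black box' amounts to in practice: the observer relates inputs to outputs without any access to, or interest in, the internal workings.

class BlackBox:
    """A unit used only through its interface; its 'circuitry' stays hidden."""
    def __init__(self):
        # Internal state and logic are deliberately private.
        self._gain = 3

    def respond(self, signal):
        # Observable behaviour: an output produced in reaction to an input.
        return self._gain * signal + 1

box = BlackBox()
# An analyst or user can only probe input-output pairs...
observations = [(x, box.respond(x)) for x in range(5)]
print(observations)  # [(0, 1), (1, 4), (2, 7), (3, 10), (4, 13)]
# ...and treat that mapping as the object of interest, never the mechanism behind it.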

stabilisation and standardisation of the system and its components (which is largely
the position of sociotechnical constituencies).

In summary, the sociotechnical constituencies approach offers a framework which has as its focus the fact that technologies come into the picture of development with legacy. This is in terms of their functional characteristics, as well as more macro level
attributes such as their adherence to standards etc. Social elements wishing to
converge upon the technologies must do so in adherence to these legacy aspects, and
so there is a primacy or biased weighting afforded to technology over the other more
fluid 'softer' parts of the development process. This is where it differs from ANT,
which prefers to visit anew the process of development and makes a conscious effort
to not privilege one part of a system over another. What is emergent maps not only
areas of where dualisms arise in the discourse and action of network elements, but
more pragmatically where technologies derive their main impetus for being, and how
institutions and organisations succeed or fail to support this being and its continual
development.

The constituencies of the Cambridge Trial

The case outlined later also goes some way to illustrate the 'parts off the shelf' nature
of the set top box, with parts and people (expertise) drawn from previous and on-
going projects put together to form a singular new product. The consortium of companies which came together to form the technology partners also suggested the rationalised interdependence of people and technology who were invited to take part
in the Cambridge Trial. One of these partners - a cable company – represented not
only a communications infrastructure but also a significant subscriber base of
potential users or trialists.

Constituencies, or actor-networks come to that, do not occur within, or arise from, a vacuum, nor do they emerge merely as the result of industry trends, but rather through
the work and agency of constituency builders. These are institutions or individuals
who draw together the partners and elements necessary for technical developments.

Om was the constituency builder of the Cambridge Trial. And from within the trial
came two major constituencies. One addressed technology, whilst another addressed
the development of content and services. Their roles were critical in that they
consolidated and negotiated between partners in what were essentially two
concurrent constituencies, each of which was often chasing or desiring very
different outcomes and goals. In essence they were to act out the role of the BBC in
relation to the technical systems of Baird and EMI in the early days of television.
Once the technological system stabilised in a particular direction, then new forms of
content and services would be developed by service providers.

Constituency builders are similar to the more familiar roles of product or project
champions that appear in the management literature, except that their main focus
tends to be on the creation and propagation of wider networks of technical and social
elements. In other words, they generate content and context for social and technical
change. When constituencies confront and combine, the nodes illustrate spaces of
possibility and tension between the differing constituencies. In the case study this
may be where the original demonstration box (created within the constituency
generated by Om's managing director Dave Swallow, who may be considered as the
overall constituency builder for the technology) influenced and drove the content and
services visions of Marcus Penny (senior manager responsible for content and
services).359 However, later technical orientations of the box towards CD-ROM
driven devices and so on, created a tension between the two constituencies in Om. It
was indicative of the split between economic realities, market conditions,
technological development, and the innovations of services and content.

Constituencies as complex and dynamic entities emphasise "interrelation and interaction" between the various elements motivating and creating the conditions for
technological development. They are constantly evolving and changing networks of
interrelated elements or nodes, some of which can dominate at times, remain
359
It would be important to stress here that Om, an operating arm of Acorn Computers, was primarily a technology company. Content and services were viewed as an integral part of the technical proposition of a set top box, and to an extent an inevitable part of the ability to illustrate its value. Acorn had already seen the value of selling software for its BBC computer, and realised the potential which lay in selling software. However, as the case illustrates later, providing content for interactive television presented its own unique and, in the end, insurmountable challenges.

constant, be sublimated against other elements or indeed drop out completely. One
may consider or stress either the social or technical contingencies in an analysis of
technological development whilst maintaining the understanding that it is both, in concert at any given moment, that provide an index of the growth or decline of a
product and project.

As social and human elements in a constituency are dynamic and always changing,
likewise technical elements evolve from their inception as discrete developments to
mature and stabilised products, but today more so than ever, products and services
must constantly be innovated in order for companies to remain on top within their
market sector.360 This dynamism is what guides and gives the constituency its shape
and direction. Similarly systems, as more 'macro' versions of individual products go
through similar evolutionary phases. They, too, often have a chaotic inception
(arguments, dis-aligned views etc.) that eventually converges to stabilise. However,
this stabilisation is a correlate of the alignment and adherence to standards and co-
operation between system components and the firms that make them. Also a system
may be comprised of constituent parts, some of which may evolve faster than others,
leading to a destabilisation within the constituency. This is where the ambitions and objectives of the companies making these parts may come to overshadow the advantages of remaining part of the system, of sustaining the consortium and developing their technological contribution. This is the underlying principle that fuels many of
the new media, so-called 'converging alliances'.

Chapter discussion

It is only relatively recently that researchers have come to explore the broader range of cultural practices associated with the design and use of computing in the workplace (e.g. Star, 1995). Systems thinking is becoming more pervasive generally in the way
in which people design and consider technologies and services, and other kinds of
structural processes. Such analysis must not only operate at the social and technical level, but must also include outcomes, or experience, as an integral aspect
360
Amongst the reasons for this is what Drucker (1959: p.23) drew attention to almost 40 years ago;
“The only protection against the risk of exposure to innovation is to innovate. We can defend
ourselves against the constant threat of being overtaken by innovation only by taking the offensive.”

of development (in a similar vein to the distributed cognition approach of Hutchins,
1995). This includes the usability of services, and the 'ergonomics' of social and organisational fulfilment. Computers are no longer aimed at being
surrogates for human tasks. When they are used to deliver a service, for
communication, or for entertainment, the focus shifts to the notion of quality and utility
at the individual and group level. Quality of service cannot be captured by technical
function. It includes a wide range of social roles and provision – both in front of the
screen, and beyond it through the technical and social networks of users, service
providers, and suppliers/distributors, etc. (see below).

[Fig 2.4 places the individual i-Tv user at the centre, interacting with the i-Tv system and with service providers. Effects radiate in two directions: 'deeper' into supplier constituencies or networks (manufacturers, merchants and vendor distribution and value webs, and the products or services relevant for individual consumers and their product networks), and 'more pervasively' into the social networks or constituencies of individuals (family, friends, work colleagues and other social webs), illustrating how a purchase may impact the wider social networks.]

Fig 2.4 How an episode of interaction with the system has 'knock-on' effects 'in front of' the screen (in terms of an individual's social life) as well as 'behind' it (in terms of creating work, changing inventories).

The major difference between researchers trying to capture a complex system and a child playing a simulation game such as SimCity is that in the real world there are often no underlying heuristics or 'rules of thumb' one can properly discern and apply in order to 'win'. The game, like the television programme, is designed for an audience of users; once the pattern is learned and a 'meta-perspective' understood, the game is cracked. Real world phenomena are always designed towards some other
purpose or function, and the main task of research for evaluation or for innovation is
exploration of the elements and how they come to be understood from the
perceptions of those involved.361

Actors, actants, constituencies are dynamic, and are so at different intensities. They
will not stop or slow down for the convenience of analysis. So regarding the
Cambridge Trial, a question arises as to whether one can truly trial or simulate
networks or constituencies? The simple answer to this question is most likely no. In
many, particularly social, senses, they have to be self-forming. Social systems are not
only self-organising systems as has been stressed for example by Luhmann (1984);
they are also adaptive systems which means that they are able to change their rules of
interaction according to particular demands of their environment. Luhmann adopted
the theory of autopoiesis proposed by Humberto Maturana and Francisco Varela
which differs from the Bertalanffy tradition of open systems in looking at systems

361
Maxis software produce the Sim series of computer games. A genuine innovation, which harnesses
the power of computing to generate and run simulations of cities, people, insects and worlds. The best
known of the series is SimCity, where a player experiments with urban development. A housing
development placed here, a police station there, a stadium here, an electricity station there . . . the aim
is creation of a self-sustaining whole, capable of maintaining itself and prospering. Certain implicit
rules are built into the game, and it presumes some knowledge of urban development and governance
on behalf of the player, who as ‘mayor’ adjusts tax rates for citizens and dictates development policy.
Raising taxes too high, not building and distributing enough fire stations, or perhaps neglecting proper
sewage or electricity infrastructures will lead to problems. Natural disasters also spontaneously
intervene in the development of the city, and one must cope with them by making proper additional
changes. During periods that the ‘mayor’ is absent from his office (i.e. when the computer is switched
off and one is not playing the game), events and interactions between events in the game continue to
unfold. SimCity acts then as a concurrent reality for the player. There is little option to ‘win’ except through careful consideration of the many different factors and influences, and of the underlying models and rules of the program. By discovering them the player can control whether the city prospers or falls into decay. Millions of adults and children are working out the underlying strengths and weaknesses in
such models, and via dedicated web sites exchanging tips on how to ‘win’.

Patterns in how people play SimCity offer scholars unprecedented insight into how non-designers
conceptualise urban space, if only because for the first time in history, city planning as both a concept
and an activity has been made readily accessible to people who probably never had the opportunity to
think actively about city form at all. For most, the city as a total system was simply beyond
their control or outside their intellectual purview.
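A toy sketch (in Python, with invented numbers and rules that make no claim to reflect the actual game) of the kind of hidden model described above: a small set of update rules relates the player's choices to the city's fortunes, and only experimentation reveals them.

def simulate_year(population, funds, tax_rate, fire_stations):
    # Hidden rules relating choices (tax rate, provision) to outcomes.
    funds += population * tax_rate - fire_stations * 50   # revenue minus upkeep
    growth = 0.05 - 0.2 * tax_rate                        # high taxes deter newcomers
    coverage = fire_stations * 1000                       # each station protects roughly 1,000 people
    losses = 0.01 if coverage >= population else 0.08     # under-provision breeds decay
    return int(population * (1 + growth - losses)), funds

population, funds = 10000, 5000.0
for year in range(10):
    population, funds = simulate_year(population, funds, tax_rate=0.07, fire_stations=12)
print(population, round(funds))  # prosper or decay, depending on the choices made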

(e.g. cells) as being completely closed on the basis of their own production
processes. Whatever they consist of – elements, structures, processes, boundaries – systems are conceived to produce all their elementary constituents by their own
production processes. Luhmann connected this hypothesis to communication theory.
He then described social systems as autopoietic communication systems which
always produce and reproduce a specific type of communication (e.g. payments in
the economy, published truth claims in the social system of science) and which do
this only on the basis of processes internal to the system. At the same time he held to
the primacy of the system/environment distinction. Regarding autopoietic systems
this means that for them, too, it is true that they only can continue their processes of
production and reproduction of their components if they incessantly observe their
relevant environments and generate information instructive for their production
processes on the basis of these observations.

The term adaptation is to be understood in this context rather generally and not as an antithesis to self-organization: each adaptive system is also self-organizing in the
sense that it always operates according to its own logics. Environment can force an
adaptive system to change its rules of interaction yet the manner of changing is part
of the self-organization of the system and not a simple reaction to the environment.

The reason is that, even with strong governance, organisations, regardless of style, maintain distributed and localised facets. One fundamental difference between the
social and the technical lies in the fact that while one can prototype a technology, and
have people provide feedback on its use and value, one cannot easily prototype a
social group. Social groups are contingent on a variety of perceptions, needs,
motivations and so forth which are very complex in nature, contingent upon diverse
motivations and drives, and privy to a wide range of forces and constraints, for both
individual members, and for the group as a whole.

It is perhaps here that the notion of a constituency or actor-network falls short in providing a proper prescriptive treatment of dynamic development processes. Are
these frames of analysis able to account for such adaptation and reflexivity at the

individual level? Most likely not. They share the same limitation as most ‘big
picture’ sociology, in that the analysis falls short on understanding the underlying
reasons for change, or indeed the relative power of one actor, actant or constituent
over that of another to invoke or engender change and its dynamic. Although
constituencies do highlight and emphasise the role of the constituency-builder in
alignment processes, they do not attempt to understand the relative power and
influence in social and technological terms.

It may be better to think in terms of trials as a kind of biosphere where technologies and social circumstances are cultivated within a view that must accept the need for
creation of context and the right environment. The familiar ecosystem concept -
which connects a biota with its physical environment through transformations of
energy, matter, and information - could be used to integrate humans and technologies
with their environment. Transformation of energy and use of resources in human-
dominated systems depends on the social features of humans as well as the physical
environment and the other biotic components of ecosystems (Burch and DeLuca
1984).

Vast differences lie in the perception of a product between the various social groups
that design, produce and provide goods and services, and the other 'group' that comes
to use and consume. This can deny any opportunity for social group self-formation,
particularly in the early stages of a shared exploratory exercise (such as the Cambridge Trial). As Grudin (1986) suggests, designers are less able to grasp
'user logic', and tend to rely on more familiar and immediate ‘logics’ – what Araya
(1995) terms as "technical thinking" - that are useful in other problem-solving arenas,
such as software or interface design problems. Certainly, Sharrock and Anderson (1994) suggest that even when there is no direct contact with users, they often remain
as a 'scenic feature' of the design process.362

i-Tv is a classic example of a system technology, a whole configured from a range of discrete technology elements – authoring technology, switching technology,
362
They also suggest that, amongst other things, ‘users’ may operate within design processes as ‘rhetorical devices’ – “ . . . being able to couch one’s proposals in terms of user considerations is a
powerful way of ensuring their acceptability.” (p.16)

transmission technology, set top boxes and other elements. However, being a media
system it must also be able to convey, through its technological aspect, a further 'system' of content material. This content material, rather like individual components
of the technological system, is often produced by various third parties, but these must
adhere to certain rules, conventions or genres to be acceptable for transmission. Only
when both the technological and content systems successfully combine will any
recognisable and useful system be apparent for the wider constituency of
stakeholders such as content producers, service providers, retailers, advertisers, and
of course consumer-users. It is these stakeholders and others such as regulators,
government, competitors, service providers, carriers, press and so on – who comprise
the wider social constituency of interactive television systems.

But central within these social constituents are the consumer-users, whose perception of the combined system remains critical in assessing the overall result of the combination of service and technology. From their view it must be capable of
presenting an informative, relevant, useful or entertaining array of content and
services. In many respects it should not be important for them to worry about matters
'beyond the screen'.

Since the time of Aristotle, people have believed that 'the soul' or 'the self' could
distinguish humans from non-humans. Lewis Mumford (1964) suggests that the
technological developments in the 20th century represented an increasing effort to
fully incorporate and assimilate disobedient humans into a system of machines.
Indeed the entire modernist project from the Enlightenment onwards could be characterised as an attempt to objectify the human within machine and bureaucratic systems and
metaphors. Mumford's dystopia was clear: as technology becomes autonomous,
humans become mechanised. From the industrial age onwards came the tradition of
system design focused upon automation, machines that would replace human labour,
increase reliability, and raise productivity. But they still required human operators,
and it was here that ergonomics and human factors engineering had their embryonic beginnings. We know that our eyes are sensitive to wavelengths of light
between 390 and 700 nanometres—that is, from red to violet. By doing comparisons

within that range, scientists have shown that we can tell the difference between 2.3
million and 7.5 million colours.

The same applies to sound. We can hear frequencies between 20 and 20,000 Hertz—
from four octaves below middle C to many octaves above it. Within that range, we
can discriminate between around 340,000 tones.

But colours and tones are easy to probe. Both vary along a single dimension:
wavelengths of light and frequencies of sound, respectively. Smells don’t have an
equivalent. They are complicated cocktails of molecules; a rose, for example, owes
its scent to some 275 ingredients. Levers, dials, buttons – some were found to be difficult to operate physically. “As physical beings, we are unavoidably enmeshed in
a world of physical facts. We cannot escape the world of physical objects that we lift,
sit on, and push around” (Dourish 2001, p. 99). Later, the perceptual aspects of
operating systems became of note, and later still social aspects.

But as producers were forced to change focus to domestic communications and entertainment, new imperatives were placed upon computer developers to form
partnerships with members of the creative and communications industries. Also,
designers and producers needed to understand wider social and individual exigencies
of the new genres of use, and more of the nature of domestic users. Now individual
experiential aspects are important (Pine and Gilmore, 1999).

Preece (1993) points out that many system designers pay only scant attention to the
'human element' of human-machine interaction, with users being regarded as capable of adapting to the use of a system, 'like a cog in a machine'. Her suggestion is of a
strong determinism on behalf of designers, where 'users' and 'use' are a kind of
independent variable in a process of optimising machines for particular tasks. As
Margaret Wheatley (1992) sees it: "We have treated organisations like machines, . . .
We have magnified the tragedy by treating one another as machines." (p.77)

Mechanistic views of humans within HCI prompted Bjørn-Andersen (1988) to directly ask the question "are human factors human?" in an attempt to draw attention
to the fact that humanising technology suggests processes that aim well beyond the
simple optimisation of technology. And the observations of Bannon (1991) also
brought him to question the human in 'human factors' research in their role as an
'independent variable' in a process of optimising technology. In a seminal paper:
From human factors to human actors, he writes:
"Within the HF (human factors) approach, the human is often reduced to
being another system component with certain characteristics, such as limited
attention span, faulty memory, etc., that need to be factored into the design
equation for the overall human-machine system. This form of piecemeal
analysis of the person as a set of components de-emphasizes important issues
in work design. Individual motivation, membership in a community of
workers, and the importance of the setting in determining human action are
just a few of the issues that are neglected. By using the term human actors
emphasis is placed on the person as an autonomous agent that has the
capacity to regulate and coordinate his or her behaviour, rather than being a
passive element in a human-machine system." (pp.27-29)

Notions of experiential qualities such as 'usability', 'usefulness' and 'use' surely cannot
feature when one fails to differentiate between human and non-human elements, as is
the position of ANT. Indeed, Callon (1998) puts forward that: "One of the oft-
mentioned shortcomings of ANT is the poorness of the analysis that it offers in
respect of the actor." Indeed, an approach of treating 'humans as parts of the system'
– as cybernetic users and consumers - is ironically similar to that of the early HCI
studies, or mechanistic perspectives of the organisation, or purely ergonomic views
of engineering, where the human element is considered the 'problem'. Actor-network theory is not alone in this style of analysis. Many earlier frameworks, such as that suggested by Craven and Wellman in their discussion of The Networked City (1973),
characterised the social network approach by its analytical emphasis upon:
"The primacy of structures of interpersonal linkages, rather than the
classification of social units according to their individual characteristics . . .
[It] gives priority to the way in which social life is organized, through
empirically observable systems of interaction and reliance, systems of
resource allocation, and systems of integration and co-ordination." (pp.1-2)

Craven and Wellman take the view that the concept of networks is scalable from the whole network level to a 'network of networks': network groups connected to other network groups by actors sharing membership in these groups. This operates in a

number of ways. People are usually members of a number of different social
networks, each based on different types of relationships and, perhaps, different
communication media. The structural arrangements between people and people via communications networks may derive value from a purely structural analysis. However, such analysis can only be performed upon networks which are
fully formed, sustained, and which have a perceivable structure. The structures of the
social and technical networks on the Cambridge trial were constantly shifting, as they
may do in many innovative, experimental-type development situations.
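As a purely illustrative sketch (in Python, with invented names and memberships rather than data from the trial), the 'network of networks' idea can be read structurally: groups become linked wherever they share members, and such an analysis presupposes that membership is stable enough to enumerate.

from itertools import combinations

# Hypothetical memberships: each person belongs to several social networks,
# each based on a different type of relationship or communication medium.
memberships = {
    "family": {"ann", "bob", "cara"},
    "workplace": {"bob", "dev", "eve"},
    "online_forum": {"cara", "dev", "fay"},
}

# Two groups are connected in the 'network of networks' if they share members.
links = {
    (a, b): memberships[a] & memberships[b]
    for a, b in combinations(memberships, 2)
    if memberships[a] & memberships[b]
}

for (a, b), shared in links.items():
    print(f"{a} -- {b}: shared members {sorted(shared)}")

Such a structural reading, of course, only becomes possible once the networks are fully formed and their membership has stabilised; on the Cambridge Trial both were still shifting.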

There are parallels here with the statistical treatments of the individual in audience
research (already cited in the previous chapter) and within economics. Miller (1995)
is one who, from the perspective of recent studies of consumption, criticises the
primacy of economics in its study of society. He sees that economics is a social
science discipline that "cut itself off from social studies," and this led to an abstracted
view of the world. (p.12) He sees that political decision-making and policy relies too
much on economics:
". . . the discipline of economics has achieved unprecedented power in the
world today in large measure precisely because it has justified the complete
neglect of the topic of consumption." (Miller, ibid.)

Predominantly, his critique hinges upon how economists create an image of the
'aggregate' consumer.363 Actual consumption behaviour and choices give way to an
implicitly normative behaviour, representative of the rational decision making and
self-interest which drive persons to consume. ‘Research and development’,
'innovation' and 'markets' are each abstractions, myths, and can often obfuscate the
complexity of real-world processes and phenomena. As Wartofsky (1979) has it:
"... our own perceptual and cognitive understanding of the world is in large
part shaped and changed by the representational artifacts we ourselves
create. We are, in effect, the products of our own activity, in this way; we
transform our own perceptual and cognitive modes, our ways of seeing and
363
In audience research there is a similar criticism:
“The procedure is one of head counting and the purpose is to artificially convert the
many situated instances of consumption - ultimately unknowable in their totality -
into manageable, calculable units. As ways of comprehending the lived experience
of actual audiences, these methods would be doomed to failure. Within the logic of
the ratings discourse, though, an ‘audience commodity’ is created to be traded for
financial gain. Its fictionality, does not hinder its economic functionality.” (Moores,
1993: p.3)

of understanding, by means of the representations we make." (pp. xx - xxiii)

This is true, even of those studies that wish to 'open the black box' of technology, as
they often 'close' or even ignore the 'box' of the individual designer or consumer-
user. Westrum (1991: p.172), for instance, points to the fact that "the
evolution and structure of a technological system are shaped by the social institution
that sponsors it." Yet in the same volume he draws attention to the potency of user
innovation. Economic determinism parallels technological forms of determinism in that it forecloses consideration of the reality of use in relation to technology.

While marketing hype can serve to carve out a space for a new product or system in the mind of the public, actually getting the technology to work according to its original design, albeit within certain tolerances and with the need for 'tweaking', recalculations of budgets, and/or re-appraisal of orientations and commitments, provides a necessary stage where it can be tested. There are no more relevant testers than potential
consumer-users. Testing the technology using surrogate consumers – i.e. trialists –
provides an attempt to realise and benchmark further problems of deployment,
implementation and delivery of services. They may then wish to shift the emphasis
of the trial towards a marketing phase in order to gauge an idea of an appropriate
charging system and commercial potential for such services. Throughout these
processes, new partnerships and alliances may form, and others come to encounter
the technology and its potentials.

If the technology and its potentials meet expectations then a new successful product
is realised. If it does not then it may be relegated to the domain of expensive failures.
However, aspects or components of the system may be developed into separate
products which could be successful, or there may even begin further processes of
innovation and negotiation breeding a new version or revised version of the original
system.

Conclusion

There is a need for the researcher to have worked out his or her theoretical approach prior to entering the field (Cicourel, 1964). However, going in with a strong hypothesis (or classification system, such as Molina's socio-technical constituencies approach) regarding the outcome or content of what will be found is a mistake which may lead
to self-fulfilling prophecies. Conversely, going in without some clear idea on the
form and type of concepts that one is looking for (as in actor-network theory, which purports to be entirely interpretivist in its approach) may lead to paralysis when it
comes to carrying out meaningful observation and integrating findings.

In sum, the systems approach, and its derivatives, has been established as an attitude
of mind to facing complexity; it reflects a search for the interrelatedness of things in
problematic situations. The purpose of actor-networks and socio-technical
constituencies is to map relations between elements, specifically for the purposes of
advancing academic theory and policy-making. As opposed to physical tools, such as machines and computers, these are cognitive tools aimed at expanding the horizons of researchers, designers, producers and managers. They relate "ideas to ideas, ideas to data, and data to data; they encourage team members to communicate more effectively with each other." (Cohen, 1995: p.2)

Systems comprise 'hard' and 'soft' elements, social and technical, actors, actants or
constituents, function, expertise and experience. Some of these elements fall somewhere in between hard and soft (included here can be style, whose creation can employ considerable agency and resources, as well as greatly impact market acceptance). It is possible, with some measure of objectivity, to map changes in the
technological infrastructure of a network, the technical terrain of a given problem to
define and taxonomise elements in terms of functions, specifications etc. It is also
possible to map changes in organisation and to a lesser extent expertise.

Technological change or innovation today often occurs as projects such as the Cambridge Trial - events happening within the other flows of normal business that
companies conduct. As such they do represent a bounded series of events and groups
of participants. Many of these projects are complex in organisational character

involving various actors from many different organisations. Some projects 'spin out', becoming separate firms or operating divisions. For instance ATML spun out of Acorn, as did ARM Ltd. Om became an operating division with its own HQ based
in Cambridge Technopark. Projects are often also very complex with respect to the
technology involved.

Many components comprise the whole technical system or network. We must recognise that some components may have a distinct legacy in what came before, and that systems, perceptions, even possibilities, are in fact configurations. Their uniqueness as
projects is stressed by actor-network types of analysis; however their generative
function is stressed by sociotechnical constituencies. This was certainly the case in
the Cambridge Trial where the Om STB was composed largely of 'off-the-shelf' parts, the communications infrastructure was already in place and operating as a
successful business, and much of the basic content material was in an already
produced state. Other components need adaptation and development to respond to
some, as yet, unforeseen need of the system to effectively perform its operation and
function.

Within the Cambridge Trial this was creating the 'mortar' which would join the
components together and get them working as an effective whole. For instance the content (comprising largely the games and educational software) and the video footage were drawn from Acorn's education division and Anglia Television
respectively. Development work was needed on both of these elements such as
'porting' the software to the system – making it work on the new platform and with
the remote control rather than a PC keyboard. The video footage also required
editing and digitalisation. The largest piece of 'mortar' work was the interface
development.

The interface has a dual-faced Janus-like quality. It is the site where not only do all
the functional aspects of the system's purpose converge in relevant, purposeful and
useful ways, but also the place where these functions are represented symbolically, in a meaningful way, to prospective users. This is a task not only of engineering

but also of aesthetic ingenuity, employing considerable symbolic and semiotic
invention, social and cognitive sensitivity and understanding, and a feel for relation.
In addition to the interface, there were further content elements to be developed as
further partners joined the content and services group – the principal services
providers or PSPs. These included catalogue-style screens depicting goods,
interactive advertisements and the online surveys and questionnaires.

While technology and organisation may be mapped, to map the complex of perceptions and influences that shapes a constituency or network is a more difficult
task. These lie very much in the 'soft' end of the spectrum of systemic elements. This
is the challenge of the contextual usability approach detailed later, which presents a
hermeneutic model of mapping perceptions of a product's characteristics, attributes, features and functions. ANT as a practice is inductive. It often involves prolonged and gradual acquisition of knowledge: through induction, data slowly evolve into concepts and specific research propositions through the investigator's own increasing skill and understanding. This distinguishes it from sociotechnical constituencies,
which carries the implication that all conceivable interactions are already partitioned
and classified, and it is simply a matter of 'filling in the spaces' so to speak. It is
comparatively deductive. But this is not unlike many approaches favoured by
industry. Quality tools such as QFD, and even usability engineering approaches,
offer prescriptions for industry practitioners who have neither the time nor the resources to
engage in a protracted research project. Here lies the difference between research for
academic ends and for achieving practical results.

Silverstone (1994: p.85) questions the relevance of systems and actor-network models, in their preoccupation with the processes of the production of technologies
(which they most definitely share with socio-technical constituencies and many other
social constructivist style studies i.e. Pinch and Bijker). Both actor-networks and
constituencies sublimate or 'black-box' the use process and any other form of
experiential aspects of the product. They intentionally do so to focus analysis upon
the relation of parts within the system, actors, actants or constituents that they come
to identify. Both are theories rich in concepts, and both have developed a specialised vocabulary. Most importantly, both are also emergent, themselves under continual development, and lack any real set of heuristics on 'how to do' or 'how to apply' them.
The leaning in this book is towards constituencies but it would be erroneous to
ignore some of the useful aspects of actor-networks that are missing in
constituencies. Most prevalent here are the semiotic aspects which drive and motivate technical projects; while there is emphasis upon the strength of these aspects in actor-networks, constituencies tend to incorporate symbolic attributes under the heading of
'perceptions' or 'perception building'.

However, one of the major criticisms is that the actor-network theorists make very
few references to sources outside the fields of sociology and the history of science and
technology. But the same can also be said of Molina's sociotechnical constituencies.
Both carry the tradition of general systems theory in that they wish to offer an
overarching explanation for processes that are indeed extremely complex in nature,
and difficult to explore in detail as well as depth. In this they obey the tenets of a
methodological dualism which has pervaded the development of practical social
sciences, namely a dualism between generalizing (‘nomothetic’) natural sciences,
and specifying (‘idiographic’) social/human sciences (Mjøset, 2009, p.41), but like
their industry counterparts, they do provide insights which help to develop an
impression of the 'big picture', which in turn can stimulate the creative mind and
resources of the reader.

Chapter 3 – The Contexts of Use
"What IS the context of design? I seem to have two answers: it is our minds, our
lives, as persons, as beings able to always imagine, to perceive, to remember, and to
be, beyond the range of whatever others and we may try to serve our need or to
control our behaviour can foretell; It is also 'evolution'. The so slow, and seemingly
mindless evolution of all things, natural and artificial, which so often seems to
exclude the 'most rational' or 'most intelligent' actions and to encourage what looks
like sheer stupidity." (Jones, 1991: p.204)

"Contrary to our normal ways of thinking . . . Openness to environmental


information . . . spawns a firmer sense of identity . . . high levels of autonomy and
identity result from staying open to information from the outside. We tend to think
that isolation and clear boundaries are the best way to maintain individuality."
(Wheatley, 1992: pp.92-93)

The useless

Hui Tzu said to Chuang Tzu: “All your teaching is centered on what has no use.”

Chuang Tzu replied: “If you have no appreciation for what has no use, you cannot
begin to talk about what can be used.

“The earth for example, is broad and vast, But of all this expanse a man uses only a
few inches Upon which he happens to be standing at the time.

“Now suppose you suddenly take away all that he actually is not using, so that all
around his feet a gulf yawns, and he stands in the void
with nowhere solid except under each foot, how long will he be able to use what he
is using?

Hui Tzu said: “It would cease to serve any purpose.”

Chuang Tzu concluded: “This shows the absolute necessity of what is supposed to have no use.”

Chuang Tzu (369-286 BC)


The Chuang Tzu, Ch. 12: “The Universe”
translated by Herbert A. Giles
George Allen & Unwin, Ltd., London, 1926, pp. 124-125

(Quoted by Werner Heisenberg, The Physicist's Conception of Nature,
Hutchinson Scientific & Technical, London, 1958, pp. 20-21)
(quoted by McLuhan, M (1962) The Gutenberg galaxy: the making of
typographic man Toronto: University of Toronto Press, p.29-30)

Introduction

The previous chapter closed with the suggestion that holistic treatments of product
and service development processes should go beyond simple identification [or
classification] of social and technical elements (or constituents). It must also consider
their interrelation and interaction as producing outcomes, and perhaps more
importantly, emergent experiences.

This chapter is chiefly concerned with outlining the underlying principles of contextual usability (CU), an approach developed concurrently with, and in relation
to, issues arising from the Cambridge Trial. CU was influenced by more general
shifts in epistemology and research practice that began to manifest in human-
computer interaction (HCI) research and social studies of technology in the early
1990s. At this time, these fields, and others such as cultural studies, audience and
consumer research were coming to place a greater emphasis upon microsociological
levels of analysis – i.e. focusing mainly upon individuals and groups. They also had
an interest in the influence of localised social environments, technologies and

mediated information upon individual perceptions. To do this they were employing more interpretivist, naturalistic and ethnographic approaches in their studies.

Within industry this change gave birth to the much-vaunted consumer-orientated approaches in business practice and marketing, and to the similarly acclaimed user-centred design principles within the development of technology and services.

Brach defines contextual knowledge as: "knowledge linked to the context in which it
is gained rather than formal knowledge." (p.2) And Erlandson et al., (1993) point out
that naturalistic inquiry is very dependent upon context:
"This stems from its fundamental assumption that all the subjects of such an
inquiry are bound together by a complex web of unique interrelationships . . .
[this] provides a context that at one time both restricts and expands the applicability of the research . . . contexts provide great power for understanding and making predictions about social settings . . .
Interpretation is both limited and enriched by context." (pp.16-17)

While an outcome (i.e. profit, the development of skills or attitudes, or perhaps a function or purpose) is something often shared between individuals, the nature and
quality constituting 'experiences' tend to be much more personal and subjective.
Following Woolgar (1996) and Mackay (1995) in their advocacy of extending the
social shaping perspectives of technology by applying some of the thinking of media
and cultural studies:
"Design and development processes may encode preferred forms of
deployment in a technology (via its technical possibilities), which are
reinforced through marketing. It is in this semiological sense that one might
propose that the technology is a form of text. . . . The subjective social
appropriation of a technology is . . . a crucial force in the social shaping of
technology – one which cannot be 'read off' from either the physical
technology or the social forces behind its development." (Mackay; p.45)

Who we are, what we know, what we are familiar with, and how we cope with change all matter towards how we accept, or do not accept, the products we come to be made aware of.

Why focus on users?

But why focus on customers or users? What do they know of the difficulties of
design and risks of big development or advertising spends? Are they not waiting in
the wings with bated breath for new and enhanced products with wonderful new
features? Every day on the high street, following rules of engagement which were
developed over hundreds of years, restaurants dictate via their menus, and shops
display their wares with little consultation with customers. They represent and dictate
what the owners/managers wish to offer, or what the buyers decide what is 'in' and
fashionable. Even 'post-fordist' manufacturers such as Benetton (see Livingstone and
Lunt, 1992) rely upon trends registered by sales, as opposed to pre-empting what customers like or dislike about existing products (I shall cover this in more depth in the
following chapter discussing system-logging of use and consumption by digital
systems). As a consumer, one can choose to eat or buy at a particular restaurant or
shop or move on. Menus are shown outside and goods are on display. Managers may
even change their layout, staff, displays, and even the deco and arrangement of
furniture. They very rarely consult consumers, beyond ad hoc conversation, on substantive issues regarding ‘content’. It is ‘take it or leave it’. Choice is
selection and consumption, not necessarily involvement in production. However, my
small, neighbourhood restaurant, particularly on identifying me as a regular
customer, may, through conversation, be able to offer some ‘customised’ dishes catering for a particular fancy or food preference. McDonald’s, on the other hand,
while being able to offer some leeway regarding choice of how a burger is put
together will hardly cater for a wheat allergy by supplying buns made of rye bread,
regardless of how often I eat there. Why should it be different for technical
innovation? Indeed, in the case of major or 'radical' innovations, a number of
commentators have suggested that consumers may have difficulty linking
characteristics with outcomes (Westrum, 1991; Ortt and Schoormans, 1993).

Indeed Tauber (1974) puts the case against user-centred approaches, when he says
that most innovations, and the need for them, are 'beyond' the foresight of most
consumers. Perhaps good reason why, beyond lip service paid to adopting user- or
consumer-orientated approaches to service or design, many high-tech products rely
on presumptive consumer needs rather than on any 'real' identification of potential

'buyers' desires (Shanklin and Ryans, 1984). Proctor and Williams (1994: p.4) see
that part of the reason for this is that: "end-users and designers don't inhabit the same
environment and share a common practice."

There are many convincing cases from the field of architectural design that clearly
illustrate the consequences of disregarding user or community needs in design.
Perhaps the most infamous of these was the Pruitt-Igoe Project, a product of the
United States post-war federal public-housing program. Completed in St. Louis in
1956, this mammoth high-rise development was an overly optimistic rationalisation
of social design principles, for the most part influenced by the radical French
architect Le Corbusier.364

Twenty years is a very short 'life expectancy' for buildings in view of the tremendous
investments. This is particularly so when compared with pre-modern architecture which,
in European cities, has survived over hundreds of years, and which remains a major
attraction for much tourism. But Pruitt-Igoe does live on. It does so symbolically as

364
Although the architect Le Corbusier never designed any buildings in the United States, the progress
of urban renewal in the U.S. enabled developments that greatly mirrored his style in their density and
open space characteristics – such as the Pruitt-Igoe developments. Hall (1996) considers the
meticulously ordered and clean environments of Swiss cities as a major influence upon the young
architect. His background in a family of watchmakers led to such declarations as "a house is a machine to live in.” (cited in Hall, p.204) Hall suggests the problem with such an analogy: “ . . . people are not
escapements, and society cannot be reduced to clockwork order.” (p.205) Hall further suggests that Le
Corbusier's vision was shaped by his life in Paris. He apparently hatched a plan to obliterate the
historic core of Paris north of the Seine, and replace it with eighteen, 700-foot high towers. His view
was that, " 'the design of cities is too important to be left to the citizens - we must decongest the
centres of our cities by increasing their density. In addition, we must improve circulation and increase
the amount of open space.” (p.207) The contemporary city planned by Corbusier;

" . . . was to have a clearly differentiated spatial structure. And this was to correspond to a
specific, segregated social structure: one's dwelling depended on one's job . . . The center of
the city would be the office towers of the elite, including industrialists, scientists and
artists . . . Twenty-four of these towers would provide for between 400,000 and 600,000 top
people's jobs at 1200 to the acre, with 95 per cent of ground space left open . . . Outside this
zone, the residential areas would be of two types: six-storey luxury apartments" for the elites,
"with 85 per cent of ground space left open, and more modest accommodation for the
workers, built around courtyards, on a uniform gridiron of streets, with 48 per cent left open."
(p. 209)

Le Corbusier had no time for any kind of individual idiosyncrasy; well did he call them 'cells'.
Likewise, the units "would all contain the same standard furniture. Possibly, he admits, 'my
scheme . . . at first might seem to warrant a certain fear and dislike.' But variations in layout, and
generous tree-planting, would soon overcome this.” (p.209) Quite obviously Le Corbusier’s vision
was not only of architecture but of social engineering. Hall sees that, “His simple-minded egomania
and his total political naivety made it difficult for him to understand his failure." (pp. 211-212)

an icon of the failure of overly rationalistic and overly-presumptive design ideals.
Indeed, it is often taken as symbolic of the whole modernist programme in society. 365
Liberals perceive it as exemplifying the government's appalling treatment of the
poor. Architectural critics cite it as proof of the failure of high-rise public housing for
families with children (Hall, 1996).

Within the space of only a few years, disrepair, vandalism, and crime plagued the
development. The project's recreational galleries and skip-stop elevators, once
heralded as architectural innovations, had become nuisances and danger zones. Large
numbers of vacancies indicated that even those desperate for housing preferred to
live anywhere but the project. In 1972, after spending more than $5 million in vain to
cure these problems, the St. Louis Housing Authority, in a highly publicised event,
demolished three of the high-rise buildings. A year later, in concert with the U.S.
Department of Housing and Urban Development, it declared Pruitt-Igoe
unsalvageable and razed the remaining buildings.

It was only after the fact that sociologists were brought in to forensically explore
why the well-intentioned design principles failed. They found that; "a solution
appropriate for a particular group of people was provided for tenants with very
different sets of needs, values, and attitudes toward housing and the use of space."
(Lang et al., 1971: p.15) This is perhaps an outstanding example of how design,
independent of sampling or capturing the needs and requirements of potential and
actual users, can lead to unforeseen or even disastrous consequences, both in terms of
desirability and in terms of even safety.

I have already drawn attention to the fact that without the capturing of needs and
requirements, explicit and tacit, of those who will use, consume or live with a
product, designers or planners have no option but to rely upon their own logics
365
In The Language of Post-Modern Architecture (1987) Charles Jencks proclaims the death of high
modernism. In doing so, he is able to time that death to the moment - the cloudless July 15 in 1972 when the first three buildings of St. Louis's infamous Pruitt-Igoe housing complex were dynamited.
Indeed, Jencks is even able to provide the cause of death: the inability of this architectural style to
create liveable environments for the poor, in great part because the poor are not the nuanced and
sophisticated "readers" of architectural space the educated architects were. Jencks's argument in this
book is for a semiotic reading of architecture -a reading he believes is possible with post-modern
architecture, which is much more clearly referential than high modernism.

regarding what they produce. But it remains that what can be discovered, what it is
possible to discover, and what it is necessary to discover, are quite different matters in
terms of their attainability.

Design Presumption

Jenny Preece in her book on usability offers a cartoon that poignantly suggests the
problems of design presumption. Two horses are piloting an aircraft and obviously
experiencing some difficulty in using the controls. One turns to the other and says,
"darn these controls, who designed them – racoons?" The suggestion is blatantly
clear. There is often a deficit between design intentions and purposeful use. Another
example of the presumptions made in, and of, design comes from Richard Dawkins,
who quotes the theologian William Paley, writing at the beginning of the nineteenth
century. Paley provides an interesting example that captures the layperson's
apprehension of a complex design. Paley noted that watches are very complex and
precise objects. If one were found on the ground, it would be difficult to believe that
such an object had been created by random chance. Instead the inclination would be
to suppose:
" . . . that the watch must have had a maker: that there must have existed, at
some time, and at some place or other, an artificer or artificers, who formed it
for the purpose which we find it actually to answer; who comprehended its
construction, and designed its use." (Paley, quoted in Dawkins, 1986: p.41)

Until the mid-nineteenth century, when science was still very much tied to religious
belief, the thinking regarding 'use' or 'purpose' extended to human biology. Leonardo
da Vinci's 'Proportional Study of Man in the Manner of Vitruvius' was drawn about
1487. Renaissance thinkers saw a kind of mathematical perfection in the human
form. This image depicts the human body within the ideal form of the circle and
within the perfect proportions of the square (see below).

Fig. 1.2 Leonardo Da Vinci's 'Proportional Study of Man in the Manner of Vitruvius' in Kemp
(1981: p.115)

Later, philosophers considered that man, cast in the image of God, was the
microcosm regarded "as a universe in miniature." (Godwin, 1979: p.68) 'Everything'
converged symbolically upon the human body (see below). It was considered that
since the internal organs possessed separate functions but nevertheless relied upon
each other for sustenance, then it was rational to consider mankind as an instance of
something that also must have some implicit purpose.

Fig. 1.3 Robert Fludd's (1617) 'Macrocosm and Microcosm' in Utriusque cosmi historia
(reprinted in Godwin, 1979: p.60)

But only God, who was viewed as residing at the top level in the hierarchy of order,
would know of this. The notion of use in Paley's example, applied to human beings,

could only be a matter of divine providence. Dawkins' example highlights something
of the implicit nature of the concept of 'use'. In particular it draws attention to the
relevance of its application under certain conditions and in particular circumstances,
and its awkwardness in others.

But notions of use have changed. The advent of the Enlightenment, empiricism, the
rise of positivism and the scientific measurement of human attributes came to be
reinforced in the late nineteenth century by the emerging fields of physiology and
experimental psychology (Meister, 1999). These fields supplied data that could
provide some index of generalisable human characteristics and capacities. This
was becoming more relevant with the growth in [standardised] machines used in
industrial settings, which gave rise to issues of safety and productive efficiency
(machines having dials, levers etc. conveniently situated and easy for the human
hand to operate). It also had a very relevant place in the design of mass manufactured
products. It was becoming necessary to design products that would be appropriate for
multiple users, and that would cater for the largest percentage of the general
population.

A strong example today is the car industry. This industry, with its strong tradition
of mechanisation and automation, has carried out a great deal of research on the
dimensions of the human body, mainly because it is vital for comfort and safety
that a car seat and dashboard controls be correctly suited to the human
body. Most car seats are designed to fit 95% of men and almost all women.
The 95th percentile value for male seat breadth is 421mm. If the width of the seat
were only 362mm, the 50th percentile value for females, then 50% of women and
around 70% of men would not fit. The following table shows some of the data
that is available to car designers.

Table 1.1 Anthropometric data adapted from MIRA (Motor Industry Research Association)
reports, representative of British car drivers. All figures are in millimetres. Seat breadth
is the distance between the hips when seated.

Males                            Percentile value
Dimension      Mean    Std Dev     5     10     30     50     70     90     95
Stature        1738.1   68.00   1626   1655   1704   1737   1773   1824   1851
Buttock-knee    610.7   29.16    563    574    596    611    626    647    659
Knee height     563.4   28.19    520    529    548    563    578    600    611
Seat breadth    375.2   26.08    336    344    361    373    387    410    421
Arm length      786.3   34.78    730    743    768    784    804    832    844

Females                          Percentile value
Dimension      Mean    Std Dev     5     10     30     50     70     90     95
Stature        1624.5   56.01   1537   1553   1594   1623   1652   1699   1719
Buttock-knee    600.9   26.89    560    567    586    601    616    636    646
Knee height     540.6   26.73    499    507    526    540    553    575    588
Seat breadth    364.0   26.92    325    333    349    362    376    399    412
Arm length      721.8   30.34    672    681    707    724    736    760    771
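To illustrate the percentile reasoning used above, the short sketch below takes the male and female seat-breadth means and standard deviations from Table 1.1 and, assuming for illustration that the dimension is normally distributed, estimates the proportion of drivers a seat of a given width would accommodate. It is a minimal sketch only; the variable names and the normality assumption are mine, not MIRA's.

```python
from statistics import NormalDist

# Seat-breadth statistics (mm) taken from Table 1.1 (MIRA data), modelled here,
# purely for illustration, as normal distributions.
MALE_SEAT_BREADTH = NormalDist(mu=375.2, sigma=26.08)
FEMALE_SEAT_BREADTH = NormalDist(mu=364.0, sigma=26.92)

def proportion_accommodated(dist: NormalDist, seat_width_mm: float) -> float:
    """Estimated share of the population whose seat breadth fits the given width."""
    return dist.cdf(seat_width_mm)

for width in (362.0, 421.0):
    men = proportion_accommodated(MALE_SEAT_BREADTH, width)
    women = proportion_accommodated(FEMALE_SEAT_BREADTH, width)
    print(f"Seat width {width:.0f} mm: ~{men:.0%} of men, ~{women:.0%} of women accommodated")

# A 421 mm seat (the male 95th percentile) accommodates roughly 95% of men and a
# slightly larger share of women; a 362 mm seat (the female 50th percentile)
# accommodates only about half of women and roughly 30% of men.
```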

Today, the notion of 'use' applied within the context of sentient human beings
highlights particular problems of ontology and epistemology, particularly in the
fields of artificial intelligence and cognitive science. Finding generalisable human
traits, and using them to design artefacts and objects remains a double-edged sword.
The discussion offered at the end of the previous chapter cited the concerns of a
number of HCI commentators that the role and place of the human in the design of
computers and computer systems has been somewhat compromised. It would seem
that the project of much of this kind of research is an attempt to arrive at a standard
image of the human being - a scientifically derived version of Leonardo's Man in
the Manner of Vitruvius - one that maximises market relevance and profit. To speak of
the 'use' of human beings is politically charged, suggestive of emotive issues such as
embryo research, slavery, or at the very best 'Taylorism' and 'Fordism' – treating the
worker as a 'cog' in the machine.366

But closely tied to the notion of the use of other human beings are the roles we play
in everyday existence. Erving Goffman's The Presentation of Self in Everyday Life
(1959) provided a detailed description and analysis of process and meaning in
mundane interaction. Interaction from this perspective is viewed as a "performance,"
shaped by environment and audience, constructed to provide others with
"impressions" that are consonant with the desired goals of the actor (p.17). The
performance exists regardless of the mental state of the individual, as persona is
often imputed to the individual in spite of his or her lack of faith in - or even
ignorance of - the performance.

However much we perform particular roles for other people, and for institutions,
'use' appears clumsy when applied within the context of the 'function' of human beings,
and it is also clumsy within the context of familiar technologies. For instance we do not
commonly refer to our 'use' of television, but rather to how we 'watch' or 'view' it - its
'role' within our lives. The ability to somehow sample or capture such explicit or
implicit understandings of systems can be relevant for the purposes of informing
innovation, or for evaluating technology. For instance how does the concept of the
'use' of television (i.e. as a babysitter, a friendly sound in an empty house etc.) compare
with the notion of 'viewing' it?

A recurring design presumption, in either a technical or social sense, would have it that
individuals are either fixed, or flexible and adaptable. They are treated as fixed when it
comes to predicting how they will use or value something, such as when users are
'crystallised' into the mean of their characteristics through the use of anthropometric or
other ergonomic data. Alternatively, they are viewed as flexible or adaptable

366
It is also interesting to note here the notion of the 'abuse' of children. While there may be legitimate
use of drugs, what is the 'use' of children?

when it comes to matters of learning to cope with, or use, the idiosyncrasies of the
design. This has been the approach of statistically informed views of [massed]
society and the economy, where those who lie outside the 'bell-curve' of normative
behaviour, size and ability are viewed, at best, as anomalies not economically viable
to cater for or, at worst, as misfits and deviants. It is within this frame that
'universal design' (i.e. Keates and Clarkson, 1999) – design which strives to cater for
persons of all kinds of abilities – has developed.367

Interpretation and hermeneutics

I have suggested, citing actor-network theory, sociotechnical constituencies and
Johnson's cultural circuit concept, that ideas about technologies propagate between
interest groups, company functions, manufacturers, standards committees,
distributors and many others. As myth or fact, these ideas impact upon, shape and
culminate in a final product offered to end consumer-users.

The products themselves, their characteristics, attributes, features and functionalities,
combine with the discourse accompanying their passage into the marketplace via
marketers, distributors, agents, retailers - situated individuals in various
environments - social, physical, geographical and cognitive - who confirm, dispute or
verify the value of the product and any accompanying myths.368 Each of these helps
propagate the culture of production.

I have already suggested that the physical product or the service outcome straddles
the realms of 'cultures of production' and 'cultures of use'. But they can only do this
through consumer-users carrying out the acts of appropriating, consuming and using

367
A number of different terms have been used to describe the goal of non-exclusive design. These
include: Design for Disability; Universal Design; Transgenerational Design; Design for All; Design
for a Broader Average, and other terms (Hewer, 1996)
368
According to Donald (1991) the most significant achievement made possible by the use of
language was ‘mythic invention’. Exploiting the fundamental narrative organisation of oral language
(such as suggested by Bruner, 1986), language-using cultures began to construct overarching myths in
order to explain human existence and its relation to the non-human world. As Donald argues: "Myth is
the prototypical, fundamental, integrative mind tool. It tries to integrate a variety of events in a
temporal and causal framework. It is inherently a modelling device, whose primary level of
representation is thematic." And on this basis he concludes: "modern humans developed language in
response to pressure to improve their conceptual apparatus, not vice versa." (1991: p.215)

the product, albeit in different ways, and even for different ends. These ends also
come to shape the myths and realities of what is finally produced in the marketplace.
The success of some products and the failure of others suggest that there is perpetual
tension upsetting any notion of symmetry and balance between the visions and
realisations of these two cultures. This process can be understood as a form of
communication between producer-designers and consumer-users. For instance, in
manufacturing approaches such as quality function deployment (QFD), practitioners
speak, using linguistic metaphors, of parsing the 'voice of the customer' with the 'ear'
of the engineer (Hauser, 1988; Hauser, 1993; Hauser and Griffin, 1993).

Based on anthropological, neurological and linguistic evidence, Donald (1991)
suggests four stages in the historical development of human cognition (all of which
remain active): episodic (case-based), mimetic (tacit, gestural), mythic
(linguistic) and modern (based on extended memory: pictures, writing, computers).
The invention and refinement of spoken language must have brought about a radical
shift from the cultures preceding that of Homo sapiens. Speech added a new and
more powerful mode of interpersonal interaction, utilising a representational system
with greater precision and comprehensiveness of reference to objects and actions and
their location in space and time. It also provided means for reflectively connecting
events through relationships of purpose, reason and causality and so for the
development of narrative meaning making (Clarke, 1992). This eventually
crystallised into writing and other forms of depictions.

Whereas written texts date back some 5,500 years, the earliest example of a mass
produced text is understood to be the printing of the Mazarin Bible using movable
type designed by Gutenberg in 1456. Printing as a technique diffused quickly across
Europe and by 1500 some thirty thousand different titles had been produced (Black
and Bryant, 1995). The wider availability of books engendered and enabled wider
literacy, but also the potential for differing interpretations of the same text. This was
particularly pertinent in relation to religious texts where individuals and groups could
sequester themselves with the Gospel and develop interpretations that were at

variance with church dogma. Black and Bryant suggest this as a key reason why the
Christian Church began to fragment into different denominations. Not only does this
serve as an example of the potency of media and media technologies in influencing
social institutions and human affairs, but it also suggests that similar or identical
objects – in this case text – can have varying meanings for individuals and certain
groups: "texts are always open to multiple interpretations." (Thomson, Locander and
Pollio, 1989: p.147)

Hermeneutics is a practice whose roots lie in the interpretation of sacred texts. The
aim was the elucidation of divine meaning through close reading. It brought the
recognition that individual textual components have to be dealt with as part of a
larger whole. Its practice became more widespread, moving beyond the confines of
the church, and it was believed that in principle it was possible to determine an
'objective', immutable meaning in a text, either as intended by the author or as
contained in the text itself. This was termed 'hermeneutical theory'. This relates to
technical products: Woolgar (1996) sees that products are created intentionally to
perform certain functions and to fulfil certain needs of certain groups of people. He
notes that increasingly, considerable effort is expended on the part of designers and
marketers to employ appropriate research to fulfil such intentions.

Betti (1980) views the text as a process of the author making his mind objective.
The 'task' of the reader or listener is to re-experience, re-cognise, and re-think
(Verstehen) what the author originally felt or thought (Bleicher, 1980: pp.110-112).
Betti held that misunderstanding grows with increases in the space and time between
the author and reader. Philosophical hermeneutics took the position that
interpretations are not decidable. In other words understanding is not the objective
re-cognition of an author's intended meaning, but instead it is a practical task in
which the interpreter is changed by becoming aware of new possibilities of what it is
to be a human being. Phenomenological hermeneutics is based largely on the work of
Paul Ricoeur and mediates between a recapture of an objective sense of the text and
an existential appropriation of its meaning into understanding. Semiotic-structural

analysis acts to show how the text works and what it says before the sense of it is
used to give insights into the interpreter's own situation.

Pre-understanding

A key concept emphasised in hermeneutics is [pre-]understanding. This follows
from the recognition that:
" . . .prior to any interpretation, we and the object of our interpretation exist.
In advance of any reflection, we belong to a cultural world. The implication
of being-in-the-world is that the interpreter and that which is interpreted are
linked by a context of tradition – the accumulation of beliefs, theories, codes,
metaphors, events, practices, institutions, and ideologies (as apprehended
through language) that precede the interpretation. While taken for granted,
ordinarily unnoticed, and never made fully explicit, tradition nonetheless
weaves together the interpreter and that which is interpreted. " (Arnold and
Fischer, 1994: pp.56-57)

Here, Arnold and Fischer raise a theme that should be familiar in the discussion
offered here so far. Specifically, this is the interrelation and interaction between
cultural contexts, objects and situated individuals. This chapter considers that
interpretations of technologies can gain from the adoption of a hermeneutical
position in their analysis. Such a position has been advocated by a number of
commentators, perhaps most obviously by Steve Woolgar (1991), who suggests that
there is research value in treating technology as 'text':
"When construed as a text, technology is to be understood as a manufactured
entity, designed and produced within a particular social and organizational
context. Significantly, this is often done with particular readers or sets of
possible readers in mind - it is fabricated with the intention that it should be
used in particular ways. On the consumption side, the technology is taken and
used in contexts other than, and broadly separate from, its production." (p.92)

There was also the application of theories drawn from the fields of cultural, media
and communications studies – most notably Stuart Hall's (1980) encoding/decoding
model - which has been extrapolated into the realm of technology innovation and
diffusion studies (MacKay and Gillespie, 1992). The aim here was the expansion of the
social shaping models advanced by social constructivists such as MacKenzie and
Wajcman (1995). While these social-shaping models tended to focus on the
production of technology, as well as macro level sociological processes, it was felt

that they neglected the mechanisms of apprehension, appropriation and consumption
– that is, the individual responses to technology.

Also contributing to micro-level sociological analysis is the work of Roger
Silverstone and Leslie Haddon (such as in Silverstone and Haddon, 1996). They
argue, following the cultural studies perspectives offered by earlier cultural
researchers such as Johnson (1986), for a view of innovation as a social and cultural
activity, every bit as much as a political and economic one.

Each context - itself a product of a realised or tacitly understood constituency - may
either promote or abate the innovation and diffusion of new products, and so shapes
technology development as well as the circumstances and situations leading to use.
Not least of these contexts is technology itself, and its path-dependencies: forces that
constrain, if not the cognitive aspects of shifting to new frames of thought, then certainly
the economic and other defining aspects of innovation. This again picks up Molina's
argument in favour of sociotechnical constituencies over actor-network approaches
to the study of technological development. To ignore the 'technical terrain' - what is
available and pre-existing in terms of the state of the technology and its legacy in
standards etc. - is to attempt to start from a clean slate with every study. Other pre-
existing influences include the competencies of the firm, its accumulated knowledge,
and the economic climate.

From the field of the management of technology, Fleck's (1988, 1993) notion of
innofusion (a blend of innovation and diffusion) suggests that the innovation of
particular technologies (his example was industrial robots) can only be fully realised
through their diffusion into sites, existing expertise, and processes of implementation.
It suggests that workplace contingencies are idiosyncratic, composed of individual
and diverse influences stemming from accumulations of formal and informal levels
of knowledge, ways of doing things, and other technologies. This suggests the need
for plasticity in the functional design of technologies, and concerted innovation effort
on behalf of users and management.

The advent of sophisticated graphical interfaces and the wider diffusion of PCs
contributed towards a massive resurgence in user-centred design methods in the
1980s. Carroll (1997) notes that by 1990 there was a clear consensus that the
cognitive modelling approach had failed to provide a comprehensive paradigm. An
'ethnographic turn' was arising in human computer interaction studies, overturning
the laboratory bias that was inherited from HCI methods developed from cognitive
science and experimental psychology (which I have already touched upon earlier,
and will do so in more depth in the following chapter). In an early paper which drew
attention to context with respect to usability, Bevan et al. offered a model of context
which addressed all those immediate dimensions which can influence the experience
of usability:
“Ease of use determines whether a product can be used, and acceptability
whether it will be used, and how it will be used. Ease of use in a particular
context is determined by the product attributes, and is measured by user
performance and satisfaction. The context consists of the user, task and
physical and social environment.”
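As a rough illustration of how this framing might be made concrete, the sketch below encodes the determinants named in the quotation - product attributes, a context made up of user, task and physical and social environment, and ease of use measured as performance and satisfaction - as simple Python data structures. The field names and example values are my own shorthand, not an established notation from Bevan et al.

```python
from dataclasses import dataclass

@dataclass
class ContextOfUse:
    """The context within which ease of use is experienced (user, task, environment)."""
    user: str                   # e.g. "first-time set-top-box viewer"
    task: str                   # e.g. "record one programme while watching another"
    physical_environment: str   # e.g. "living room, low light, viewed from 3 m"
    social_environment: str     # e.g. "shared family use, children present"

@dataclass
class UsabilityAssessment:
    """Ease of use in a particular context, measured by performance and satisfaction."""
    product_attributes: tuple[str, ...]
    context: ContextOfUse
    task_completion_rate: float   # performance measure, 0.0 - 1.0
    mean_satisfaction: float      # e.g. mean rating on a 1-5 scale

    def summary(self) -> str:
        return (f"'{self.context.task}': {self.task_completion_rate:.0%} completed, "
                f"satisfaction {self.mean_satisfaction:.1f}/5")
```

In such a scheme the same product attributes would be assessed separately for each distinct context of use, reflecting the model's claim that ease of use is not a property of the product alone.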

Particular approaches which precipitated out of this epistemological change, or
paradigm shift, were contextual inquiry (Raven and Wixon, 1994) and contextual
design (Wixon, Holtzblatt, and Knox, 1990; Holtzblatt & Beyer, 1997). Contextual
Inquiry (CI) techniques were first adapted from ethnographic research methods to fit
the time and resource constraints of engineering environments (Holtzblatt & Beyer,
1997). CI presumes idiosyncratic contingencies arising from implementation, and
aims to collect data on users, knowledge and work practices in the actual
environment of use.

Determinants of usability (from Bevan et al, 1991)369

Davis' Technology Acceptance Model (TAM) (i.e. Davis, 1986; 1989; 1993 and
Davis et al., 1989) is a further example of an attempt to unpack the distinctive
relations between perceptions of a technology's use and utility, compared to its
actualised use and utility. Perceived usefulness is defined as the degree to which a
person believes that using a particular system would enhance his or her job
performance; perceived ease of use as the degree to which a person believes that
using a particular system would be free of effort (1993: p.320).
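To make the shape of the model concrete, the sketch below shows, in schematic form, how the two TAM constructs are commonly operationalised: each is scored from a handful of questionnaire items and then combined into an indicator of acceptance. The item wordings, weights and scoring are invented for illustration and are not Davis's published instrument.

```python
from statistics import mean

def construct_score(item_ratings: list[int]) -> float:
    """Average a set of 7-point Likert ratings into a single construct score."""
    return mean(item_ratings)

# Hypothetical responses from one respondent (1 = strongly disagree, 7 = strongly agree).
perceived_usefulness = construct_score([6, 7, 5, 6])    # items such as "would enhance my job performance"
perceived_ease_of_use = construct_score([4, 5, 4, 5])   # items such as "would be free of effort"

# TAM posits that both constructs feed the intention to use; a simple weighted sum
# stands in here for the regression model that is usually estimated from survey data.
intention_to_use = 0.6 * perceived_usefulness + 0.4 * perceived_ease_of_use
print(f"PU = {perceived_usefulness:.1f}, PEOU = {perceived_ease_of_use:.1f}, "
      f"intention score = {intention_to_use:.1f}")
```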

A common shortcoming of all these studies, as previously suggested, is that many of
these fields of research (Fleck, Davis, Contextual Inquiry) have tended to focus on
machines for office and industrial environments, whose underlying ethos is
relatively easy to comprehend - improving efficiency, productivity or safety. It is a
fact that business has so far been the major site for computer developments, with
well-established IT budgets aimed at continuously adopting new information
technologies to remain competitive, raise productivity, and improve decision making
(Alavi & Joachimsthaler, 1992; Straub & Wetherbe, 1989). Even when Davis speaks
of perceived usefulness, it hinges upon: "the degree to which a person believes that
using a particular system would enhance his or her job performance." (Davis, 1989:
p.320) Economic concerns, then, predominate over those relating to more diffuse,
nevertheless familiar, concepts such as 'entertainment'.

I have already gone some way to highlighting the unique properties of television as a
technology, and of the home as a space of use and consumption. Purposeful activity
in the home can be quite different from that of the workplace.
"The experiential fact that people voluntarily accept considerable
inconveniences to drive the car of their dreams, live with the furniture that
they like, or wear clothes for which they are admired, suggests that other than
369
Bevan, N., Kirakowski, J. and Maissel, J. (1991) What is Usability? Proceedings of the 4th
International Conference on HCI, Stuttgart, September.

technical criteria dominate everyday life and individual well-being."
(Krippendorf, 1995: p.157)

And so a different scope of sensitivities is needed when approaching the capture of
relevant data for informing design or for its evaluation. More general consumer
research (e.g. Miller, 1995), whose focus has always been on a wide variety of
products, perhaps lends a more relevant frame for the study of technologies
aimed at individual use. Even Donald Norman concedes that usability may pall
against other culturally defined aspects of products:
against other culturally defined aspects of products:
"In the consumer economy taste is not the criterion in the marketing of
expensive foods or drinks, usability is not the primary criterion in the
marketing of home and office appliances. We are surrounded with objects of
desire, not objects of use." (Norman, 1988: p.216)

Researchers such as Erdem and Swait (1998) have argued that the equity endowed by
consumers to a brand arises from the value the brand provides them in terms of
reduced search costs and decision simplification. There is the suggestion that brands
and the particular features that differentiate individual products will mark out their
place in the marketplace, and ultimately in consumer consciousness. However, this
focus on brand and features tends to obfuscate the phenomenological aspects of the
product that locate it within an individual's everyday life - that is, that domesticate it.
Marketing focuses mainly upon the point of consumption, and design informed by
such thinking concentrates also on the point of sale - i.e. designing a consumer
electronic product so that it will stand out, style-wise, on the retail display amongst others.

Earlier I referred to the inadequacy of the term 'task' when applied in the instance of
interactive as well as traditional forms of television. It is better to substitute 'tasks' in
these cases with 'experience' or 'meaning'. The main aim of the CU approach is to
provide a frame for mapping interpretations of a given product or service when it is
considered as an experiential whole, i.e. as circumstance and activity (use), a quality
(usability), a practice (usage), and as a value (usefulness). Meanings are captured in
language about CAFFs. Nelson and Gruendel (1979: pp.2-3) draw a useful distinction
between “a person’s verbal or enactive reconstruction of a common event and the
conceptual representation underlying this construction.”

These elements are viewed as culminating in the institution of a use process which
occurs when technologies are apprehended, appropriated, appreciated and used over
their lifecycle and within the lives and lifestyles of consumer-users. As such they
bear direct relevance to attempts to unpack user-consumers' holistic impressions of a
technology. Used as a component of a larger macro-level mapping process, such as
sociotechnical constituencies, it can suggest a channelling of use knowledge relevant
to various parts of a larger sociotechnical constituency.

I have already outlined throughout the previous chapters how various commentators
have noted that language is developed through social processes and communication
is a social act, and that language plays a role in constituting human
interpretation of situations, and even in how one designs oneself through the use of
language (cf. earlier references by Berger and Luckmann, 1966; Winograd and Flores,
1988; Schein, 1985; Mohan, 1993; Arnold and Fischer, 1994). Following the notion
that a person’s reality or experience is conveyed in language, and in research
processes most often documented and conveyed in text, it seems important now to
draw upon work which has sought to identify the core nature of communication and
text. This will then help to understand more fully the relation involved in the concept
of ‘technology as text’ (Woolgar, 1991).

Text

The etymological basis for 'text' comes from the Latin texere - to weave. Text
commonly refers to "written or printed words and form of a literary work." 370 Within
the social sciences, and under the influence of semiotics, texts and their 'reading'
now refer to the interpretation of many diverse forms of phenomena and activities
such as performance-based text, literary journalism, and narratives of the self,
amongst other phenomena (Denzin, 1997). The basis for this argument is that
language, representation, description and interpretation are pivotal to building our
explicit and private realities. Spradley (1979) for instance suggests that much of
culture is encoded in linguistic form. And as Ricoeur (1977) noted, we begin to
370
Longman Dictionary.

unpack the meaningfulness of lived experiences by presupposing that it is as
purposeful as a written text. Like a literary text, a social action "constitutes a
delineated pattern which has to be interpreted according to its inner connections."
(p.322) It means that anything known to a person may be viewed as constituting a
text, as any public knowledge of it must be rendered and represented in either words
or actions. Indeed, Daniel Dennett (1987) called for nothing less than a completely
general theory of representation - one which would explain how words, thoughts,
thinkers, pictures, computers, animals, sentences, mechanisms, states, functions,
nerve impulses and formal models can be said to represent one thing or another.

Considered as textual phenomena, technologies could be said to be 'inscribed' with
features, functions and attributes through which they are distinguished and
characterised, read by those who use, sell, design and develop them. As Stephen
Talbott (1997) suggests, every technology already embodies certain human
choices. It expresses meanings and intentions. He suggests that a gun was designed
to kill living organisms at a distance, which gives it an "essentially" different nature
from a pair of binoculars.

This is akin to what Joseph Weizenbaum pointed out when he suggested that there can
be no "general-purpose" tools (Weizenbaum, 1976: p.37). Such inscribed characteristics
accrue throughout the processes of ideation, design, distribution, appropriation,
consumption, use, and disposal. As text is to the written book, these characteristics are
the crystallised, purposeful aspects of a given technology. For a technology, such as the
Cambridge System, to be effective: ". . . there must be a fit between technology and task
and between individual characteristics and the technology." (Hubona and Geitz, 1997) In
order to be relevant, successful or 'fit' with an anticipated 'audience' or environment,
the benefits of these characteristics need to be transmitted to, and understood by,
those who will pay for them. But "all understanding is a product, in part, of the
interpreter's [pre-]understanding." (Arnold and Fischer, 1994: p.57)

Research work in cognitive psychology and psycholinguistics has emphasised the

creative activity of the reader. Cognitive psychologists explain the interpretative act
of reading in terms of 'schema theory'. The notion of a 'schema' (plural 'schemata' or
'schemas') derives from the work of the British psychologist Sir Frederic Bartlett who
in his classic work Remembering (1932), defined it as "an active organization of past
reactions, or of past experiences." (Bartlett, 1932: p.54) Bartlett explained memory
as a creative process of reconstruction making use of such schemas. According to
contemporary schema theory, perception, comprehension, interpretation and memory
are mediated by mental schemata - hierarchical structures (or 'frames') for organising
knowledge. Hollenbeck and Slaby (1979) put this strongly when they speak of the
way in which children learn to 'decode' experiences drawn from the televisual
environment.
"Children are growing up in an environment in which they must learn to
organize experiences and emotional responses not only in relationship to the
physical and social environment of the home, but also in relation to the
omnipresent screen on which miniature people and animals talk, sing, dance
and encourage the purchase of toys, candies and breakfast foods. . . [children]
must learn not only to decode the verbal utterances of parents and friends or
to establish schemata for the meaning of the smiles and frowns of adults
around them, but they must also learn the special conventions of the
television medium, its smaller than life frame, its appearances and
disappearances of characters, intrusions of irrelevant commercials to
otherwise engrossing story material, the meanings of zooms, fade-outs,
miraculous superhero leaps, and flashbacks." (p.226)

Cognitive theorists place emphasis on the fact that for each new environment a
subject encounters, they bring pre-established schemata based on previous
experience and fantasised anticipations about what may be expected in a situation.
Schemata have been built up over repeated interaction with a given environment as
well as reflections upon the experiences of the interactions. Some schemata are more
complex, more integrated or organised and differentiated than others are. Plans or
anticipatory schemata are not only specific to situations, but involve a search and
selection faculty, related to the kind of information to be processed or the kind of
social setting one anticipates.

In an obvious act of constructing meaning Halasz (1988) sees that reading texts is a
process in which the reader interprets information based on his/her pre-existing

biases, expectations, and perceptions. Even the most mundane texts require the
reader to go beyond that which is explicitly stated in order to make sense of them,
though we are normally unaware of the extent of such interpretation in our everyday
reading. Readers draw upon different repertoires of schemata, partly as a result of
relatively enduring differences in background (i.e. experience and knowledge) and of
relatively transitory differences in viewpoint (i.e. purposes). For experienced readers
reading is a continual process of making inferences, evaluating the validity and
significance of texts, relating them to prior experience, knowledge and viewpoint,
and considering implications. Such psychological accounts do not suggest that a text
means whatever a reader wants it to mean, but simply that readers must make active
use of schemata to make sense of the text, and that different readers may employ
different schemata and may vary in their interpretations. Reading is not passive
'information retrieval' and a text does not have a single, unchanging meaning.

Uses and gratifications

'Uses and gratifications research' (see McQuail and Windahl, 1993; Morley, 1992 or
Curran, 1990) was an approach in media and communication studies which derived
from functionalism. It builds from the idea that individuals are motivated to use
media in various ways to meet particular needs, and from the assumption that
individuals take an active role in the communication process and are essentially goal
directed in their media behaviour. In other words they have specific 'uses' for media,
gratified by consumption of certain content. It represented a fundamental shift from
considering what media does to people, to what people do with media (Halloran,
1970; McLeod et al., 1991). Moreover, it suggests 'active' interpretation rather than
'passive' reception on the part of audiences.

The crystallised purpose of a piece of written material, or the features and functions
of a given technology, "transgress all closure" as Johnson (1990: p.40) has it. They
are always, as are all mass-produced products, open to some degree of re- or mis-
interpretation regarding their 'use' or 'values'. What is interesting here is that
'interpretive flexibility' extends past the processes of development and the 'closure' of
the final manufactured design. This challenges the position of some social

constructivist commentators such as Bijker (1995). For him, product interpretation is
solely a matter of identifying those (particularly interest groups) who shape the
development of a technology towards 'closure' – i.e. the manufacture and production.
There is little discussion of consumer-users' receptivity to the products, nor of their
attempts to self-customise designs they find lacking. The truth is that regardless of the
processes and confluence of development and production, no matter how many
social groups influence its design (or, come to that, how many usability tests are
conducted), the product may never come to be relevant or even registered in the
consciousness of consumer-users at all:
"The making of a film is not something to be discovered purely in the text
itself, but is constituted in the interaction between the text and its users . . .
The early claim of semiology to be able to account for a text's functioning
through an immanent analysis was essentially misfounded in its failure to
perceive that any textual system could only have meaning in relation to codes
not purely textual, and that the recognition, distribution, and activation of
these would vary socially and historically." (Hill, 1979: p.122; quoted in
Morley, 1992: p.87)

Such a realisation was typical of the rising discontent in the 1970s with the uses and
gratifications model (Elliot, 1974). Uses and gratifications, while emphasising the
active reader of texts, presumed an overly-optimistic view of people being
empowered to successfully find texts that fulfilled their purposes. It denied the fact
that texts have 'preferred meanings' imputed or inscribed into them by their creators;
instead it suggested that "no single 'correct' meaning can be conveyed by language
and transmitted to all readers alike." (Stern, 1998: p.11)

Such thinking serves as the basis of Hall's (1980) encoding/decoding model of media
reception. Encoding/decoding theory emphasises the interactive qualities that take
place between a text and its users. It was based on the realisation that mass
communication is a structured activity, in which the institutions that produce media
messages do have a power to set agendas, and to define issues. Rather than allowing
a totally free range of interpretation possibilities on behalf of the media consumer (as
does the uses and gratifications approach), it sees these possibilities tempered by the
limitations in agenda setting and cultural categorisation set by the broadcasters.

The extent to which the reader is involved in constructing meaning depends partly on
the kind of text involved. Indeed, considering the degree to which things are 'read', it
seems that different types of 'text' are capable of, and even require, quite different
levels, styles and depths of 'reading'. Some are more 'open' than others. For instance,
one would usually expect more active interpretation by the reader to be involved with
a poem than with a telephone directory. David Olson has argued that in formal
scientific and philosophical writing 'the meaning is in the text' rather than in its
interpretation (Olson, 1977: p. 277), but (whilst some may indeed see this as a goal),
textual meanings can never be severed from interpretation. In his widely-acclaimed
book S/Z, Roland Barthes (1974) referred to two kinds of writing in terms of the
extent to which they involve the reader - the 'readerly' (low reader involvement,
lisible, 'consumer-focused' 'user-centred') and the 'writerly' (high reader involvement,
scriptible, designer-focused – a strong example being 'designer objects', wonderful
aesthetic design, but highly unusable). Richard Johnson (1986) is not as opposed to
the residues of formalist textual analysis methods in cultural thought as some other,
more postmodern, thinkers. He does, though, claim that form alone does not determine
function. He notes: "I understand formalism negatively, not as abstraction of
forms from texts, but as the abstraction of texts from the other moments" (p.62). The
approach which Johnson terms Advanced Semiology has led to the notion of narrative
creating a stance or subject position in the reader (narratives or images
always imply or construct a position or positions from which they are to be read or
viewed (p.66)). However, he argues that such subject positions are generated not only
by formal features (such as "readable" or "writeable" in Barthes's thought) but also by
the somewhat unforeseeable, idiosyncratic circumstances of their consumption. He
notes that "the text-as-produced is a different object from the text as read." (p.58)

Texts of the 'readerly' kind leave the reader 'with no more than the poor freedom
either to accept or reject the text' (cited in Hawkes, 1977: p. 114) - they treat the
writer as producer and the reader as submissive consumer and suggest their
'reflection' of 'the real world'. This is similar to what Roth (1987: p.47) suggests as
the "ways writers actively construct their audiences," creating them as they compose.
Texts of the writerly kind invite the active participation of the reader, and also, in

their attention to linguistic mediation, an involvement in the construction of reality.
Ironically, it is readerly texts which tend to be described as 'readable', whilst writerly
texts are often referred to as 'unreadable' because they require more effort. In
passing, it is worth noting that the extension of Barthes's notion to other media could
be productive, involving a consideration of the extent to which engagement with
such media might be regarded as 'userly' or 'designerly'.

I may read a book for the purposes of instruction; I may read a book for
entertainment and enjoyment. But do I read a 'technology' considered as a text in a
similar way? Such a view may comprise a description of not only the ostensible
casing, buttons and graphical characteristics, but also the meanings which
accompany, adorn and characterise it in the public consciousness. Television for
instance has its 'double articulation' as a conveyer of messages and as a technological
artefact (Williams, 1974; Silverstone, 1994). It may also represent the aspirations and
identity of a particular culture (such as the BBC as a hallmark of British society), be
a 'friendly' voice when one lacks company, be a childminder (the 'one-eyed
babysitter'), and so forth. It provides then a range of tangible, or explicit messages,
bound to a plethora of implicit, tacit meanings and uses.

Likewise other kinds of products and services often include visual cues and attributes,
such as those seen in advertising, which indicate something of the 'ideal' situation
and circumstance under which the product is used, or associations and attributes
which advertisers see as matching or positioning the product (i.e. Barthes, 1972). And again,
published specifications and brochures and other 'grey literature' provide a quite
different message. There are also the myths surrounding the product in a design
department, recommendations of retail staff and friends and so forth. In other words,
each product as a whole in the mind of the individual, is built from both intangible
and tangible elements, and the explicit and implicit means by which it comes to be
registered in consciousness.

Returning to readers and texts, the degree of a reader's involvement depends not only
on the type of text and on how readerly or writerly it may be, but on how the text is

used. Poetry is sometimes 'consulted' for biographical information and telephone
directories have occasionally been used as sources of 'found poetry'. At least with
experienced readers, how a text is used is almost entirely up to the reader. Certainly,
the reader's purposes are at least as important as the author's intentions. Whilst
Swift's Gulliver's Travels may have been primarily intended as a satire, this does not
stop children enjoying it purely as entertainment. MacKay (1995) notes this in
relation to technology:
"Most technologies never stabilise in the way so many sociology of
technology accounts suggest . . . people are not merely malleable
subjects who submit to the dictates of a technology; in their
consumption they are not the passive dupes suggested by crude
theorists of consumption, but active, creative and expressive subjects .
. . they may refine a technology in a way that defies its original,
designed and intended purpose."(p.44)

Very much like a media 'message', a technology cannot be entirely separated from
the intentionality of the designer. Their intention is crystallised within the product by
virtue of its features and functions, which close certain options while raising others.
With the advent of computer technology within television production processes,
there is a much wider palette of 'effects' that can add to or augment the experience of
televisual texts. Such innovations not only transformed the creative process involved
in the use of production techniques but also intensified what is presented to the
viewer, and to some extent have developed and, in due course, extended the
interpretative prowess of the viewer. In 1900 Edwin S. Porter was hired by the Edison
Company to make improvements to and redesign their motion-picture equipment,
and he was soon placed in charge of Edison's skylight studio on East 21st Street in
New York City. For the next few years he served as director-cameraman for much of
Edison's output, starting with simple one-shot films (Kansas Saloon Smashers
[1901]) and progressing rapidly to films with special effects (The Finish of Bridget
McKeen [1901]) and short multiscene narratives based on political cartoons and
contemporary events (Sampson-Schley Controversy [1901] and Execution of
Czolgosz, with Panorama of Auburn Prison [1901]). Porter also filmed the
extraordinary Pan-American Exposition by Night (1901), which used time-lapse
photography to produce a circular panorama of the exposition's electrical
illumination, and the 10-scene Jack and the Beanstalk (1902), a narrative that
simulates the sequencing of magic lantern slides to achieve a logical, if elliptical,
spatial continuity.

A revolution in filmmaking

It was probably Porter's experience as a projectionist at the Eden Musée that
ultimately led him in the early 1900s to the practice of continuity editing. The
process of selecting one-shot films and arranging them into a 15-minute program for
screen presentation was very much like that of constructing a single film out of a
series of separate shots. Porter, by his own admission, was also influenced by other
filmmakers - especially Georges Méliès, whose Le Voyage dans la lune (A Trip to
the Moon [1902]) he came to know well in the process of duplicating it for illegal
distribution by Edison in October 1902. Years later Porter claimed that the Méliès
film had given him the notion of "telling a story in continuity form," which resulted
in The Life of an American Fireman (six minutes, produced in late 1902 and released
in January 1903). This film, which was also influenced by James Williamson's Fire!
(1901), combined archival footage with staged scenes to create a nine-shot narrative
of a dramatic rescue from a burning building.371

Morley (1992) explicates the operational premises on which Hall's
encoding/decoding approach is based. Morley summarises these as:
- The same event can be encoded in more than one way.
- The message always contains more than one potential 'reading'.
- Understanding the message is also a problematic practice, however
transparent and 'natural' it may seem.

Hall argues that there is a basic distinction to be made between the social processes
371
http://www.britannica.com/biography/Edwin-S-Porter

that encode and decode media texts. Cultural forms may be said to be encoded
through a historical mix of institutional relations, professional norms and technical
equipment. The audience, on the other hand, decodes using similar social structural
relations, political and cultural dispositions as well as access to the relevant
technology. At the centre of the encoding/decoding model are three decoding
'potentials' - dominant (or preferred), negotiated and oppositional - which suggest
the logical possibilities of how the receiver shares, partly shares, or does not share
the code in which the message is sent, and is therefore, to that extent, likely to
make a dominant, negotiated or oppositional decoding.

In many senses Hall's theory may be re-interpreted by substituting the production,
transmission, reception, and interpretation of media programmes with the production,
diffusion, appropriation, and consumption/use of a technology (such as suggested by
MacKay, 1995). For instance Hall makes the point that there may be no direct fit
between the encoding and decoding ends of the communication chain. This is
directly analogous to the way in which technologies are produced which lack
usability, or indeed lack utility, particularly within the lifestyle and expertise of
certain groups of consumer-users over others. I have already presented examples of
this earlier with respect to architectural design, but an example from product
development arises in Ortt and Schoormans (1993) when they note the lack of fit
which is often experienced between the production of product attributes and their
interpretation by consumers:
"In the ideal situation, physical characteristics are correctly perceived by
consumers. Furthermore, it is assumed that all consumers can interpret the
products' attributes and consequences the same way. However in practice
consumers perceive products incorrectly. They often do not see attributes that
are present in a product, and sometime they even see attributes that are not
present at all." (p.378)

The encoding/decoding model offers a basic theoretical framework which may be
used to investigate instances where certain technologies and their purposes integrate
easily within the lives of their consumer-users (what Silverstone, 1995 refers to as
becoming 'domesticated'). One can say this is akin to the 'preferred reading' of the
technology and its CAFFs – characteristics, attributes, features and functions.

'Negotiated readings' may be where technologies or particular CAFFs cause
problems, or only partially integrate within consumer-users' lives. Perhaps the
technology needs constant attention or upgrading, or perhaps it is of value for only a
certain period of time (such as records, films, or games which attract a limited period
of intense use followed by a loss of interest and little use, or a cessation of use altogether).

Negotiated readings can also cast light on consumer-user adaptations of existing
technologies - the way in which consumer-users may 'subvert' or innovate on the
characteristics and functions of a produced technology (Westrum, 1991). Essentially
this is where the consumer-user does not use them in the product designer's
prescribed manner or 'designed' intention.372 'Oppositional readings' of a technology are
where a person remains unaware of it, or positively resists use of it or of particular
characteristics and functionalities. Examples here are parents who intentionally avoid
buying a games console due to beliefs regarding its influence on children's behaviour,
or those who prefer not to read a manual and therefore miss out on more esoteric,
nevertheless useful, functions which are difficult to find through 'learning by using'.
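By way of illustration only, the sketch below renders this three-way scheme as a simple classification of how a consumer-user 'reads' a technology and its CAFFs. The category names follow Hall's decoding positions as applied here; the record structure and the example entries are invented to echo the cases mentioned above, not drawn from any published coding scheme.

```python
from dataclasses import dataclass
from enum import Enum

class Reading(Enum):
    """Hall's three decoding positions, applied to a technology and its CAFFs."""
    DOMINANT = "integrates readily into everyday life ('domesticated')"
    NEGOTIATED = "partial or adapted use: workarounds, subversion, waning interest"
    OPPOSITIONAL = "non-use, resistance, or unawareness of features"

@dataclass
class TechnologyReading:
    technology: str
    caff: str      # the characteristic, attribute, feature or function concerned
    reading: Reading
    note: str

# Invented examples echoing those discussed in the text.
examples = [
    TechnologyReading("television", "broadcast viewing", Reading.DOMINANT,
                      "fully woven into household routine"),
    TechnologyReading("computer game", "play value", Reading.NEGOTIATED,
                      "a period of intense use followed by loss of interest"),
    TechnologyReading("games console", "gameplay content", Reading.OPPOSITIONAL,
                      "purchase avoided over beliefs about children's behaviour"),
]

for example in examples:
    print(f"{example.technology} / {example.caff}: {example.reading.name} - {example.note}")
```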

Context

What is quite clear from the discussion so far is that texts, whether considered as
'messages' or as 'features and functions', rely in large part on their contexts of use
or interpretation. The etymological basis for context derives from the Latin cum
(with) and texere (to weave), suggesting 'a weaving together'. Following the previous
chapter, Fritjof Capra (1996) suggests systems relate strongly to the notion of
context:
"… systems thinking is contextual thinking; and since explaining
systems in terms of their context means explaining them in terms of

372
“ . . . an innovation is socially constructed along multiple dimensions which can be emphasised or
de-emphasized by different users in different situations. In other words, different users will perceive
and construct technologies that are nominally the same in different ways. There are few predetermined
parameters to the technology but, rather, the design of the technology depends crucially on the social
and psychological construction of the adopting organization. Thus, innovation is not a fixed entity but
a malleable social construction. The degree to which innovation is successful in a given context is
then seen as being determined by both the technology itself (its unique configuration) and the
operational and organizational context into which it is being inserted. Given this, decisions made
before implementation, about the design of the technology and organization, will be crucial to the
outcome of implementation.” (Newell, Swan, and Clark, 1993: p.34)

their environment, we can also say that all systems thinking is
environmental thinking." (Capra, 1996: p.37)

Typically, we think of contexts as frameworks or circumstances that enable us to
infer or determine intention and meaning.
"in semiotic terms, a text represents a coherent cluster of signifiers. A
text signifies something when it becomes situated in a context for
interpretations." (Lindlof, 1995: p.53)

No information is realised without context. Gregory Bateson (1980: p.24) is very
direct when he defines the importance of context to meaning, behaviour and
evolution:
" . . . 'context' is linked to another undefined notion called 'meaning'.
Without context, words and actions have no meaning at all. This is
true not only of human communications in words but also of all
communication whatsoever, of all mental process, of all mind,
including that which tells the sea anemone how to grow and the
amoeba what he should do next . . . I am asserting that whatever the
word context means, it is an appropriate word, the necessary word, in
the description of all these distantly related processes."

Returning to a biological analogy for a moment: any given ecosystem, consisting of
a number of species and a certain physical environment, will exhibit a great deal of
mutual adaptation as the various species have co-evolved while constantly having
mutual adaptation as the various species have co-evolved while constantly having
effects on their surroundings. Over time, the "design" of each species becomes
increasingly interlocked with the rest of the ecosystem, so that its structure becomes
well-adapted to particular forms of interaction while contributing certain ongoing
influences in turn. In this sense, the structures of the various species and their
environments become "coupled" - implying one another through their mutual
adaptation and their roles in creating the conditions of continued existence for one
another. As a result, it makes little sense to study an organism in isolation from its
environment. Simple descriptive anatomy might provide a useful source of reference
material, but it will not provide concepts to explain how the organism functions in its
natural surroundings or why it is structured as it is (Buller, 1999).

When one provides information, such as a written text or a set of functions or
features, context provides the basis by which that information converts into

knowledge or know-how (expertise). Text, functions and features crystallise through
design and production. Contexts guide use and usage styles and behaviours (as
suggested in chapter 1, on the structuralisation of interactive experiences). But
contexts are mutable, dynamic forces rather than static, fixed terms or conditions
within which a person designs, a firm produces, or a text comes to represent something
to someone (as are some media texts such as televisual material). As active acts of
interpretation, they influence and shape; and are active whether they are historical or
contemporary. But as previously discussed texts, features and functions also shape -
and even transform - and do not simply represent. The philosophy of Wittgenstein
(1953) for instance, suggests how language is totally dependent on use. John Seely
Brown and Paul Duguid, in their book, The Social Life of Information (2000), put
forward that writers and designers, "always face the challenge of what to leave to
context, what to information." (p.200)

On a social level, an instance when the social, the technical and time are woven
together is when contracts, and other mechanisms designed to crystallise social
relations, are drawn. Regulatory structures and standards for instance, can act as an
important cornerstone to functional relations between parties. However, through
changing contexts, such as new innovation coming to the market and new
competition, they can also become a bind, a hindrance, or act as a factor stifling
innovation. Also, if the contexts written into, and described in the contract are not
comprehensive and detailed enough, loopholes may appear benefiting one party over
another.

Fish (1980) argues for the fundamental importance of readers' interpretations of texts.
But for him, a text is not a text without a reader and a context. He stressed meaning-
making as a process, not as the 'extraction' of 'content', but he limited the possible
range of readers' meanings by stressing the importance of 'interpretative
communities'. This idea suggests that realities, readings, and meanings take place,
and develop value, within social circumstances: "although reality is . . .socially
constructed, it can then be objectified in constraining social structure." (Arnold and
Fischer, 1994: p.59)

Different cultures, or 'interpretative communities' possess variations in what they
consider amusing, as well as under which contexts and conditions things 'become' or
are 'seen to be' amusing. "Through dialogue, the community collectively creates new
understanding." (Ibid: p.57) McQuail (1994) sees that the media acts as a kind of
mortar in the way we come to share understandings with others; "It is the media
which are likely to forge the elements which are held in common with others, since
we now tend to share much of the same media sources and 'media culture'." (p.64)

Beyond the realm of cognitive forms of context, Morley (1992) in his analysis of the
audience takes the view that the context of viewing, i.e. the circumstances,
conditions, situations and place of viewing is in every way as important as what is
eventually viewed. To stress the point regarding the geographical dimensions of
contextual influence he suggests the phenomenology of 'going to the pictures' comprises a diverse gestalt of elements, of which actually viewing and interpreting the film is only one. The queue, décor, architectural structures, smell,
music, social contingencies, food, behaviours, protocols, light, comfort of the seat,
all contribute in forming the experience of 'going to the pictures'.

This is important as it draws attention to the fact that all practices and activities have
contexts that both enable and constrain them. In the previous chapter, I referred to
the meshing of complex social and technical elements in the creation of a 'whole'
network, constituency, configuration or system of co-shaping social and technical
elements. Such systems include sub-systems of innovation and learning, systems of
mapping, understanding and feedback, systems of action and iteration. From the
analysis of actor-networks or sociotechnical constituencies the final design product
which finally emerges to populate and engage the market can be clearly understood
as a product of many diffuse forces and tensions. But beyond the confidence of these
frames of analysis, the complexity of such processes blurs control and frustrates any attempt at simple reasoning on the part of managers and practitioners as to why some products are successful whilst others are not (Rycroft and Kash, 1999).

Nevertheless, one reasonably consistent determinant of the success of high-tech
products is cited as their usability - the design of 'ease', 'effectiveness', 'efficiency'
and 'satisfaction' of use. This is perhaps best understood using a Darwinian metaphor
– the notion of 'fit'. 'Fit' as applied to the instance of products or services, is when
their intended function and purpose matches that of their apprehended function and
purpose, to the delight of those who will be willing to pay for it. Usability as a
quality plays a significant role in this process of fit, as frustrations on the part of
those who use arise from disparities between anticipated and actualised function,
or from poor representation of function in visible elements of its design, or even
through marketing, advertising and promotional activity.

To design usability effectively represents a real challenge, particularly if designers have no concept or information regarding the contexts of use - how the product is
encountered, apprehended, interpreted and used by users.

And again, as suggested earlier, a focus on purely functional dimensions of a product is no guarantee of 'fit'. Intangible and tacit elements can contribute to its
apprehension in the minds of those that use and consume. Usability as either aim or
experience can be said to only really exist as both a product of contexts, and as a
product which can only be apprehended within contexts - technological,
informational, historical, social, and individual. Most importantly it is both created
and apprehended also through practice – i.e. use and usage - expertise. It also relies
on the substance of the communication act itself, whether this means the features,
functions, texts or symbols denoting the designer's and producer's intended
purpose/s.

From a usability perspective it is the users' perception of product qualities which is important, and it is these which need to be captured and translated to designers and
managers in meaningful ways, in order to produce relevant feedback into innovation
effort. Beyond simply imaginative or speculative views of potential needs and
requirements regarding a product, lie perceptions of use, consumption and users
informed through various kinds of research practice.

Research practice is the means by which users and consumers are themselves
'designed' by firms, either as an "ideal" or "necessary" to "complete both the function
and vision embodied in the artefact." (Silverstone and Haddon, 1996: p.45) This can
often act to problematise the notion of user value and behaviours for guiding early
ideation and design; it can also be used politically within the firm to win senior
managerial support for projects, or even in validating the prospect of new
innovations to the wider public domain (Nicoll, 1999a).

Usability

My original role in the study of the Cambridge trial was to evaluate the usability of
the Cambridge system. Usability is an art and science that goes by a number of
different names: usability engineering, user-centred design, human factors engineering, human factors, engineering psychology, and ergonomics (Wiklund, 1994). As an experiential aspect of a product, usability has been defined as:
 The ease with which a user can learn to operate, prepare inputs for, and interpret
outputs of a system or component.373
 The effectiveness, efficiency and satisfaction with which a specified set of users
can achieve a specified set of tasks in particular environments.374
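Both definitions lend themselves to simple operational measures. A hedged illustration: effectiveness can be read as the proportion of specified tasks achieved, efficiency as time or effort relative to some reference, and satisfaction as an averaged rating. The sketch below (in Python) is my own minimal reading of this idea; the task records, field names and expert-time baseline are hypothetical assumptions rather than anything prescribed by the IEEE or ISO documents cited.

from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskTrial:
    """One user's attempt at one specified task (hypothetical record)."""
    completed: bool      # did the user achieve the task goal?
    time_taken: float    # seconds spent on the task
    satisfaction: float  # post-task rating, e.g. 1 (frustrated) to 7 (delighted)

def usability_measures(trials, expert_time):
    """Illustrative effectiveness / efficiency / satisfaction measures."""
    effectiveness = mean(1.0 if t.completed else 0.0 for t in trials)
    completed = [t for t in trials if t.completed]
    # One plausible convention: efficiency as expert time over user time,
    # counted only for completed trials.
    efficiency = mean(expert_time / t.time_taken for t in completed) if completed else 0.0
    satisfaction = mean(t.satisfaction for t in trials)
    return {"effectiveness": effectiveness,
            "efficiency": efficiency,
            "satisfaction": satisfaction}

# Three invented trials of the same task, against a 60-second expert baseline.
trials = [TaskTrial(True, 95.0, 5.5), TaskTrial(True, 160.0, 4.0), TaskTrial(False, 300.0, 2.0)]
print(usability_measures(trials, expert_time=60.0))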

Usability has also emerged as a distinctive set of techniques originating from what is
more generally termed the field of human-computer interaction (HCI) - an "area
of intersection between applied psychology and the social sciences . . . and computer
science and technologies" (Carroll, 1997); and ergonomics – which is described as a:
"study of work, and its environment and conditions in order to achieve maximum
efficiency."375

These have largely focused upon human factors - human cognitive and physical
capacities and limitations relative to the design and specifications of task, technology
and machinery. Part of the product of this research has been translated into

373
Institute of Electrical and Electronics Engineers (1990) IEEE Standard Computer Dictionary: A
Compilation of IEEE Standard Computer Glossaries
374
ISO (1991). International Standard ISO/IEC 9126. Information technology - Software product
evaluation - Quality characteristics and guidelines for their use, International Organization for
Standardization, International Electrotechnical Commission, Geneva.
375
Oxford Senior Dictionary

guidelines and prescriptive measures regarding the size, positioning and other
characteristics of hardware and display items. The goal of HCI, or Human Factors
research is "to ensure that the systems produced by designers for people to use are
comprehensible, consistent and usable." (Maddix, 1990: p.9) The means by which
usability and other HCI studies achieve this is a focus on users. Carroll sees a pivotal
point in the early development of HCI as emerging in the work of Dreyfus (1979)
who shifted the focus in design practice "beyond the designer's need for prototyping
and iteration as a means to clarifying the design problem . . . to the user's knowledge,
experience and involvement to constrain design solutions." (Carroll, 1997: p.504)

Here lies the genesis of what is often referred to as the user-centred style of design. However, it would be erroneous to suggest that this is the prevailing view, as much of the writing on computing from the mid-70s has been criticised as: "stunningly dismissive of usability and rather patronizing of users." (Carroll, op cit., p.506) This
was not confined to computer system design as in the field of architecture a similar
notion was held that: "people are infinitely adaptable, that they will respond in the
way that they will give up normal tendencies to personalize the spaces in which they
work." (Lang, et al. 1974) More general history dictates that there is an in-built
tendency for people to subvert overly rational, unfair, or ‘outdated’ structures, rules
and regulations that are placed upon them, either physically or mentally.
Customising, making something one's own, possessing something, altering it in some
way so that it functions better is not just a common practice, but an essential practice
in helping products evolve and achieve success.376

376
Skinning refers to a current practice for firms and individuals to develop novel interfaces for certain
applications on a PC. One can change the look and feel of a browser by adding another 'skin'.
Advertising is getting involved with this practice so one could technically have a skin which
advertises a product:
"The urge that leads people to populate their bookshelves with action figures and their
refrigerators with magnets manifests on the desktop as a drive to customize . . . Functionally,
PCs are typewriters, banks, meeting rooms. The homogeneity that turned them into consumer
products has outlived its usefulness."
It has opened some debate, interestingly enough with respect to usability. Jeff Raskin, an Apple
interface designer, suggests that it was the standardised user interface, that "liberated computing and
delivered it to the people." "If users are embracing customization", he says, "it's only because they're
hungry for something better." He feels that skinners, "revel in hyper-personalization for its own sake."
Damian Hodgkiss, a 'skinner' counters: "Who's to say there's only one definition of usable? Why
shouldn't there be many?" Skinners recognise that complexity leads to discovery, and consequently to a more involving experience. (source: http://www.wired.com/wired/archive/8.10/skins.html)

Eric von Hippel (1978) came to recognise this occurrence within industrial settings as a real opportunity for firms to learn from users how to improve their
products. But to capture such user innovation requires a particular organisational
capacity and a certain receptivity and openness that not all firms can afford. The
depth of user innovativeness is also something of note here. There is an even more
core process of discovery and shaping that users perform in realising the use,
usefulness and defining characteristics of technologies and their relation to service.

The telephone is a good example here. Bell's original patent idea was for a telegraphy
device, but the first real application was developed by Charles Williams Jr.
connecting his home to his factory. He was the first to use the telephone for
monitoring orders and governance at a distance. April 1877 also saw the first
instance of the telephone being used for news reporting - one of Bell's lectures was
transmitted from Salem to the Boston Globe. Martin (1991) presents a vivid account
of early exploration of telephone by firms and network operators. 377 She writes that
the central office telephone exchange in Boston connected a local drug store with
twenty-one local doctors, and a proprietor of burglar alarm systems installed
telephones so his customers could summon messengers and express services.

Use, utility and usability are intrinsically linked in the evolution of a new technology
such as a communication system. The interpretive flexibility of the telephone, its
openness to novel uses, eventually settled upon a commonly recognised utility. This
was understood first by business, and then by domestic users for a wide range of
communication acts, which were enabled by the transfer of voice (and later data)
over a communication network.

377
Most importantly she notes the distinction between ‘rational’ business uses, and ‘rational’ domestic
use. The latter was perceived as the connection of the businessman with his home, and of his wife to
her suppliers. However, she cites numerous other examples of services which came to be presented
over the telephone such as entertainment, information (with operators giving out football scores and
addresses) and advertising (storekeepers phoning people unsolicited). Perhaps most fascinating of all
is the way in which the phone could “bring god to the sick and elderly” through preachers giving
sermons on the phone (p.136). This gave rise to a moral question at the time: "was a religious service
over telephone as sacred as one at church?” The spectre of ‘virtualization’ was clearly raised even
then. Are these the precursors of on-line services?

Failure to capture users

Good usability as a characteristic of a product should mean that there is a reduction in the learning curve required for the novice or experienced user to perform some
task or operation. It should allow more functionality with less effort. Most
importantly it should contribute to the production process by helping designers avoid
the need for expensive late fixes. Involving novice users early means they will likely highlight the largest percentage of design oversights at the prototype stage, so that problems can be rapidly realised and addressed before 'closing' the final design for manufacturing and production. Involving more expert users at the same time should produce refinements that can ensure relevance or 'fit' for this group, which will probably manifest completely different needs and requirements. Realising problems after this time, such as when the product diffuses into the marketplace and/or is reviewed by the media, may be very expensive, indeed impossible, to fix or
amend. As cited earlier Preece (1993) suggests that many system designers regarded
users as adapting to use of a system, 'like a cog in a machine'. Her suggestion here is
of a strong determinism or presumption on the part of designers, and champions of the user-centred perspective such as Donald Norman (i.e. 1988) have pressed the point
that well-designed objects are easy to interpret, understand, and use because they
contain manifest cues for use (i.e. visual contextual data, highly representational of
function).

Conversely, poorly-designed objects are difficult to interpret, understand, or use because they provide false cues, or at best misleading cues that entrap the user and
thwart the normal process of interpreting, understanding and utilising functions
(which may include poor contextual data, such as instructions, packaging etc. not
directly representational of function). Norman refers to the design and usability with
respect to visual interfaces in computing, and anyone who has used computers
usually has some story to tell of dysfunction accompanied by misleading or cryptic
messages or symbols (“application closed – error #244333, if this occurs again please
refer to your system supervisor”). While these may mean something to designers or
experts, this meaning will be lost on novice users. There are many cases of design in
products beyond computing which are overly-presumptive of the knowledge and

interpretative capacities of users. Steve Woolgar (1991) has even suggested
particular designs (such as casing) which exist as acts of active exclusion. In other
words they exist to keep the user out, to prevent them from modifying, customising
or adapting it, under penalty of relinquishing one's warranty.

To be truly user-centred, usability has expanded the remit of HCI and ergonomics to encompass the cognitive and emotional aspects of using products. More recently, usability research has widened the scope of HCI and ergonomics research further, to include emotional, perceptual and social aspects of use - aspects which have conventionally come under the wider auspices of the social sciences as foci of study (March, 1994; Logan, 1994). Identifying the emotional dimensions of products helps develop the product as a societal innovation (Cove and Svanfeldt, 1993), rather than an innovation pertaining only to certain market segments. It helps create a product that represents the emotional link between this trend and the culture of the company (Tharp and Scott, 1990). The prospect of design practice completely removed from
any active awareness on behalf of the user is accented in the emergence of interactive
and intelligent products. Here there is much more room for assumption of use and
utility in design than in the design of more 'orthodox' products - even interactive
interfaces. Such products may have few, if any, ostensible features or functional
controls, depending absolutely on their melding into the contexts and individual circumstances of use, in a way which is totally congruent with the behaviours and goals of their users (for a taxonomy of such products see Nicoll, 1998).

Put simply, they may work best only when they are unnoticed, sublimated into the
environments and contexts of use. Many design features of such artefacts and
technologies can only be fully appreciated, critiqued or evaluated when in situ - i.e.
integrated as part of a system (where they may be a component), or operating as
something which is used in a non-conscious, taken for granted way (such as a door,
tools, or medium).

"Japanese manufacturers are using the term "humanware" to describe design


which injects lifestyle into products and bases differentiation more upon
image and user requirements rather than function . . ."(p.10)

Humanware is regarded by consumer electronic firm Sharp as the consideration and
design of products in terms of the total environment in which they will be used
(Thackara, 1988). The suggestion here of a design approach, intentionally infusing
products with symbolic as well as functional attributes, demands new sensitivities on
the part of designers and producers towards potential and actual use contingencies
and outcomes. Some recent examples of such methods and practice are techniques
such as sensory engineering - a holistic approach to design that accounts for
subliminal as well as emotional aspects of the product. For instance The Mazda
Corporation has recently employed this approach in car design, where the design of
cars is focused on the six human senses.378 Their object is to " . . . develop cars which
will appeal to the sixth sense, intuition, which is the most important for selling cars."
(Tatsuno, 1993) The prevalent attitude here is that the whole is greater than the sum of the parts – in that design which caters for the five senses will invoke responses
on the intuitive level. Pine and Gilmore (1999) have described the experience
economy which refers to the differentiation of products through the design of
memorable experiences. Cooper and Press (1995: p.152) view that understanding the
consumer at such deep levels is a question of 'getting into their head':
". . . in order that they develop a conscious and subconscious understanding
of consumer needs, and translate that understanding into design features. . .
economic, cultural and occupational factors, peer group pressure, lifestyle
and psychological factors are all relevant. These need to be collected and
communicated in sensory terms to the designer."

In a very real sense smart products relate to network-based devices such as the Internet and interactive television, in that usability of the networked devices or services could be expanded to incorporate what are termed 'back-office' activities.
Substituting 'intelligent functioning' for the various processing and shifting of
physical goods which make a transaction and its subsequent fulfilment possible, it is
possible to begin to consider the social and technical dimensions of such systems as a
whole.

378
Sensitivity towards a human-centred design in cars has been recently highlighted through various
manufacturers’ advertising campaigns drawing attention to the science or creative aspects of design.
One even emphasises this directly through promoting the statement “first man, then machine.”

This is a real issue with respect to current views of Internet e-commerce sales, where
a site (or 'dot.com' firm) is valued over its entire set of representational, usability and
fulfilment features. For instance this may include how aesthetically pleasing
the site is; how 'sticky' it is (i.e. how long people are drawn by its content and
function to remain with it); how easy it is to navigate between its pages; how easy it
is to locate certain items; how easy it is to order them; and how long any order takes
to arrive at a person's home. In a sense beyond aesthetics or any intrinsic value,
effectiveness of a site depends largely on dimensions of time. 'Short-cut' features
such as Amazon.com's patented 'one-click' purchase mechanism reduce the series of heuristics that a user must step through to complete their transactions. This brings the focus back to the object of the visit – i.e. the goods – rather than the mechanisms
and operational idiosyncrasies of the site's technical and navigational design.
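The time-based reading of a site's effectiveness suggested here can be made concrete with a few crude measures over session logs. The following sketch is purely illustrative: the log fields, the treatment of 'stickiness' as mean seconds on site, and the example figures are my own assumptions, not measures drawn from any cited study or service.

from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class Session:
    """One visit to an e-commerce site (hypothetical log record)."""
    seconds_on_site: float
    pages_viewed: int
    ordered: bool
    clicks_to_order: Optional[int] = None      # None if nothing was ordered
    days_to_delivery: Optional[float] = None   # None if nothing was ordered

def site_measures(sessions):
    """Illustrative time- and effort-based measures of a site's effectiveness."""
    orders = [s for s in sessions if s.ordered]
    return {
        "stickiness_seconds": mean(s.seconds_on_site for s in sessions),
        "conversion_rate": len(orders) / len(sessions),
        "mean_clicks_to_order": mean(s.clicks_to_order for s in orders) if orders else None,
        "mean_days_to_delivery": mean(s.days_to_delivery for s in orders) if orders else None,
    }

sessions = [Session(420.0, 12, True, 9, 3.0), Session(35.0, 2, False)]
print(site_measures(sessions))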

Design of experience, innovation in use and consumption

Some commentators (i.e. Firat et al., 1995) now view a reversal in the roles of
production and consumption as a distinguishing feature of recent society and its
economy. From this view production loses its privileged status in culture to
consumption, which has become the means by which individuals define their self-
images for themselves as well as others.379

Kanter (1992: p.9) for instance states that: "the information age discredits another
Industrial Age principle articulated by Karl Marx. Power stems not from control of
the means of production but from influence over the means of consumption." Firms,
their products, and intelligence are penetrating what was previously tacit, hidden and
private. And they are doing this ever more pervasively in homes, and also in the lives
and lifestyles of consumers. Due to this 'turn' it is becoming necessary for firms to
'get closer' to those they provide goods and services to, in order to capitalise upon
their idiosyncrasies as consumers (i.e. through personalisation or customisation), and

379
Jean Baudrillard suggests that consumption is increasingly becoming a productive process, goal-orientated and purposeful; it requires that individuals be educated to carry out this process, and it requires special skills (Baudrillard, 1988).

also to understand more fully what constitutes the notion of fulfilling or satisfying
experiences for them. This entails both the experience of accessing and acquiring
goods and services (consuming), and learning of the experience of using the goods
and services themselves. In their attempt to establish a new foundation for design
Winograd and Flores (1986) state that:
"In order to understand the phenomena related to a new technology we must
ask about its design – the interaction between understanding and creating . . .
We address the broader question of how society engenders inventions whose
existence in turn alters that society. We [seek] to establish a theoretical basis for
looking at what devices do, not just how they operate." (p.4)

Populating domestic and mobile personal space with interactive communication devices such as i-Tv and mobile communications essentially increases the information intensity arising from the relation between producers and consumers. For
designers and producers such technologies provide, using the correct applied
understanding, an entirely unique insight into what people are doing when they take
part in activities. Using these technologies they may now attempt to understand and
meet the 'on-demand' needs of a self-directed individualist consumer and capitalise
upon their drives for self-[re-]creation. Each technology – i-Tv, mobile phone, smart
card, credit card – can essentially act as a 'lens' providing a glimpse into behaviour,
and sometimes attitude, one which is nevertheless coloured by the interaction
potentials of that particular technology. But as explored at length earlier, the
feedback between [pre-existing] activities done through new technological means,
reflexively changes the nature of these activities. Convergence may now be taken to
mean computing, communications, consumer research and fulfilment collapsing as
techniques, and combining in technologies that become ever more closely melded
with human experience and perception. This opens the new imperative for
technology studies and research to study this reflexivity, between shifting structures
and boundaries and shifting activities.380
380
This was similar to the problems experienced in researching the Cambridge Trial itself. The
technology developed concurrently with the development of content and services, and with the
recruitment of trialists. Conditional access was relative to organisational developments. The role of
the trialists (whether they were designers or general public) also mattered regarding the lines of
questioning and the kinds of answer which could be expected if one was inquiring regarding
experiential matters to do with use. The usability of the system was also dependent upon the state of
the technological system and the style of content being produced to run on it. For example, there was
revision to the remote control design in the light of ergonomic concerns with certain content and
service programmes. While it served as a good controller for video-style material it fell short on usability for online gaming.

It is clear that it is simply not enough to consider 'interaction' as symptomatic only of
the recent explosion in 'interactive media'. An example of this is offered by Pelly
(1996: p.23) who sees that a car "is a design whose function blends everything from
technology to sensuous touch, a product that thrives on sound and light, as well as on
what is seen and felt, an experience that bridges the past and the future."

Many products and services provide or require an experiential framework - participating, viewing, reading - that fosters a change process within each person
(Pine and Gilmore, 1999). The product or service merely provides the arena for the
inner experience, resulting in a change in the person's inner state (more motivated,
clearer about goals, happier, etc.) In experiential design, it is this experience that is
offered to the market - and it is the experience, not the product or service, that really
meets the needs of the consumer-user. It must rather be considered from its basis in
cultivating human consciousness, culture and society, both from an evolutionary as
well as developmental perspective. The role of interaction must also be understood in
the design and development of what is best described as 'static' products, symbols
and artefacts, and then in how it features in the design and development of dynamic products and artefacts.381

For the near future, the biggest design challenge lies in developing systems that can
account for idiosyncrasies of use, idiosyncrasies of taste, idiosyncrasies of needs
(Araya, 1995; Kelly, 1995; Kurzweil, 1999).382 In many respects it would appear that
it is the essential and defining human qualities, and particularly those needs which lie
381
Static products are probably best represented by tools, which rely on a relatively stable
functionality. Graphic art and packaging, and ornaments best represent static symbols and artefacts.
One could class ‘smart’ or ‘intelligent’ products as dynamic artefacts and television programmes and
Internet web pages as dynamic products and symbols. As Fiske (1991: p.58) has it:
“In one hour's television viewing, one of us is likely to experience more images than
a member of a non-industrial society would in a lifetime. The quantitative difference
is so great as to become categorical: we do not just experience more images, but we
live with a completely different relationship between the image and other orders of
experience. In fact, we live in a postmodern period when there is no difference
between the image and other orders of experience.”
382
"The challenge for the next decade is not just to give people bigger screens, better sound quality
and easier to use graphical input devices. It is to make computers that know you, learn about your
needs and understand verbal and non-verbal languages.” Editorial - Inside Multimedia IM, No 95,
June 12, 1995

at the foot of the hierarchy - the ancient and basic human needs - that will deny the
logic and notion of 'perfect provision' by design, intelligent systems, networks and
electronic means. Human unpredictability is the chaos that upsets any notion of
overly rational approaches to planning and design. It is always the loose cannon for
those who wish to anticipate, force, manage or order the behaviour and minds of
others.

Drawing the components of the use process together

The object of the discussion so far has been to show that by considering
technologies, and in particular their features and functions crystallised as texts,
contexts may come to be highlighted that otherwise would remain hidden or tacit.
The production and diffusion of features and functions, considered as acts of
communication between designers and users, can highlight disparities in their
perceptions of products. This can provide clues as to the reason why some
technologies come to be unsuccessful, whilst still manifesting strong competitive
qualities such as competitive pricing or good usability.

Conversely, in studies of the user-consumer, investigation of their interpretations of technologies, their features and functions, highlights how varied are the kinds of
attributes associated with technologies and the ways by which the technologies come
to be characterised. Such a view is useful indeed, as it is suggestive of the kind of
epistemological symmetry applied in studies in the Sociology of Scientific
Knowledge (Bloor, 1976). It denies privileging any form of determinist notions
(technological, economical, or social) or any 'truth claims' regarding what a
technology 'is', 'does' or 'can be'. Moreover, it denies simplistic cause-and-effect accounts of a technology or service - there clearly exist a number of other properties and contingencies which significantly contribute to how and what a product 'is'. These
properties are the culmination of actual encounters with the design product, its
features and functions, in some manifest state, or as a result of some other kind of
discourse such as advertising, discussion, imagination, predication and so forth. Contextual usability (CU)383 proposes that the use process may be decomposed into four main dimensions that include:
• The individual circumstances and situations of use – the phenomenological, socio-cultural, economic, individual, sensory and psychological contexts which lead, motivate and otherwise create the particular and individual conditions of use (i.e. the retail shop floor, at home, during a birthday party etc.) (i.e. Belk, 1975).384
• Usability – the ease of use of a device, service or product for novice and advanced users alike.385
• Usage – the contexts (social, cultural, technical, other interests, pursuits and activities etc.) which pattern, constrain and sustain periodic use or consumption.
• Usefulness – the value gained from integration of the technology within an individual's lifestyle and activities. The quality of usefulness is again contingent on multiple contexts, including the user's relations with existing and previously possessed products (i.e. Dickerson and Gentry, 1983), the private and public meaning of the product to the user, novelty, conspicuous benefits from using, etc.

I define these as essential distinguishing elements underlying either motivations to design, use or consume technology (see below).386

383
I have termed this approach 'contextual usability' although it could equally be called 'contextual use', 'contextual usefulness' or 'contextual usage'. 'Contextual usability' does emphasise however that the other elements of the use process come to bear in the creation or impression of usability – the ease or 'fit' of use. In a sense their anticipated or actualised realisation are imperative precursors for usability to be a desirable and necessary quality of a technology (i.e. marketing claims of good usability are rarely used to sell a product, whereas a promise of usefulness might). Also, usability is the single element in the use process that can be contrived by designers and producers. The others, while being suggested by marketing and advertising, are more 'open' texts, capable of engendering a range of impressions and beliefs regarding the product's ability to add value to current and future activities. Usability has to be realised and verified by consumer-users in the lived processes of apprehension, consumption and use.
384
Belk (1975: p.157) considers that "situations and behavioural settings are subunits within an environment . . . Situations represent momentary encounters with those elements of the total environment which are available at a particular time." Behavioural settings – derived from Barker (1968) – are not only bounded in time and space but also comprise a complete sequence of behaviour or an "action pattern." Examples Belk offers are a basketball game or a piano lesson. These are behavioural settings since each involves an interval in time and space in which certain behaviours can be expected regardless of the particular persons present. The study of use also focuses more on the experiential qualities of use – from the perception of the consumer-user – in a particular situation, environment and circumstance. This may include how they view these situations, environments and experiences. The key difference here between use and usage lies in the difference between "momentary" use situations and environments, and the periodic patterning of use in time and space (usage). Whereas the quality of use may include elements which are obvious to the consumer-user, usage focuses much more upon behavioural variables, often not registered by the conscious awareness of the consumer-user. Also, particular situations, environments and circumstances may promote or abate periodic use and consumption.
385
Bevan, Kirakowski and Maissel (1991) offer three views which relate to how usability should be measured:
• the product-oriented view, that usability can be measured in terms of the ergonomic attributes of the product;
• the user-oriented view, that usability can be measured in terms of the mental effort and attitude of the user;
• the user performance view, that usability can be measured by examining how the user interacts with the product, with particular emphasis on either ease-of-use (how easy the product is to use) or acceptability (whether the product will be used in the real world).
These views are complemented by the contextually-oriented view, that the usability of a product is a function of the particular user or class of users being studied, the task they perform, and the environment in which they work. (Bevan, N., Kirakowski, J. and Maissel, J. (1991) in Bullinger, H.-J. (ed.), Proceedings of the 4th International Conference on Human Computer Interaction, Stuttgart, September 1991. Elsevier.)

[Figure: the four elements of the use process arranged around the label 'USE PROCESS' – Use (individual situations and circumstances of use), Usability (how easy and satisfying the product is to access and use), Usage (the patterning of use) and Usefulness (how valuable the product is with respect to existing and evolving activities) – with 'Resistance' marked between Use and Usability.]
Fig. 3.1 The elements of the use process (arrows suggest interdependency of use elements, arrows circumnavigating suggest the procedure of use, from first use to establishment of use. Usability of the technology resists use)387
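To make the decomposition concrete for fieldwork, the four elements can be treated as a simple record against which observations (interview extracts, trial logs, questionnaire items) are organised. The sketch below is illustrative only; the class and field names are my own shorthand for the dimensions of Fig. 3.1, not an instrument used in the Cambridge trial.

from dataclasses import dataclass, field
from typing import List

@dataclass
class UseProcessRecord:
    """One consumer-user's relation to one technology, organised by the four elements."""
    use_situations: List[str] = field(default_factory=list)   # circumstances leading to use
    usability_notes: List[str] = field(default_factory=list)  # ease and satisfaction of access and operation
    usage_patterns: List[str] = field(default_factory=list)   # periodic patterning, household rules, costs
    usefulness_notes: List[str] = field(default_factory=list) # value within lifestyle and ongoing activities

record = UseProcessRecord()
record.use_situations.append("first encounter on the retail shop floor")
record.usability_notes.append("novice could not find the programme listing")
record.usage_patterns.append("used by the children after school, rationed by parents")
record.usefulness_notes.append("replaced trips to the video shop")
print(record)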

The model shown above suggests an argument that the qualities of a technology or
product, such as its usability, co-exist with, or are co-dependent upon, the other aspects of the use process. For instance there is an obvious, common-sense link
between the usefulness of a product and the development of usage or consumption
patterns.
386
It is also important to note that other use terms such as ‘useless’, ‘abuse’, ‘used’ and so on also
have unique bearing and meaning at the level of analysis as represented here. However, I do not
consider them here.
387
Usability of a technology resists use within a number of different use situations. The novice user,
for instance, using a new machine on the shop floor, may encounter problems accessing and
operating. This obviously impacts upon their developing notions of the machine’s usefulness, as well
as their anticipations of how the machine can integrate into their lives. The ‘expert’ user, who has
successfully incorporated the machine into everyday life, may similarly derive dissatisfaction if the
machine does not permit sufficient customisation which allows for their increased expertise in using -
i.e. ‘hot’ buttons which allow for faster accessing of functions or content. Each of these represents a
problem of the machine’s usability.

There are also strong links between usefulness and usability, when one considers
early spreadsheet packages. These had notoriously poor usability and were difficult
to operate. However their utility and perceived value encouraged early users to
overcome the problems of usability. Usability - the ease of use of products - cannot
be divided from desire to use, reasons to use, and the socio-economic contexts and
conditions of use (see table 3.2 below). This is particularly true in the use of media
devices that are intended for recreational and informational purposes, consumed
within the home and used within people's leisure time.

Table 3.2 The use process

Use – Properties: individuated use situations. How captured: from the perception of the consumer-user; observable behaviours; biographies. Illustrates: situations, contexts and environments leading to use; demographics and psychographics; goals, desires, needs and requirements.

Usability – Properties: ease of use. How captured: from the perception of the user-consumer. Illustrates: attitudes towards technology; expertise in using technologies generally; problem-solving style.

Usage – Properties: patterned or periodic use. How captured: registration of the system. Illustrates: social factors (i.e. regimes and rules regarding consumption and use in a particular household); economic factors (i.e. cost of using); consequences of using.

Usefulness – Properties: relation to lifestyle and activities outside usage. How captured: registration of the system and perceptions of the consumer-user. Illustrates: relation of the technology to other pursuits, interests and tasks outside of use; pleasure deriving from intrinsic motivation to use; consequences and outcomes of using; symbolic 'potency' of the technology.

Use
Technologies and their use are not completely open affairs but, as has already been
argued at some length, are constrained by various physical, social and cognitive
exigencies. Wittgenstein (1953) goes on to show how language use, not some rigid

set of rules, determines meaning. Nevertheless, many continue to search for the
vacuum bottle ideal for language: "We think it [the ideal] must be in reality; for we
think we already see it there." (p.101) In a similar way, we cannot specify the pure,
or ideal, case for the use of an innovation, only its idealisation in the minds of the
developers. Users inevitably interpret an innovation in their own distinctive ways,
applying it idiosyncratically to their own contexts, and even re-create it to satisfy
their own needs.

Donald Norman (1988; 1991; 1992, p.19-25) for instance speaks of 'affordances', or rather "perceived affordances", which emphasise the actor's perception of an object's
utility as opposed to focusing on the object itself. These are crystallised, inflexible
(physical, ergonomic and interpretative) properties of objects. A table affords placing
objects on it. A baseball affords throwing but does not afford sitting on. Spoons
afford eating soup. So the blind man's 'reading' of the world is intricately tied to the
affordance of his stick. We can talk of these properties of artefacts that extend our
minds as affordances. Within the realm of affordances, I argue lie the characteristics,
attributes, features and functions that relate to - and shape - our intentions and our
social and cultural dispositions. Michael Cole (1991) recounts experiments in which
adults bounce infants dressed in blue diapers but treat infants in pink diapers gently.
Here the cultural expectations of the parents are already putting constraints on the
child's future experience of the world.

Among the factors found to have a significant impact on technology use or implementation are the perceived usefulness of the innovation, individual innovativeness, and training (i.e. Alavi & Joachimsthaler, 1992; Davis, 1989; Leonard-Barton & Deschamps, 1988; Schiffman et al., 1992). In a study of the
relationship of attitudes and usage, Grantham and Vaske (1985) examined attitudes
toward voice mail and extent of use. The researchers concluded that individuals'
attitudes toward the voice mail system were the strongest predictor of extent of use
and explained more of the variance than any of the other independent variables. In a
study of the implementation of an intelligent telephone system, attitudes such as ease
of use, the status attributed to using the new phones, and the complexity of systems

features were not significantly correlated with extent of use (Manross & Rice, 1986).

Questioning or analysis (of transcribed text) here would be in the form: Who [uses],
what [service], why [for which reason], and where [public or private use]? The
investigation could be concerned with establishing which household member uses
the system, what they use, for what purpose, and where within the home it is used. Put simply, use unpacks into the following lines of exploration/questioning:
Who?
This would focus on the individual characteristics and socio-biographical
details of the individual who comes to use the technology, product service
etc. It may also contain demographical or psychographical information.
Further it may seek to identify traits of the social group to which the
individual belongs.

What?
Not only are current on-line services dominated by males, there has also been
the prominence of some services over others (i.e. sexual bulletin boards).
Also, following who uses, what type of person is attracted to interact with
media and information? As an example Ann Gray (1987) has shown that
women often will hire videotapes for family use, but rarely do for their own
pleasure. How will this translate through the availability of video on demand
(VOD - the ability to order videos on-line)? There is currently a
predominance of male subscribers to current Internet services; could this show a gender predilection for generally interacting with media and
information? Other demo- and psycho-graphic indicators may also feature
here, the question being who may be attracted to use, what [service]?

Why?
Why is, or why would a person be drawn to use? What situations gave rise, or
give rise to use? Not only is this a major marketing question (concerning for
instance, a person's initial impetus to purchase), but one which concerns
household and family social dynamics. How would certain household
members communicate and 'sell' the idea to the others? How would
anticipated benefits compare with actual use of the system, from an early-
adopter (the advocate of the system, and the one who 'sells' it) and a
technologically-reticent person's (i.e. a household member who had to be
'sold' the idea of acquiring a system) perspective?

Where?
This concerns geography and physical situation. Currently, where in the
house are media technologies situated? Which are in the public spaces (i.e. the living room), and which are situated in private spaces (i.e. in a bedroom, or workroom)? Domestic research (i.e. Silverstone and Hirsch, 1992) has
illustrated that within households there may be more than one unit of a
particular media technology. Not only is this relevant to

assessing individual household members' media consumption (i.e. usage
patterns), but where the media technology is situated may also indicate
something of the particular modalities of operation, as well as particular
tastes and preferences for particular services.
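The who/what/why/where questioning above can likewise be held in a simple coding structure when working through transcribed material. The record below is a hypothetical coding scheme of my own, offered only to show how the four lines of questioning might be kept together for each observed instance of use.

from dataclasses import dataclass

@dataclass
class UseObservation:
    """One coded observation of use from a transcript (hypothetical scheme)."""
    who: str    # which household member, with relevant socio-biographical traits
    what: str   # which service or content was used
    why: str    # stated or inferred reason, or the situation giving rise to use
    where: str  # physical location, and whether the space is public or private

obs = UseObservation(
    who="teenage son, keen games player",
    what="online games service",
    why="passing time after school",
    where="bedroom (private space)",
)
print(obs)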

By no means exhaustive, my intention here is to suggest something of the complex nature of this one particular dimension, before moving to the other major use process components. Use, although very dependent upon context, is also dependent upon
situation and circumstance. Stanley Fish (1979: p.251) argues that we "are never not
in a situation." Because of this we are "never not in the act of interpretation." Use is
always situated, and therefore always a matter of interpretation (or interaction).

Usability

Here usability relates more specifically to how easy the product is to use from the
consumer's perspective, both in terms of initial confrontation with the product
(immediate accessibility i.e. point of sale), as well as sustained and continued use
over time (sustained usefulness i.e. after domestication). Immediate accessibility of
the system may be seriously hindered or 'resisted' by bad usability, thus giving rise to
lack of 'attractiveness' to the consumer. However, the constituents of immediate
accessibility and attractiveness may be quite different factors from those contributing to sustained usefulness. A striking graphical or audio effect may encourage and impress the consumer on the retail store floor; however, it may jar the nerves of, or even annoy, the persistent user.388 According to Knowles (1988), any
technique to evaluate usability is itself not very useful if it has to be restricted to a
limited type of task environment. The laboratory is a very limited type of task
environment, although usually designed to be very convenient for the purposes of
experimentation. Is a laboratory-defined construct truly applicable in the real world
and will it find the most important aspects of system usability?

Usage

It has been suggested by a number of studies that individuals' attitudes toward the
usefulness of an information system are positively associated with its extent of use
(Davis, 1989; Grantham & Vaske, 1985). Davis (1989) suggests that usage - the

388
As indeed was evidenced in the case outlined later.

time a user spends with the system - can be used as a surrogate for usability. The
inference is that the more time a user spends with a system, the more usable he or she
finds it. There are several flaws in this way of thinking. For instance, the system may
be so complicated that the user has difficulties in getting any results and
consequently spends time in vain, or the system usage may be obligatory in some
sense. Actually, even the usage time is somewhat vaguely defined in the study, since
its measuring is up to those who respond to the questionnaires. Davis suggests that
according to the regression data, ease of use may actually be a causal antecedent of
perceived usefulness, and not a direct determinant of system usage.
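Davis's point about ease of use acting through perceived usefulness can be illustrated with a small simulation. In the sketch below, synthetic data are generated so that ease of use shapes perceived usefulness and usefulness alone drives usage; regressing usage on both predictors then shows the direct ease-of-use coefficient falling towards zero. The model and numbers are my own illustrative assumptions, not Davis's data or analysis.

import numpy as np

rng = np.random.default_rng(0)
n = 500

# Ease of use influences perceived usefulness; usefulness (not ease of use
# directly) drives reported usage.
ease = rng.normal(size=n)
usefulness = 0.8 * ease + rng.normal(scale=0.5, size=n)
usage = 1.2 * usefulness + rng.normal(scale=0.5, size=n)

# Ordinary least squares of usage on both predictors, with an intercept.
X = np.column_stack([np.ones(n), ease, usefulness])
coef, *_ = np.linalg.lstsq(X, usage, rcond=None)
print({"intercept": coef[0], "ease_of_use": coef[1], "usefulness": coef[2]})
# The direct ease-of-use coefficient comes out close to zero: its influence
# on usage is carried through perceived usefulness.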

The research setting described above also poses another question. How does a user
build perceptions of a technology without first using it? If a motivated user tries to
learn a computer system, initial difficulties encountered in the system will not hinder
him or her from utilising it. A partial answer to the rhetorical question posed above is
presented in Nielsen et al. (1986): a user views a system as approachable or
unapproachable, depending on the amount of learning needed before the system is
usable. Not all of the factors affecting approachability are known, but, for example, the size of
the manual matters. Clearly, approachability would have to be included in the
analysis of factors affecting perceived usability.

Most researchers who have studied the factors associated with successful technology
implementation frequently quantify implementation by the amount of use, such as
how many times per day or per week the technology is being used. Yet such
quantification may not be an accurate measure of the quality of the implementation
since the amount of use does not indicate whether the technology is being used to its
fullest potential (Hiltz & Johnson, 1989; Rice & Case, 1983; Rice & Shook, 1990).
Findings from studies that hypothesise about factors influencing technology
implementation may lead to an incomplete view of implementation. This is
especially true when amount of time or frequency of use is the sole measure of
implementation because this quantification does not indicate whether individuals are
taking advantage of the technology's potential (Rogers, 1995; Stewart, 1992). This is
the patterning and formation of habits concerning use [who, and at what times].

While usage and use are linked, usage is differentiated from use through its
dependence on the social conditions and contexts affecting use. For instance: is there a family censoring policy? How does it affect the individual household members?
Does the system interfere with the use of other media? Does it interfere with other
leisure time activities? Does the household agree on dividing usage time? Usage
forms an important aspect for study concerning the consumption of services. Parallel
examples may be product 'shelf-life' or the playability life of a computer game or top
twenty single. There are only so many times a person may play a game or a record. It
is anticipated that some i-Tv services will change while others will be a more
permanent feature.

Usefulness

Usefulness as used here has two facets. The first is a product's apparent or
anticipated usefulness, such as that perceived via the depiction of use in
advertisements, or perhaps through seeing a neighbour or a member of the family
using a product or service. This facet of usefulness relates to a consumer's initial
impetus to appropriate and purchase. A second facet of usefulness is the embedding
of the system into the individual lifestyle and everyday activities of household
members. This facet relates to the idea of a product or service's domestication and of
the notion of sustained usefulness. Usefulness is also something which may become
tacit – i.e. its use and value are so familiar that it is only when the technology or
service breaks down that the value of its use is fully realised.389

Chapter discussion

In this chapter, following Woolgar (1991, 1996), Mackay (1995) and Mackay and
Gillespie (1992), I have discussed and proposed a hermeneutic view of technologies
where their characteristics, features, attributes and functions may be treated as 'texts'.
They argue that the phenomena of investigation are the discursive and interpretive
practices through which such artefacts as machines and tools are constructed. To
suggest that machines are texts is therefore to deconstruct definitive versions of what

389
Breakdown relates strongly to the notion of readiness-at-hand. By breaking down we become aware
of the hammer, the television set, or other technology; "[the hammer's] 'hammerness' emerges if it
breaks or slips from grasp or mars the wood, or if there is a nail to be driven and the hammer cannot be
found." (Winograd and Flores, 1987: p.36)

machines can do. This sets the frame for an examination of the process of
construction (writing) and use (reading) of the machine. The relationship between
readers and writers is understood as mediated by the machine, and by interpretations
of what the machine is, what it is for, and what it can do [teleology].

Betti (1980) suggests that misunderstanding occurs with increases in space and time
between 'authors' (which is suggested as analogous to designers and producers) and
'readers' (analogous to consumers and users). Following Tatsuno (1993), if the real
object of user or consumer research is for firms to get 'closer' to their customers, then
this requires firms to attain an understanding of how their products, in both physical
and symbolic aspects, situate in the user's experience.

Put in hermeneutic terms this means fusing the horizons between product and user,
between producer and consumer. Gadamer (1979) employed the notion of horizon in
both a literal and figurative sense to suggest everything that is visible from a
particular vantagepoint. It is in the fusion of horizons that the subject-object
dichotomy is transcended. In the instance of technologies and people I suggest that
this occurs in the instances and the practice of use. It is here that 'cultures of
production' and 'cultures of use' interact; it is the point of most information.

The horizon of the user is their [pre-]understanding. The horizon of the technology is
its sense discerned through semiotic-structural analysis and progressive iterations of
the hermeneutical circle (Arnold and Fischer, 1994). The user's horizon is finite, but
neither limiting nor closed. When the user moves or changes position through
developing understanding or expertise, his or her horizon also moves. Horizons once
fused never 'unfuse', even when one disposes of a product. The expertise and
experience acquired from using or living with the product lays the foundation (the
'departure point') for the relation of new functional or feature requirements from new
products. Following this rationale, no user considers a product or service in pure-play
terms such as use, usability, usage or usefulness. The experiences, circumstances,
and scenarios where they are experienced are individual and they are unique even
though they may share characteristics (i.e. responses such as the television or

026463 500
computer used for amusement- laughing, education - learning, passing time, or baby
sitting).

Returning to the theme of 'culture of production', it remains that path-dependencies, such as more general industrial trends and developments, form a considerable aspect of the [pre-]understanding of the horizons of designers and producers, lending influence on design decisions. Meanwhile, apprehension of
general trends and fashions in the market place will form some aspect of the [pre-]
understanding of consumer-users, lending influence upon purchase and acquisition
decisions. New components, new technical standards and new software techniques
combine with desires to incorporate new fashions and trends in the look and feel of
functions, casings and interfaces. These combinations find their way to the consumer
marketplace where they undergo a further calculus of value and meaning made this
time by user-consumers. Path dependency in use arises through the culmination of
expectations and anticipations gained through the use and consumption of previous
products.

But like written or spoken texts there is not an infinite range of interpretation that can
be put to technology, and its functions. Grint & Woolgar (1997) are critical of a
perspective in which there is the implication that there; "remains, at the centre of the
technological system, a residual, non-social or neutral machine which is malleable
according to its social location/context, etc." (p. 14) Indeed, 'social shapers' stress
that "technologies can be designed . . . to open certain options and close others."
(MacKenzie and Wajcman, 1985: p.7). As MacKenzie (1995) offers using the
terminology of media and audience research (i.e. Morley, 1992), technologies are
'polysemic'. This suggests there is always a 'preferred reading' which denotes that a
toaster is used – or indeed is most useful - for making toast, or that a refrigerator is
most useful for keeping things cool and a tin opener is best used for opening tins and
so forth. This notion of 'best used for' - i.e. the purpose of something - may be linked
very strongly to the notion of good usability. A tin opener may have good usability
for opening tins but poor usability for opening wine bottles.

Together with the more immediate and idiosyncratic design challenges occurring in
the project at hand, more macro-level trends contribute towards, as well as guide and
shape, final products (which is largely the position of sociotechnical constituencies).
But throughout such processes, what becomes immediately obvious to producers and
designers as mistakes may not cover the full gamut of mistakes or limitations that
will be experienced by consumer-users. Consumer-users come not only with fresh
eyes to the product; their interest in the product is derived from a unique gestalt
of influences, motivations, expectations and requirements, which often defies the
simplistic models that were used to represent their behaviours and psychology
throughout the design process.

Kanter (1999) has drawn attention to this when she suggests: "Producers worry about
visible mistakes. Customers are lost because of invisible mistakes." (p.9) Indeed,
what is considered a failure or 'success' within the context of design may not map in
any real sense to success or failure at the stages of consumption and use. This is
evidenced in the multitude of unusable, or unused, features and functions
incorporated into many new consumer electronics and software applications, or in the
unprecedented success of something which was considered merely a novelty.

'Obvious' defects, deficiencies and shortcomings are realised quickly in products
which are established and well known. Any global weakness, flaw or incapacity will
usually be noticed early on in their development. The use and functions of such
products are well defined, possessing a common-sense style of logic to their purpose,
a logic usually closely shared by designers and users. Indeed, such products evoke
with ease particular expectations and anticipations – from the designer's perspective
concerning the product's actual and prospective users, and from the consumer-user's
perspective concerning the product's performance and utility. The experiential
familiarity of such products suggests clear objectives (what I have termed
'departure points') for improvement to designers and innovators. For instance, in the
case outlined later, the problem of font size for reading text on a television screen as
opposed to a computer screen was realised early on in the development of the
Cambridge system. Comprising the first group of trialists were the interface and
system designers. It was only through operating the system from their homes that they
came to realise a serious flaw. Using a desktop computer to design the interfaces,
they had underestimated the font size necessary to allow screen-based text to
be read from a distance (i.e. viewing text on one's television whilst sitting on the
living room settee).

Familiarity not only provides a sound foundation for comparison, serving to indicate
and benchmark improvement and differentiation, but also suggests where one should
focus attention in processes of fault-finding and fixing, and the heuristics one may
go through in using the product and in fault-finding.

New, radical or discontinuous innovations, by their very nature, often deny any
common sense or shared purpose and function, so here there is a greater dependence
upon the creative resources of producers and designers. They must use their powers
to project and anticipate value and usage. Litva (1997) sees that when this
management-endorsed information is deemed inadequate, designers supplement it by
creating and sharing their own customer-related information. The onus is upon them
to introduce to the wider public domain something useful and desirable. Such
projects are invariably risky. Where there is no near product analogue, value (or
usefulness) is something which relies inherently upon imagination, anticipation and
rhetoric which has to be built (through marketing, word of mouth, etc.) around the
product.

In these cases there is a greater propensity for products at the design stage to
manifest non-obvious defects. These can include functional problems, redundant
features, or even questions of taste or fashion. Each has the potential to make or break
a new product's launch, or to create a window through which a competitor can rectify
the problem and move in on one's first-mover advantage in the marketplace.

The issue here is that path-dependencies exist both in design and in use. They may
guarantee some level of continuity, and suggest forecasts, but they do not guarantee
customer delight, which often springs from novelty and surprise. These can
only be found in the process of consumer-users developing new kinds of useful and
valuable relations with a new technology and its functions.

Conclusion

In the area of technology development Rycroft and Kash (1999: p.8) define 'complex
technologies' as those processes or products which:
" . . . cannot be understood in full detail by an individual expert sufficiently to
communicate all the details of the process or product across time and distance
to other experts. A simple process or product is one that can be understood
and communicated by one individual."

This is an era of complex reasons, complex purposes and complex outcomes. Complexity in
technology, society and the economy leads Rycroft and Kash to claim that
corporate CEOs and managers "don't understand why they are successful." (op. cit.:
p.3) They suggest that success in our complex world is clearly dependent on ever-changing
relationships among the organisations that participate in continuously
adapting organisational networks. In many cases, there is no point in explaining the
technical specifications to a person who is simply seeking entertainment, while a
technically informed enthusiast may be impressed only by a list of specifications,
above and beyond actual performance. In the longer term, however, experiential
aspects will become dominant. Nevertheless, 'visions' [as 'departure points'] vividly
illustrate ways in which the function and potentials of a technology may be linked to
the possibilities of use. They show how social explanations of use can be used at
once to leverage rationales for innovation and subsequent acceptance by consumer-users.
Buchanan (2007) asks us to incorporate a rhetorical approach into considering
design, one which "answers that products are persuasive in three ways that are
intimately related and interconnected". These are, namely:
1) Useful: a "product must be effective in the task for which it is designed."
(ibid., p.64);
2) Usable: products are usable "when they are easily used. Discovering this aspect of the product
involves such matters as ergonomics, cognitive study, and cultural factors, all
of which help to shape usability" (ibid., p.65);
3) Desirable: products are desirable "when users can identify with the product and are willing to
take it into their lives as an extension of their identity and self-image.
Discovering this aspect of a product involves finding an appropriate 'voice'
or ethos of a product – a character that is consistent with the perceived
character of the intended user or community of use." (ibid., p.65)

Words and actions are the means by which people relate their experience of the
world to others. Words and actions constrain and give shape to things. Some
views and some behaviours remain tacit to the individual consciousness. I suggest
CU as a frame through which to study the person's whole experience of a product.
While not tied to interpretivist study and a hermeneutical perspective of
'technologies considered as texts', it certainly draws on this approach in
discerning the privileged role that rhetoric plays in the consumption and experience
of [particularly media-] technologies and products. As Gadamer (1976) has it:
"Thus the movement of understanding is constantly from the whole to the
part and back to the whole. Our task is to extend in concentric circles the
unity of the understood meaning. The harmony of all the details with the
whole is the criterion of correct understanding. The failure to achieve this
harmony means that understanding has failed." (Gadamer, 1976: p.117)

Most importantly, it stresses the individual influences that form perceptions and
anticipations of products, 'read' through encountering characteristics, attributes,
features and functions, in the physical product or via some form of representation.
Hannah Arendt speaks on the foundation of commonality between people:
"Human plurality, the basic condition of both action and speech, has the
twofold character of equality and distinction. If men were not equal, they
could neither understand each other and those who came before them, nor plan
for the future and foresee the needs of those who will come after
them. If men were not distinct, each human being distinguished from any
other who is, was, or ever will be, they would need neither speech nor action
to make themselves understood. Signs and sounds to communicate
immediate, identical needs and wants would be enough." (Arendt, 1958,
and quoted in Jones, 1991: p.v)

The main theme of this chapter has been to propose two things. It follows the suggestion that it
is useful to consider 'technology' as text, but it would rather decompose 'technology'
into further 'sub-texts', one of which is usability. This includes discussion
of the use dimensions of a product decomposed into use, usability, usage, and
usefulness. Usability, understood as a text, is contextualised within a variety of
other texts that contribute to the apprehension of the whole product, service, and
technology. Each 'text' has both tangible and symbolic qualities. While usability may
be elucidated by neo-empiricist study, a text/context view of usability points towards
consideration of the whole experience which constitutes the product, and it can only
be 'read' or comprehended as a gestalt of meanings and values.

Chapter 4 – On Method
" . . . the Greek civilisation . . . favoured intuition, insight and the intellectual
processes, but not the extraction of secrets from nature by mechanical contrivance
and experimental technique. This was not to come until almost 2,000 years later."
(Boring, 1950: p.98)

One must seek a way of casting light on the fundamental question of ontology, and
this is the way one must go. Whether this is the only way or even the right one at all,
can be decided only after one has gone along it. The conflict of the Interpretation of
Being cannot be allayed, because it has not yet been enkindled. And in the end this
is not the kind of conflict one can ‘bluster into’; it is of the kind which cannot get
enkindled unless preparations are made for it. Towards this alone the foregoing
investigation is on the way. (Heidegger 1962: 487-488 [437], emphasis as in
original)

Possibly we're in the process of experiencing a new relationship between theory and
practice. At one time, practice was considered an application of theory, a
consequence; at other times, it had an opposite sense and it was thought to inspire
theory, to be indispensable for the creation of future theoretical forms. In any event,
their relationship was understood in terms of a process of totalisation. For us,
however, the question is seen in a different light. The relationships between theory
and practice are far more partial and fragmentary. On one side, a theory is always
local and related to a limited field, and it is applied in another sphere, more or less
distant from it. The relationship which holds in the application of a theory is never
one of resemblance. Moreover, from the moment a theory moves into its proper
domain, it begins to encounter obstacles, walls, and blockages which require its
relay by another type of discourse (it is through this other discourse that it
eventually passes to a different domain). Practice is a set of relays from one
theoretical point to another, and theory is a relay from one practice to another. No
theory can develop without eventually encountering a wall, and practice is necessary
for piercing this wall. For example, your work began in the theoretical analysis of
the context of confinement, specifically with respect to the psychiatric asylum
within a capitalist society in the nineteenth century. Then you became aware of the
necessity for confined individuals to speak for themselves, to create a relay (it's
possible, on the contrary, that your function was already that of a relay in relation to
them); and this group is found in prisons -- these individuals are imprisoned.
GILLES DELEUZE, 1972 - http://libcom.org/library/intellectuals-power-a-
conversation-between-michel-foucault-and-gilles-deleuze

"If every product is really a service, then every contact or communication with
customers is also the product . . . Customer-focussed slogans and posters, even
interfunctional problem-solving teams, are not good enough if departments do not
share the same priorities or rewards." (Kanter, 1992: p.10)

Alice was beginning to get very tired of sitting by her sister on the bank, and of
having nothing to do: once or twice she had peeped into the book her sister was
reading, but it had no pictures or conversations in it, 'and what is the use of a book,'
thought Alice 'without pictures or conversation?'

— Lewis Carroll's Alice's Adventures in Wonderland

'How different things would be … if the social sciences at the time of their
systematic formation in the nineteenth century had taken the arts in the same degree
they took the physical sciences as models'

Robert Nisbet, Sociology as an Art Form, 1976, p.16

… Be a good craftsman: Avoid any rigid set of procedures … Avoid the fetishism of
method and technique … Let every person be their own methodologist; let every
person be their own theorist …

C. Wright Mills, The Sociological Imagination, 1959, Appendix

Peter Drucker was right when he wrote: "What gets measured gets managed." But why is this so often
taken to a false corollary like "What can't be measured isn't worth managing"? So much of life cannot
be measured yet is still lived and enjoyed. Is there a way to calculate the ROI on raising children? If
there is, I for one would like to know it. (Yes, I know all about the human capital equations of
economists, but I'm talking about real life here.)
Larry Prusak, 'What Can't Be Measured', October 07, 2010. The Drucker quote comes from his 1954
book The Practice of Management, and has been repeated so many times since then that it has almost
become a cliché. Bestselling author Tim Ferriss, in his book The 4-Hour Body, recounts the story of a
man who lost weight merely by being aware of it every day. He started by drawing a huge graph with
time on the x-axis and his weight on the y-axis. Then he marked his current weight and the weight he
wanted to be at after two years. Next, he joined these two points with a straight line, and drew two
parallel lines, one above and one below it. These two parallel lines marked the region he wanted to
stay in. Every day he would weigh himself and mark the corresponding point on the graph. Whenever
a mark came close to the upper line, he would feel motivated to eat less and exercise. That is how he
lost 10 pounds in two years.
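The arithmetic behind such a chart is trivial, which is part of its appeal. Purely by way of illustration
(this sketch, in Python, is mine rather than Ferriss's, and the start weight, goal weight and tolerance
figures are invented), the target corridor can be computed as a straight-line interpolation between the
starting weight and the goal weight, with a fixed allowance either side:

    # Illustrative sketch only: linear weight-target corridor (invented numbers).
    from datetime import date

    start_weight = 190.0   # pounds, hypothetical starting point
    goal_weight = 180.0    # pounds, hypothetical two-year goal
    start_day = date(2010, 1, 1)
    goal_day = date(2012, 1, 1)
    tolerance = 2.0        # width of the band above and below the target line

    def target_on(day):
        """Weight the straight line predicts for a given day."""
        total_days = (goal_day - start_day).days
        elapsed = (day - start_day).days
        return start_weight + (goal_weight - start_weight) * elapsed / total_days

    def check(day, measured_weight):
        """Report whether a daily measurement sits inside the corridor."""
        target = target_on(day)
        if measured_weight > target + tolerance:
            return "above the band - eat less, exercise more"
        if measured_weight < target - tolerance:
            return "below the band"
        return "inside the band"

    print(check(date(2010, 7, 1), 186.5))

The point, for the argument here, is not the sophistication of the calculation but that the daily act of
marking a point against a visible target is what does the motivational work.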

Measurement serves two purposes:

It motivates: it is in human nature that being given a score for something one has done motivates one
to improve that score.
It gives a clearer idea of reality and helps in forming effective future strategies.

the Edinburgh Reviewers protested, after the manner of Locke, that no good could
come of a system which was not based upon the principle of Utility.
"Classical Literature," they said, "is the great object at Oxford. Many minds, so
employed, have produced many works and much fame in that department; but if all
liberal arts and sciences, useful to human life, had been taught there, if some had
dedicated themselves to chemistry, some to mathematics, some to experimental
philosophy, and if every attainment had been honoured in the mixt ratio of its
difficulty and utility, the system of such a University would have been much more
valuable, but the splendour of its name something less."
Utility may be made the end of education, in two respects: either as regards the
individual educated, or the community at large. In which light do these writers regard
it? in the latter. So far they differ from Locke, for they consider the advancement of
science as the supreme and real end of a University. This is brought into view in the
sentences which follow.

"When a University has been doing useless things for {161} a long time, it appears at
first degrading to them to be useful. A set of Lectures on Political Economy would
be discouraged in Oxford, probably despised, probably not permitted. To discuss the
inclosure of commons, and to dwell upon imports and exports, to come so near to
common life, would seem to be undignified and contemptible. In the same manner,
the Parr or the Bentley of the day would be scandalized, in a University, to be put on
a level with the discoverer of a neutral salt; and yet, what other measure is there of
dignity in intellectual labour but usefulness? And what ought the term University to
mean, but a place where every science is taught which is liberal, and at the same time
useful to mankind? Nothing would so much tend to bring classical literature within
proper bounds as a steady and invariable appeal to utility in our appreciation of all
human knowledge ... Looking always to real utility as our guide, we should see, with
equal pleasure, a studious and inquisitive mind arranging the productions of nature,
investigating the qualities of bodies, or mastering the difficulties of the learned
languages. We should not care whether he was chemist, naturalist, or scholar,
because we know it to be as necessary that matter should be studied and subdued to
the use of man, as that taste should be gratified, and imagination inflamed … I say,
let us take "useful" to mean, not what is simply {164} good, but what tends to good,
or is the instrument of good; and in this sense also, Gentlemen, I will show you how
a liberal education is truly and fully a useful, though it be not a professional,
education. "Good" indeed means one thing, and "useful" means another; but I lay it
down as a principle, which will save us a great deal of anxiety, that, though the
useful is not always good, the good is always useful." John Henry Newman, p.4-5

The comparison of religious, philosophical or scientific worldviews is a delicate endeavour, because
such worldviews start from different presuppositions and cognitive values. Clément Vidal has
proposed metaphilosophical criteria for the comparison of worldviews, classifying them in three broad
categories:

 objective: objective consistency, scientificity, scope
 subjective: subjective consistency, personal utility, emotionality
 intersubjective: intersubjective consistency, collective utility, narrativity

Introduction

Methodological debates have been imminent since the birth of modern
social science. Historically, these controversies display a pattern of
recurrence: most of the topics debated today have appeared, disappeared and
reappeared many times since the formation of modern social science more
than a hundred years ago. (Mjøset, 2009, p.40)

As mentioned in the previous chapter, beyond imagination, or purely speculative
views of needs and requirements, there are 'real' perceptions of use, consumption
and the user, tempered and informed through various forms of research practice. In
contemporary polemics, "dialectics" may also refer to an understanding of how we
can or should perceive the world (epistemology); an assertion that the nature of the
world outside one's perception is interconnected, contradictory, and dynamic
(ontology); or it can refer to a method of presentation of ideas and conclusions
(discourse).

In the humanities, as in the sciences, we need basic categories. Categories are
terribly important, not just for philosophers, but for everyone engaging in the
conversation. Some of these I have already touched upon, and in this chapter I shall
discuss them more forthrightly. Method, as it is expressed in this thesis, exists across
three non-mutually exclusive dimensions: a discussion of the evolving methods and
procedures of this study; a discussion of the methods used to research the participants
on the trial [by the consortium of industry partners involved in the trial and with
respect to this study]; and the methodological implications of contextual usability in
its relation to other methods of understanding the firm, its technology and how it
develops knowledge.

There is also discussion regarding the implementation of the research itself, and in
the case examined here, the issues that arose regarding how a researcher responds to
the difficulties of conducting research in firms under conditions of high dynamism
and commercial sensitivity.390

Each of these very much challenged, influenced and shaped the empirical research of
the present study as well as its outcomes and theorising. They also guided the
selection of literature that was reviewed. Indeed, the theoretical concerns, and the
methodological developments that culminated in the development of contextual
usability, were a product of continual iterations in both theory and methodology,
again, both shaped by the exigencies of the trial as it unfolded.

To collapse 'push' with 'pull'

The move towards user-centred design and customer-focussed marketing and
organisational philosophy would have customers involved right from the period of
ideation, through the entire development cycle of product and [re-] organisation,
even building plant and machinery, and specialist warehousing. It should stretch
across value and distribution chains, and now in the ‘networked society’ – right into
people's homes where new ‘smart’ technologies now report faults independent of

390
The environmental conditions of high dynamism and commercial sensitivity heighten the problem
captured by Agar's (1980) notion of the "professional stranger" – the impossibility of the researcher
ever being truly 'natural' in the process of conducting research in the field. The researcher's presence
is accentuated in the case of commercial sensitivity. This is a massive problem for academics
conducting research in the firm. I found it severely neglected in the literature. It is perhaps
conveniently left aside in accounts of how things really are in pressurised commercial settings, leaving
sanitised, 'tidy' accounts, rather as in neo-positivist studies, where 'null' results are rarely included for
publication.

user knowledge. Going well beyond Drucker's shifting of emphasis from profit to
relevance, we are here beginning to understand a much more radical proposition.

Many of the problems that arise with the introduction of innovations can be
attributed to separation of and conflict between users and developers (Papanek, 1973;
Staudenmaier, 1985; Suchman, 1988). As previously explained, this level of 're-
engineering' and 're-conceptualising' business spreads across enterprise boundaries,
entire value and distribution chains, from idea to living room, mind, pocket and
activity. For some time now it has been advocated that entire firms [their partners
and distributors] should "think like the customer" (Kanter, 1992). But is such a
goal attainable, is it practical, and if so how is it to be achieved or implemented?

If such a goal can be attained, it will only be so by interacting with customers and
users. As Poole and McPhee (1985) have it, "what we can know is determined by the
available methods for knowing." I have already mentioned that frameworks used in
industrial design, such as quality function deployment (QFD), speak of parsing the 'voice of
the customer' with the 'ear' of the engineer (Hauser, 1988; Hauser, 1993; Hauser and
Griffin, 1993). But much of the literature on QFD pays scant attention to how
engineers, or indeed engineering firms, hope to capture the customer's voice. Most
of the methods illustrated in the literature are basic textbook marketing or
consumer research techniques. The majority of the framework is focussed upon
engineering decision-making (i.e. Nicoll, 1999b).391 Consumer data features only at
the very beginning of what is essentially a 'waterfall' model of development, very
linear in nature, and lacking any iterative involvement of consumer-users over the
development process.392
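For readers unfamiliar with QFD, the core of the 'house of quality' is a simple piece of arithmetic:
customer requirements are weighted by importance, related to engineering characteristics through a
relationship matrix, and the weighted column sums indicate where engineering attention should go.
The following sketch (my own illustration in Python; the requirements, characteristics, weights and
relationship strengths are invented, not drawn from the trial) shows only that mechanical step:

    # Illustrative sketch of the weighted relationship matrix at the heart of QFD.
    # All data below is invented; conventional relationship strengths are 9/3/1.

    customer_needs = {            # 'voice of the customer' with importance weights (1-5)
        "easy to read from sofa": 5,
        "fast channel change": 4,
        "quiet set-top box": 2,
    }

    engineering_characteristics = ["font size", "tuner switch time", "fan noise level"]

    # relationships[need][characteristic]: 9 strong, 3 medium, 1 weak, 0 none
    relationships = {
        "easy to read from sofa": {"font size": 9, "tuner switch time": 0, "fan noise level": 0},
        "fast channel change":    {"font size": 0, "tuner switch time": 9, "fan noise level": 1},
        "quiet set-top box":      {"font size": 0, "tuner switch time": 0, "fan noise level": 9},
    }

    def characteristic_priorities():
        """Weighted column sums: which engineering characteristics matter most."""
        priorities = {}
        for characteristic in engineering_characteristics:
            priorities[characteristic] = sum(
                weight * relationships[need][characteristic]
                for need, weight in customer_needs.items()
            )
        return priorities

    for characteristic, score in sorted(characteristic_priorities().items(),
                                        key=lambda item: item[1], reverse=True):
        print(characteristic, score)

Everything interesting happens before this calculation: where the importance weights come from, and
whether they mean the same thing to the engineer as to the viewer, is exactly the question the
literature leaves under-specified.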

I have already gone some way to highlighting that collapsing notions of 'push' with
'pull' with respect to new product development has already arisen in the succession of
391
QFD featured in the case outlined in the book. It was one of the techniques the firm explored with a
view to incorporating it into its development processes.
392
Kanter (1992) suggests the quality movement of the 1980s and early 1990s has also been criticised
for being too 'producer-orientated', merely focussing on reducing the costs of visible mistakes.
MacKay (1992) also criticises social constructivist approaches for being too narrowly focussed upon
the construction of technology.

new fads, fashions and philosophies in management (see for instance Abrahamson,
1996). Total Quality Management (TQM) (i.e. Bounds et al., 1994), for instance, is
one such philosophy, comprising a number of quality tools and frameworks
(including QFD) aimed at orientating the entire business towards raising quality.

Anticipating use

But can the outcomes of technologies be properly anticipated, even with the most
rigorous and reliable methods of research? We may be able to gauge how
things can be used, or suppose how they were used: the field of archaeology largely
depends on this ability to deduce use and utility, and the design equivalent is
reverse engineering, where a product is dismantled with the object of better
understanding its functions, features and attributes (i.e. how components are placed
and seated in their casing to provide robustness).

How products will be used is, I have already suggested, a more
difficult prospect to forecast. I have shown how it is often confounded by user
innovations, or simply by unforeseen implications of use (such as in the Pruitt-Igoe
housing project example cited earlier). Fleck's (1993b) theory of innofusion, also
outlined earlier, suggests that particular technologies are best innovated through
deployment and implementation, in the sites and situations of use and operation.

His model is a kind of soft determinism (such as suggested by Pool, 1983; and
Grint and Woolgar, 1997) where a generic technology (Fleck's example is industrial
robots) acquires attributes and characteristics through addressing a range of
exigencies encountered through deployment. These include existing ways of doing
things, knowledge and expertise of the firm and so forth. The configurational nature
of technology - its ability to mutate and customise features and functions -
crystallises relative to potentials to address the exigencies of implementation (Fleck,
1993a).

Exploring and understanding the exigencies of use is what John Seely Brown cites as
constituting some of the most important research at the Xerox Corporation (Brown,
1991). The anthropologist Lucy Suchman began her work at Xerox by studying the
practices of accounting clerks in 1979. She found a significant discrepancy between
what the clerks said they did and what they actually did. Their verbal descriptions of
what they did corresponded more or less to the formalised procedures of the job (as
written down and prescribed), whereas the actual job they did relied on a rich variety
of informal practices that were crucial to getting the work done. Indeed, the
clerks were found to be constantly improvising and inventing new methods to deal
with unexpected problems.

The basic reason for intermediaries to act and evaluate within the space between
designers and users is to attain some sense of objectivity. Donald Norman, outlining
what I have already explored in some depth in the previous chapter, sees that when
we design: "we tend to project our own rationalisation and beliefs onto the actions
and beliefs of others." (1988: p.155) Designers and producers have an obvious and
distinctive vested interest in the product they develop, and this can act towards
creating a biased (and often overly optimistic) view regarding its potentials. Their
enthusiasm for the product, or simply their intimacy with its operational premises can
blunt proper critical appraisal. Designers themselves recognise that they can; "often
know the product too well to envision how people will use it." (Dan Rosenberg,
product designer quoted in Norman, 1988: p.158) Any representation of users is
based on the designer's personal experiences and assumptions "more often than we
might think." (Walsh, 1996: p.514) Indeed, in the case of the design of a food
processor Chabaud-Rychter (1994) showed the way in which male design engineers
used (weak) models of women users based upon their mothers, wives, girlfriends, etc.,
and how they perceived or imagined them using such appliances.

This is a clear case of how socially conveyed knowledge blends with the experience
of reality of the individual.
"Structures of formal knowledge tend to be abstract and general. They must
often neglect some details, complexity or deviations from the rule for the sake
of simplicity or conceptual clarity, but actual performance in any field
requires knowledge of how these things work together and how they affect
the principles and relationships developed in the formal system. In the area of
technology this is particularly true because technology is always orientated
towards function, not just formal understanding. A technology must work, it
cannot simply be a good idea; it is ultimately concerned with practice, not
just theory. Any knowledge of practice (engineering, nursing etc.) always
involves a large amount of contextual knowledge, or accumulation in the
person of many specific experiences, related by common sense to form a
holistic attitude, one that is not fully analyzed and formalised." (Brach, 1991:
p.16)

Formal knowledge is knowledge that can be written down and is more easily
transferred; tacit knowledge is usually only learnable by doing and is therefore
harder to transfer. Knowledge of products may be generated in three fundamental
ways. It may be developed through appropriating data drawn from observations of
the sites and situations of design, production, distribution, consumption or use. It
may also be drawn from the perceptions of those that have designed, produced,
distributed, consumed or used the product. It may also arise from social processes such
as regulatory and standards committees, or even from legal cases.

Suchman (as in Suchman, 1988, 1995) went on to have a very distinct influence
upon thinking in computing and artificial intelligence by foregrounding use contexts in
the realms of situated and distributed cognition (others making contributions in this
area include Zuboff, 1984; Norman, 1991; Carroll and Rosson, 1992).

Ethnographic turn

Carroll (1997) argues that work such as Suchman's should be viewed as occurring
within a much larger paradigmatic restructuring taking place within the social and
behavioural sciences at this time – what has been referred to as the 'ethnographic
turn' (Moores, 1993; and also Knorr-Cetina, 1981 in relation to social studies of
science).393 It was a turn away from entrenched views that aimed at studying individuals
independently of the contexts and conditions in which they operated, worked,
experienced and lived. What Guba (1990) calls the 'paradigm dialog', or what is
more widely understood as the interpretative paradigm, is a research emphasis upon

393
Interpretivism originated from literary theory and hermeneutics. What is interesting to note is that
pundits of literary theory who have had their "linguistic" and their "interpretive" turns now appear to
turn their sights upon a "historical" one (McDonald, 1996).

context and a challenge to those methods of research arising from the tenets of
scientific positivism and reductionism. These had for some time reigned as the dominant
methods for studies in which people were involved. Instead, interpretative, ethnographic
studies, conducted in the 'field' – office, factory floor, people's homes – were
replacing lab-based experimental methods of discerning the qualities and attributes
of products, and of people, and of their interaction. As in the work of Suchman
(1983), these new studies were drawing attention to neglected, tacit, implicit
influences in what people did with technologies, media, products, designs and with
other people in their everyday working and domestic lives. Such approaches sought
to explore and conduct inquiry in as naturalistic a setting as possible (Erlandson et al.,
1993).

'Getting closer' to customers is not exclusive to emerging high-tech markets. It also
features in relatively mature markets. Pragmatic approaches arise in new types of
experiential marketing techniques for more mature products. For instance, Lauglaug
(1993) outlines Technical Market Research, which uses 'antennae shops' set up in
shopping malls where firms can interact directly with, and get feedback from,
consumers (his example was a tire manufacturer).

McKenna (1995) refers to a similar process in the case of Philips NV sending
multidisciplinary teams of designers and social scientists in mobile vans into
communities across Europe. The aim was to enter into a dialogue with consumers
regarding new designs, where design specialists and customers "interactively
imagined new possibilities." (p.88) McKenna sees that through these techniques;
"companies can foster interest in the [new] product before it reaches the market.
Customer acceptance of a new product is part of the process of developing it." (p.92)
Closing the space between designers and users is now a preoccupation of business
and industry, and since the early 1990s there has been a trend towards increased use
of social scientists in the innovation process (Cova and Svanfeldt, 1992).

Methods of getting 'closer' to consumer-users are now widespread and practised,
albeit in different ways, in different organisations. For instance, the Intel Corporation
has an anthropological laboratory sited in its main production facility,
the object being to study the everyday use of things and work out directions
in which to move in the future. John Thackara (1998) also points to Sony's
enrolment of cultural anthropologists to brief software engineers, and to Sharp
employing sociologists to study the everyday routines of people. The object is to
identify and fill any 'gaps' they discover. While 'gaps' may be limited in the case of
discrete, simple and mature technologies, they become more obvious with the advent
of 'smart' or 'intelligent' networked products. The significant plasticity of their
particular functions and configurations, together with their distinctive reliance upon a
wider spectrum of human factors and temporal relevancies, brings an altogether new
dependency upon frameworks that generate valuable design knowledge.

Interpretative Paradigm

Most methods of 'getting closer' to customers have their origins in "interpretative
social science", which, over the last half century, has risen as a research paradigm that
aims to break out of the constraints imposed by positivism in understanding,
particularly, the social world. Since the 1970s, a shift has occurred in open systems
research from individualist, formal cognitive models to models situated in, and
grappling with, complex real-world dynamics (Hutchins, 1995). The term
"interpretative social science", which includes many of the qualitative research
theories and methods, represents a very large spectrum of philosophical, empirical
and methodological orientations.394 Its emphasis is upon the relationship between
socially engendered concept formation and language. In previous chapters I have
continually drawn attention to the importance of language: in particular, Wittgenstein
(1953), Berger and Luckmann (1966), Winograd and Flores (1988), Schein (1985),
Mohan (1993), and Arnold and Fischer (1994) with respect to the primacy of language
as a pivotal structuring element in the organisation of reality, and Spradley (1979) when

394
It is a generic term covering many different approaches to inquiry: the verstehen tradition of
Dilthey, Rickert and Weber; the phenomenological psychology of Brentano, Stumpf, Külpe, Husserl,
Heidegger, and Schutz, transformed into ethnomethodology by Garfinkel, Sacks and Schegloff; the
symbolic interactionism of Mead, Cooley, Blumer and Goffman among others; the followers of Ryle
and the later Wittgenstein who emphasised ordinary language analysis, speech acts, accounts and
justifications; the ethogenics of Rom Harré; the dramatism of Kenneth Burke; the ethnographies of
Clifford Geertz; the constructivist and naturalistic inquiry of Lincoln and Guba, and so on.

he suggests that much of culture is encoded in linguistic form. And, most importantly,
as Ricoeur (1977) noted, we begin to unpack the meaningfulness of lived experience
by presupposing that it is as purposeful as a written text.

Containing such qualitative methodological approaches as phenomenology,
ethnography, and hermeneutics, studies falling under the category of the
interpretative paradigm are characterised by a belief in a socially constructed,
subjectively based reality, one that is influenced by culture and history: "language is
the universal medium in which understanding occurs." (Gadamer, 1979: p.389)
Language forms the basis for social learning, both by social and human researchers
and, most importantly, by people between and within themselves.

Geertz (1973: p.5) construes interpretative anthropology "not as an experimental
science in search of law but an interpretative one in search of meaning," and Taylor
(1977) stipulated that interpretative sciences must deal "with one or another of the
confusingly interrelated forms of meaning." (p.101) "Through language, experience is
filtered, encoded, and communicated in dialogue. It bridges past and present,
interpreter and text: it conveys and propels tradition." (Arnold and Fischer, 1994:
p.58) Many fields, such as empirical studies of usability, audience research and
studies of consumption, have embraced the views and practices of the interpretative
forms of research. This is because of its pragmatic distinction from neopositivist
approaches, as well as its epistemology. Concentration is placed upon a research
subject's perceptions of the qualities of a phenomenon, rather than those of the
researcher. Qualities in this case are closely associated with the elicitation of
'meanings' – the sense that particular phenomena have for individuals, and their
ability to convey and communicate that sense:
"There is a move away from obtaining knowledge primarily through external
observation and experimental manipulation of human subjects, towards an
understanding by means of conversations with the human beings to be
understood. The subjects not only answer questions prepared by an expert,
but themselves formulate in a dialogue their own conceptions of their lived
world." (Kvale, 1996: p11)

There is considerable disagreement among advocates of interpretation about whether
inquiry should focus on actors' accounts of their own meanings, theorists' readings of
actors' meanings, theorists' interpretations of actors' accounts of their meanings, and
so on. Menzel (1978: p.165) criticises the glossing over of difficulties by researchers
searching for meanings. As he sees it, the procedures for understanding subjects'
meanings "require great efforts and long periods of time."

Human ideas, experiences and intentions are not objective things like molecules and
atoms. Nevertheless, just like some of their colleagues in the Natural Sciences, many
social scientists attempt to use "objective" methods that allow for the control,
predictability and "generalisability" needed to uncover the "laws" or "patterns" that
guide human behaviour and the systems in which that behaviour occurs. The
scientific method constructed to do this has long been claimed to be a value free tool
of inquiry, allowing many social scientists to create a separation between themselves,
their methods and their research. This separation is a dangerous one, for it gives
scientists a false authority of truth that stems from the claims that there is an
objective way to study an "objectifiable" world and that the only way to study this
world is through rigorous application of the scientific method.

Ethnography

The approach taken to researching the trialists in this study was developed
largely in response to a further piece of ethnographic work – Silverstone, Hirsch and
Morley's (1991) pioneering ethnography of the use of media technologies in London
households. What was important about this work was that it drew attention to the
complex of social and psychological contexts that dictate domestic ICT use in
advanced societies, and to the methodological complexities arising from the study of
such domestic spaces.

Ethno (folk) and graphy (description) originally implied the means by which an
anthropologist documented some distant and alien culture's customs and beliefs (i.e.
Malinowski, 1922/1978). They would go 'into the field', observe, draw inferences,
and write for an interested audience. The approach has more recently been employed in more
proximate settings, such as the micro-social environments of nursing (Morse, 1994),
television audiences (e.g. Lull, 1988; Morley, 1989; Moores, 1993) and children's
computing (Turkle, 1984). The objective of ethnography keeps with the general
principles of interpretist social science in that it seeks to apprehend the social
environment from the subjects' point of view. But it can also include and incorporate
data gathered through other methods, including experimentation and experimental
groups (Pelto and Pelto, 1978). Silverman (1985) notes that ethnographers can
also use census and statistical procedures to analyse patterns or to determine who or
what to sample. As such, it can precede or follow neo-positivist studies.

Ethnography, as originally practised, had the researcher spending protracted periods
of time in the field. This would clearly be impractical for projects that involve the
development and testing of new products, such as the deployment of interactive
television. Indeed, one could request, and even be granted, access to live in a family
home and to follow their everyday domestic routines for six months, including how
they incorporated i-Tv into daily life. But such a prospect is clearly untenable for a
number of practical reasons. With respect to the present study, some of the sampled
households did not 'switch on' the i-Tv system for many weeks at a time. There
were large temporal gaps between trialists 'getting connected'. While providing
insights on a particular family's existence, such a study would not have much to say
on i-Tv; that is, there would be little comment upon their interaction with it. No method is
perfect, as all are to some degree invasive acts within the normal day-to-day flow and
nature of activities. Naturalistic behaviours, expressions of views, etc., will most
certainly be impacted upon by the continuing presence of a researcher, whose very
presence will change that which is under investigation. Such practice is also, to varying
degrees, an interpretative procedure, subject to the researcher's bias,
ideological influences and worldview.

Similarly, it is often neither possible nor desirable to spend protracted periods of time in
a firm. Unless one secures good trust, observing those working can again stilt natural
behaviour. In the case illustrated later there was considerable difficulty, due to time
constraints and development pressure, in getting hold of engineers to interview (certainly
compared with managers). They were a scarce commodity. Also, the researcher's
gaze cannot be everywhere at once, and so one can only ever derive a fragmented
view of 'what is going on', a particular configuration of reality.

These problems have brought about a different kind of ethnographic research from
that which requires protracted periods of time spent in the field. Examples here are
the ethnographic interview (Spradley, 1979), the long interview (McCracken,
1988), and the long conversation (Silverstone et al., 1991). Some commentators on
'real world' research go so far as to see little difference between ethnography and other
interpretive styles of research, such as naturalistic inquiry and case study interviews
(Robson, 1993).

Yin (1989) notes that a case study investigates a contemporary phenomenon in its
real-life context. This is where the boundaries between phenomenon and context are
not clearly evident, and in which multiple sources of evidence are used. Mjøset
likens case study methods in social science to the work of courts and social workers:
“The common feature is that we isolate sequences of events towards an outcome as a
case because we have an interest in the outcome and thus also in the process. In
everyday life, the importance of the case might be quite personal; we make cases out
of particularly important chains of events in individual life histories.” (Mjøset, 2009,
p.46)

Perhaps most relevant to the present study, Yin goes on to suggest that case studies
are the preferred strategy when 'how' and 'why' questions are being posed, and when
the investigator has little control over events. Yin (1989: pp.10-11) writes:
"Ethnographies usually require long periods of time in the 'field' and emphasize
detailed, observational evidence . . . In contrast, case studies are a form of enquiry
that does not depend solely on ethnographic or participant-observer data." These
approaches may sometimes represent something more of a 'snatch and grab' or
'guerrilla' style of research activity, even investigative in the style of police inquiry.
But in the dynamic environment of high technology this is often the best one can do.

It is simply not possible in many cases to go as 'deep' as one may wish. And, as was
the case in the present study, one must use a 'bricolage' (i.e. Lévi-Strauss, 1968) of
information, interviews, e-mails, informal chats, news reports, company literature and
other sources to build a picture of what is going on.395

… There is no best way to tell a story about society. Many genres, many
methods, many formats – they can all do the trick. Instead of ideal ways to do
it, the world gives us possibilities among which we choose. Every way of
telling the story of a society does some of the job superbly but other parts not
so well … (Becker, 2007: p.285)396

The object of ethnography is similar to that of many other interpretist approaches, such as
hermeneutics, phenomenology, and so on. One looks for recurring 'themes' which
appear to be shared between members of the classification: "every particular is also a
sample of a larger class. In this sense, what has been learned about a particular can
have relevance for the class to which it belongs. The theme embedded in the
particular situation, extends beyond the situation itself." (Eisner, 1991: p.103)
Through highlighting these themes one can sample something of the view of the
'interpretative community' regarding some particular phenomena.

Such studies can differ immensely in approach and intensity, but they do represent a
commentary upon method. Positivist and reductionist studies can deconstruct already
reasonably understood and controllable phenomena, where design and hypothesis
predict outcome and relations; but where things are 'messy', difficult, and little
understood or controllable, one must adopt a more open, adaptable and receptive
approach.
"These [ethnographic] approaches are holistic in emphasis and are
fundamentally concerned with the context of actions: thus, the argument runs,
an action such as the viewing of television needs to be understood within the
structure and dynamics of the domestic process of consumption of which it is
395
This keeps with much of the thinking arising from the social constructivist studies of technology, a
good example being Latour's Aramis, or the Love of Technology (1996). Its contents were derived from
extensive interviews with the key people in government and industry that were involved in this
research and development effort. A great many documents were also reviewed and assessed. This
showed that what the technology 'was' varied according to the perceptions of the various interest
groups.
396
Becker, H.S. (2007) Telling About Society. Chicago: University of Chicago Press.

a part...television watching is, in fact, a very complex activity, which is
inevitably enmeshed with a range of other domestic practices and can only be
understood in this context." (Morley, 1992: p.173)

Clifford Geertz (e.g. 1973), regarded as a modern authority on ethnography,
introduced the term 'thickness' of description as a key concept in ethnography.
Derived from Gilbert Ryle (1949), this suggests that the more detail that is worked
into an ethnographic text, the more potential there is for a fuller, multidimensional
'reading' by the reader. In Works and Lives: The Anthropologist as Author, Geertz
(1988) acknowledges the role of rhetoric: " . . . their capacity to convince us that
what they say is a result of their having . . . truly "been there." (pp.4-5) But access,
and the 'reason for being there', is a definite problem.
'The way in which the participant-observer gains access to the field setting,
the problems and refusals encountered there, the shifts in their interpretations
as time goes by and they become more "experienced", are all important to
understanding the researcher's account. They specify the relationships that
underlie and "contextualise" the researcher's findings.' (Poole and McPhee,
1985: p.128)

It is commonplace in the literature of ethnographies, case studies, ethnomethodology,
action research and participant observation that 'observers' can take on a number of
different roles, each differing in terms of "distance" from the subject of study. Gold
(1958) distinguishes four of these roles: the complete observer, the observer-as-participant,
the participant-as-observer, and the complete participant. This
range of roles has an implicit, but often neglected, feature – they all assume that the
researcher has a definite theoretical model that guides the selection of the role and
how the role is put into action in the research situation.

The 'complete observer', while perhaps epitomising the open approach to research, is
often not practical or even desirable in the area of media ethnography or
organisational studies. Most researchers conducting ethnographic research have a
theory of what is important to look for in the research situation. This limits the types
and forms of explanatory concepts or mechanisms that are drawn out 'inductively' or
'retroductively' or by means of 'situationally guided hypotheses'. Put simply, the focus
of the research, although making coherent the 'chaos' of the real world, limits
perspective on what may be an important agent or factor in the creation of a
phenomenon or in a particular thought or act. I say this while acknowledging, as
Mjøset does, that "a pure description of the flow of events is impossible, and thus
'unconscious' principles of selection must be involved in any description." (Mjøset,
2009, p.55)

Agar (1980) notes that ethnography is both a process and a product and goes on to
state emphatically: "Without science, we lose our credibility. Without humanity, we
lose our ability to understand others (p.13)." Again it is the researcher who becomes
a catalyst for the research, as opposed to removing themselves from the research
process as in neo-positivist studies. As in phenomenology, ethnography has an
intrinsic reflexive character, which implies that the researcher is part of the world she
or he studies and is affected by it (as well as effecting various contingencies of that
world). As Atkinson (1983: p.14) has it, both extreme positivism and naturalism;
" . . . assume that it is possible, in principle at least, to isolate a body of data
uncontaminated by the researcher, either by turning him or her into an
automaton or by making him or her a neutral vessel of cultural experience."

Therefore, the observer and the system co-evolve together. This means that the
observer can see himself or herself as part of the system under examination. Each player in a
musical ensemble, for example, listens to each other player, and to his, or her, own
instrument [third-order cybernetics].

Charmaz (1995) notes that the field of sociology carried a strong tradition of
ethnographic research from its roots until the 1960s, when it lost favour to the rise of
sophisticated quantitative methods and the adoption of the logico-deductive model of
research. She sees this period as one that lacked new theory construction, as under
the logico-deductive approach studies were informed by some pre-existent theory,
and the subsequent establishment and proving of hypotheses were inevitably based
upon this. Such a view then delimited innovation in theory.

Glaser and Strauss (1967) were among the first to challenge this, through their
grounded theory approach. This aimed to generate new theory by bridging
interpretative analysis with positivist assumptions. But even here there were dangers,
which, Glaser and Strauss (1967, p.34) suggest, are in most instances the result of:

… believing that formal theories can be applied directly to a substantive area,
and will supply most or all of the necessary concepts and hypotheses. The
consequence is often a forcing of data, as well as a neglect of relevant
concepts and hypotheses that may emerge.
Grounded theory was largely influenced by the pragmatic philosophical tradition
with its emphasis on studying process, action, and meaning, and particularly the
symbolic interactionism of Mead (1932, 1934, 1936, 1938) and Blumer (1969) and
others coming from the Chicago legacy of ethnographic research.

Glaser and Strauss (1967) argued that grounded theory methods cut across
disciplines, and the approach contributed significantly to the foundations of the 'ethnographic
turn' (Moores, 1993) or the 'interpretative turn' (Geertz, 1973). It also draws us
closer to studies of context:

"If we trace a causal chain we need to include the context, and thus, we have
substantive grounded theory, the earliest definition. If we recognize patterns
in interaction across 'conditions' (contexts), we have formal grounded theory,
the most recent definition, in which conditions are unknown, i.e.
unspecified." (Mjøset, 2009, p.57)

This was the paradigm shift that, in this view, has emerged in social science research
since the 1970s. While Moores refers specifically to audience and media research,
scholars coming from a wide range of disciplines have also registered this move to
ethnography and ethnographic styles of research – for instance Morris Holbrook in
his discussion of evolving perspectives in the area of consumer research
(Holbrook, 1995). The shift from neopositivistic and hypothetico-deductive
methodologies, and "managerially relevant studies," was based on the realisation that
Holbrook and others:
" . . . had turned to an emphasis on the emotional aspects of consumption
experiences, our style of doing consumer research had moved fairly far away
from the traditional preoccupation with making discoveries for marketing
managers." (p.14)

This proposition is of interest to the case study presented here and to the larger
arguments regarding the interpretative paradigm. The reasons for, and the product of,
research – its 'use' – often denotes and delimits the available learning. That is, it
[consciously or by default] narrows the window upon what one can discover.
Interestingly enough, considering this proposition one can begin to see harmonics
with descriptions of technology. As a device 'opens some options' and 'closes
others', so method, its employment and purpose opens some learning and closes
others.

Ethnography in HCI research

In the field of HCI research there was a lag of some ten years before researchers
began to adopt a gradual focus on the contextual and cognitive determinants of use
(Carroll, 1989; Good, 1989; Maskery & Meads, 1992). The origins of this may be
traced to writers in the 1970s, such as Hubert Dreyfus (1979) and Joseph
Weizenbaum (1976). They assembled philosophical arguments about whether human
beings could ever be replicated by machines. Their claim for a unique cluster of
human attributes brought into focus the drive within artificial intelligence to replicate and
model human intelligence. This raised the question: what is human about human
beings? Close examinations of machine functioning in real-world settings saw
computers as rigid, often agents of power and bureaucracy, but not sentient in their
own right.

Such questions developed in the fields of artificial intelligence (AI) and also with
respect to the operational relations between humans and computers. A number of
investigators began to argue that interactions between people and computers are too
complex for overarching theories (for instance, Landauer, 1991), and others began to
argue that positivism, and the experimental basis of much HCI research, was largely
irrelevant to progress in design (Carroll et al., 1991; Pylyshyn, 1991). Around this
time, computer scientists began to express interest in ethnographic studies of the practice
of computer use. Such studies continually focused on human creativity and the local
nature of workplace exigencies, making universal, formal, rational systems a seemingly
impossible goal (Star, 1989). Included here should also be Turkle's (1984) pioneering
ethnographic study of children's interpretation of computers; Suchman's (1988)
studies of situated plans and actions; and Winograd and Flores's (1989) exploration of
the nature of computers and design. Forsythe's (1992) ethnographic work on
artificial intelligence research and Gasser's (1986) study of people struggling with
standardized systems at work also emphasised strongly how people always "work
around" the rigidities or remoteness of computing from human experience. Each of
these studies conveyed a sense of the limits of computers, of formal modeling, and of
rationalisation.

Carroll (1997) notes that by 1990 there was a clear consensus that the cognitive
modelling approach had failed to provide a comprehensive view of human-machine
interaction, and a "more socially or organizationally oriented approach was required
to supplement or replace the cognitive paradigm." (p.510) 'Ethnographically
informed design' (Bentley et al., 1992) rose to prominence in this climate. The late
1980s and early 1990s saw the gradual diffusion of more interpretative,
ethnographic-style positions in HCI research (most notably, Nardi, 1992; Holtzblatt
and Jones, 1992; Whiteside and Wixon, 1987b; Whiteside et al., 1988). Each has
significantly developed thinking and practice.

However, this subtle shift in perspective had been slow to take hold in development
communities. Only recently have the contextual perspective and its implications begun
to penetrate development citadels (Holtzblatt & Beyer, 1995; Holtzblatt & Jones,
1992; Hughes & Randall, 1992; Simonsen & Kensing, 1994, 1997, 1998;
Simonsen,1996). As the competition in today's computer industry grows fierce,
software manufacturers have recognised the need to bring technology to users not
only in terms and concepts that are easy to use but with a fundamental design derived
from an intrinsic awareness of the user's work practice and cognitive models
(Grudin, 1993).

Most notable were the development of contextual inquiry (Raven and Wixon, 1994) and
contextual design (Wixon, Holtzblatt, and Knox, 1990; Holtzblatt & Beyer, 1997).

Contextual Inquiry (CI) is a process first pioneered by Karen Holtzblatt in the early
1980s. CI provides a way, adapted from ethnographic research methods, to collect data
on users in their own environment. Pragmatically orientated to industry, it is intended to
fit the time and resource constraints of engineering environments (Holtzblatt &
Beyer, 1997).

CI enables the researcher to gather data from real tasks in the workplace. The basic
premise in the interview is to develop a detailed work model that is true to the user.
This model is generated through a series of observations and questions directed on
the immediate tasks. Contextual design was intended to address a number of the
inadequacies in previous methods by emphasising: interview methods conducted in
the context of the user's work, co-designing with the user, building an understanding
of work in context, and summarising conclusions throughout the research:
"Context tells us to get as close as possible to the ideal situation of being
physically present. Staying in context enables us to gather ongoing
experience rather than summary experience, and concrete data rather than abstract
data." (Holtzblatt and Beyer, 1990: p.47)397

The contextual method is a formal method of field research and data modelling that
begins with the observation of users within the context of the workplace. Users are to
be studied in their normal working context, as they execute ordinary work
assignments. Experimenters following the Contextual Inquiry technique observe
users working and record both how the users work and the experimenters' interaction
with the users. The contextual method is rooted in ethnographic or naturalistic
research in which the researcher becomes part of the subject's environment in order
to observe spontaneous, natural behaviours and interactions (Diesing, 1971;
Jacobson, 1991; North, 1987).

Research within the Cambridge Trial

397
Holtzblatt and Beyer define summary experience as the explanatory characteristic of summarising
or shortening accounts of an experience. This contrasts with ongoing experience where being with the
person as they operate or perform tasks shows the complexity of what one actually does in detail.
Similarly concrete data differs from abstract data in the way it offers detailed accounts of behaviour
as unique events rather than simplified routines.

Both quantitative and qualitative approaches to studying the trialists featured in the
case of the Cambridge Trial. The present study of the Cambridge Trial began in a
quantitative style, and was intended to use usability tests and survey schedules.
These would be subjected to quantitative analysis. However, under conditions of
rising complexity this gave way to a study that became entirely interpretative in nature,
consisting largely of open-ended case study style interviews with company personnel
and later trialists in their homes. (I explore this in more depth in Chapter 7.)
Interviews with company personnel provided the material of chapters 5 and 6, while
the interviews with the trialists are outlined in appendix 1 – The Cambridge Trialists.
A number of reports also fed back into the study that was conducted within the
Cambridge Trial (i.e. conducted by, and on behalf of, the working group on user
research).

This was the research on the trialists, designated first by Om (working in collaboration
with myself), and then subsequently under the jurisdiction of the working group
responsible for user research. This working group consisted of representatives of the
principal service providers (PSPs) and myself. It decided upon a multi-method
approach including the use of surveys, online questionnaires and system-logging. As
such it was a hybrid of quantitative, positivist and qualitative, interpretative
approaches.

System-logging

At the opening of this chapter it was noted that a kind of bias has arisen in private
sector social research towards quantitative approaches. One area where this has been
particularly prominent is audience research. By the 1950s, with the advent of
television, elements of media research spun out from academia to become a highly
profitable sector in its own right. The new private sector research organisations
became powerful intermediaries acting between television stations and advertisers.
They promised to certify the power of certain channels and programmes in their
ability to draw audiences, and therefore to serve as potent platforms for
advertisements. Such circumstances also came to restrict funding opportunities for
academic research from either media institutions or government.

The commercial organisations were wary of academic reformism in research practice, which still
remained strong (even under conditions of poor funding) and was "at least by implication
critical of the [private sector's] media practices" (McLeod et al., 1991: p.240). This
predominance of quantitative methods of research activity, most commonly known
as 'ratings' research, remains the major means by which programme schedulers
decide which programmes remain and which are dropped, as well as negotiate
revenues from advertising agencies. In negotiation of the potency of various
advertising slots there is no more powerful argument than statistically and
scientifically proven efficiency (this with respect to an industry sector that focuses
entirely upon the creation and manipulation of symbols).

A new vista in tracking both use and consumption has been opened by digital
communications systems such as the Internet and i-Tv. Any change in the internal
state of a digital system can be registered. Following lines similar to those found in
behaviourism, and in particular participant observation, are digitally-enabled
approaches aimed at mapping the frequency and duration of events of a similar nature
for the purposes of registration and calculation. Beyond registering and defining
autonomous internal changes of the system are those initiated by the user via the
interface. The interface allows the passage of stimulus from extraneous sources (i.e. the
environment via sensors, or the human world in terms of intentional button pressing)
into the realm of the technical system, where it changes its state. Registering these
changes of state can serve to monitor and provide a window on incidences of use.
By adding the other basic dimensions defining digital systems – the ability to store and
process data – individual acts of use (or changes in states of the system) can be
linked, sequenced and correlated statistically, leading to the development of
inferences regarding patterns of use (usage).
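
To make this registration and linking concrete, the following sketch (written in Python purely for illustration; the event fields and the thirty-minute session gap are assumptions of the example, not details of the Cambridge system) shows one minimal way in which individual state changes might be recorded and then linked into sessions:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Event:
        household: str     # set-top box / household identifier (hypothetical field)
        timestamp: float   # seconds since some reference point
        action: str        # e.g. "menu_open", "play_video", "purchase"

    def split_into_sessions(events: List[Event], gap: float = 1800.0) -> List[List[Event]]:
        """Link individual acts of use into sessions: events from the same
        household separated by less than `gap` seconds form one session."""
        sessions: List[List[Event]] = []
        for ev in sorted(events, key=lambda e: (e.household, e.timestamp)):
            if (sessions
                    and sessions[-1][-1].household == ev.household
                    and ev.timestamp - sessions[-1][-1].timestamp < gap):
                sessions[-1].append(ev)
            else:
                sessions.append([ev])
        return sessions

Counting and sequencing the actions within each session then provides the raw material from which inferences regarding patterns of use can be drawn.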

Networked media technologies, whether a computer/modem or cable/satellite based
interactive television system, provide ways by which 'interested others' may 'people
meter'. This is one of a range of kinds of user-consumer data that may be gathered via a
digital communications system. Other techniques include online questionnaires, the
use of vignette-style quizzes, and direct one-to-one communication online.

A domestic i-Tv system, through logging on and accessing its server, can provide
information regarding the length of use of the entire session and of its component
parts - i.e. the proportion of session time spent on games as opposed to news etc. It can
provide data concerning the interaction style of particular user households (high
interaction, low interaction styles), which may indicate which services are most
frequented within the household and so on (see below).

Table 4.1 Sys-log and its data


Activity: Television on-off times
Data generated: When? How often?

Activity: Shows and advertisements watched
Data generated: Which were viewed in part, which were watched in their entirety? Did viewers hit 'tell me more' for further information?

Activity: Electronic Programme Guide (EPG)
Data generated: Which lists and which features were explored? How did this compare with what was subsequently watched?

Activity: Items purchased online
Data generated: What? From whom? From where (i.e. which ads)? Completed purchases against attempted purchases.

Activity: Services used
Data generated: Which, and how often?

Activity: Forms and surveys
Data generated: Viewers' requests, quizzes, etc. Which incentives drive responses?

Activity: Games
Data generated: Which games, format, genre etc.?

When the above data is combined with the information which is provided on sign up
(which may include income, profession, number of children, newspapers taken, other
products owned, hobbies and pursuits and so on), 'interested others' have significant
insight into household constitution with respect to tastes, preferences and
consumption.
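
As a purely illustrative sketch (the household identifier, field names and values are invented rather than drawn from the trial), such a combination amounts to little more than a keyed join of the two sources:

    # Hypothetical sign-up record, keyed by household identifier.
    signup = {"hh042": {"income_band": "C1", "children": 2, "newspaper": "local daily"}}

    # Hypothetical summary derived from sys-log data for the same households.
    usage_summary = {"hh042": {"minutes_per_week": 310, "top_service": "games"}}

    # The joined profile is what gives 'interested others' their insight.
    profiles = {hh: {**signup[hh], **usage_summary.get(hh, {})} for hh in signup}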

However, it was technically uncertain whether individual users would have to log on
separately into the Cambridge system, or to different services. Were this
made possible (as indeed it is with personal digital assistants and mobile
communicators) there may be some potential for asserting use and usage at the
individual level. Until then use will be at the household level of analysis (which will
need augmenting with ethnographic style research to allow for reasonable
assumptions to be drawn regarding service/content preference at the individual
household member level). An example of the sys-log data as used in the Cambridge
system is provided in appendix 3.

Tracking of system use has formed a major part of experimental-style usability
testing and HCI since the advent of dumb terminals connected to mainframe
computers. Quantitative measures of usability have set the definition of what is
meant by "ease of learning" and "ease of use" (Dumas and Redish, 1994). Dumas and
Redish recommend 'planning your observations' for the following major reasons:
"... in a typical test, a great deal is happening very fast. If you don't think
about what you want to focus on, if you just go into the test "to find out how
the product is doing," you may miss critical information that you need to
understand how to fix the problems that people are having."

Also:

"you need accurate counts of relevant measures to know how serious


participants' problems with the product are. You need the numbers to
convince managers and developers that the problems are real, serious, and
need to be fixed." (p.184)

Lab-based usability testing usually involved the setting of particular tasks, whose
completion was monitored by the tester in terms of 'performance measures'. 398 The
recommendation is the use of quantitative 'performance' measures to:
1) Focus and plan observations to locate 'critical' information.
2) Develop relevant measures by which to evaluate the 'seriousness' of
problems.
3) Provide 'convincing' evidence which will justify your observations.

There were also entire ranges of performance measures based on a test participant's
emotional state while experiencing problems and faultfinding (whether they
experienced frustration, confusion or satisfaction). There would also be a range of
measures with respect to manual help, or calls to a help-desk for aid. For the most
398
Dumas and Redish (1994) note that performance measures in usability testing usually include the following:
 time to finish a task
 time spent navigating menus
 time spent in the on-line help
 number of wrong menu choices
 number of wrong icon choices
 number of times online help was accessed

part most of these are irrelevant, or at least much more difficult to isolate to any
meaningful degree when tackling the informationally rich context of home-based
entertainment orientated systems.

Registering how much time people take, say, to access a certain page, restart a video
game, return to a 'home' page, or navigate through various screens, how many times
they 'zap' or 'zip' through a video, how much time they spend deciding on
purchasing goods or services, how many times they returned to a page illustrating a
good or service before purchase, and so on, allows a much wider spectrum of use data
than was ever before possible via 'people meters', traditional market research
techniques or usability testing. Implicit within the use of the system are
behavioural data that are as valid for on-line consumer research as
they are for evaluating usability.

What sys-log files capture is very much akin to what was available to those usability
researchers who went beyond simple observation into the tabulating of keystroke
information gathered through use of a system. This information once gathered can be
used to tabulate occurrences of button presses on the remote control and then analyse
their sequences. With respect to the sys-log data derived from the Cambridge trial,
there remains the problem of who in the household was using the system at any one
particular time. Access codes were issued at the level of the household, and these
were often shared between, for instance, family members. This meant that any usage
data drawn from the system could only show the usage of that particular STB and not
the usage of the system by any one particular person. The company discussed the
matter of supervisor user access codes in the early stages of development. The main
driver was to restrict access to adult material by children in the household. However,
due to the situated use of the STB in shared areas of the house this was looked upon
as non-feasible. The belief was that it was impractical to implement a system in which
a person wishing to log on to a particular service would first have to request that
another user log off. This raised the concern that people would find such a
feature cumbersome, and would probably ignore it anyway, preferring instead to
log on once and let others use the system as it stood.
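
A minimal sketch of such tabulation, with invented set-top box identifiers and button names, might look as follows; attribution remains at the level of the STB, for the reasons just described:

    from collections import Counter

    # Hypothetical, time-ordered log of (stb_id, button) presses from one evening.
    log = [("stb042", "guide"), ("stb042", "down"), ("stb042", "down"),
           ("stb042", "select"), ("stb042", "play")]

    # How often each button was pressed on this set-top box.
    press_counts = Counter(button for _, button in log)

    # Frequency of consecutive pairs of presses, e.g. how often
    # "guide" is immediately followed by "down".
    buttons = [button for _, button in log]
    pair_counts = Counter(zip(buttons, buttons[1:]))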

As will be illustrated in the later chapter outlining the experience and process of the
research, the production of quantitative sys-log data seemed to be privileged over
qualitative data. This may have been due to a number of reasons: the institutional bias
of the particular department at NOP Research, which had taken the chairmanship of the
user research group and whose main media research approach was large-scale survey
work and aggregate statistics; and a desire within the group to explore the parameters
of what sys-log could do as an innovation in itself.

There is a definite seduction in system logging as a means of automated marketing
and consumer research. Turnstile counters at sports grounds, counters tracking traffic
across a bridge, or counters detailing the production of widgets from a production
line show the obvious benefits of the automated registration of quantities. A further
reason, already referred to earlier, is the way in which quantitative data and their
subsequent communication by means of figures and graphs can often appear 'more
scientific' and carry more persuasive power than the 'stories' and 'scenarios' of
qualitative research (Holbrook, 1997).

Nevertheless, knowledge derived from system-logging has led some commentators,
such as Regis McKenna (1995), to suggest that a new era of personalisation is to
emerge which "harkens back to the days when the butcher, baker, and candlestick
maker knew their clientele personally . . . A cabinet maker designed and
manufactured by interacting constantly with the customers. The customer got
immediate responses and assurances." (p.90)

One of the problems with sys-log in various systems – credit cards, interactive
television, supermarket transactions and reward cards, banks - is that it produces a
massive amount of data, which leaves many organisations at a loss regarding how to
process this.399 Today, companies such as Facebook and Google have it that the
consumer is the product, with their personal information to be gleaned and sold to the
399
This is not only a problem for new media. Apparently it is a common problem for VISA
transactions and Reward Card data as well.

real buyer: marketing and sales organizations trying to sell to those consumers.
However, at this time, there was a distinct lack of proper techniques to analyse,
package and sell massive amounts of data (something in the region of 5 terabytes -
a terabyte being 1,000 gigabytes, or one million million bytes - of anticipated data a year). A suite of
techniques, collectively known as data mining, offers one possible solution. Data
mining uses machine learning and data visualisation techniques to facilitate and
automate pattern recognition in large data collections. However, datamining was at the time an
immature, unstable market, with the technology and its understanding needing to be
perfected by business analysts.400 Some 80% of a datamining process involves resource-intensive
data pre-processing, as reflected in the Gartner Group's five-stage model.401
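
As a purely illustrative sketch of the kind of pattern recognition grouped under the data-mining label (the feature set, the figures and the use of scikit-learn's k-means are assumptions of the example, not tools used on the trial), per-household usage profiles derived from sys-log data might be clustered as follows:

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical per-household profiles: minutes per week spent on
    # [games, news, video-on-demand, home shopping].
    profiles = np.array([
        [120, 10, 300,  5],
        [ 15, 90,  60, 40],
        [200,  5, 250,  0],
        [ 10, 70,  30, 55],
    ])

    # Group households with broadly similar usage patterns into two clusters.
    model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
    print(model.labels_)  # one cluster label per household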

Consensual approaches to research

"A computer system does not itself elucidate the motivations that initiated its
design, the user requirements it was intended to address, the discussion, debates
and negotiations that determined its organization, the reasons for its particular
features, the reasons against features it does not have, the weighing of
tradeoffs, and so forth. Such information can be critical to the variety of
stakeholders in a design process: customers, services and marketers, as well
as designers, who want to build upon the system and the idea it embodies.
This information comprises the design rationale of the system." (Carroll,
1997: pp.509-510)

One final point relevant to methodology, which can be understood within the context
of the discussions raised in chapter 2 regarding the complexities of social
constructionism applied to technology development - is how a research approach can
be produced and implemented as a product of consensus effort. Each firm in a
consortium can have very different views of who the researched are, and what they
would like to know from them.

400
Erick Brethenoux, research director for advanced technologies at the Gartner Group, Paris (as
quoted in Gerber, 1996)
401
The Five-Stage Process of Implementing Datamining (ibid.):

1. Select and prepare the data to be mined.
2. Qualify the data via cluster and feature analysis. Clustering and segmenting data reduces complexity in order to manage the data to be analysed.
3. Select one or more datamining tools.
4. Apply the datamining tool to achieve knowledge discovery.
5. Apply the knowledge discovered to the company's specific line of business to achieve a business goal (i.e. to make more money).

This raises substantial problems in producing a relevant and coherent research
approach, with dangers of 'overloading' research participants and presenting
them with too many incoherent research approaches (again, something which is
dealt with in more depth later in Chapter 7). This is an important aspect which lies at
the core of the CU approach - that it is important to understand use from the design
perspective, as much as from the distribution and end-user-consumer perspectives.
All these contribute aspects of what the product essentially 'is'.

There were two obvious stakeholders with respect to knowledge concerning the
usability of the system. First were the hardware developers. Next were those
developing the content material. The usability of the remote control was only relative
to the usability of the on-screen controls and nature of the content material. This
suggests that usability research on one aspect of the technological system - in this
case the remote control - was dependent on the various characteristics, attributes,
features and functionalities of other parts of the system - i.e. the nature of the content
material. This will, in turn, have implications on the choice of method, as well as
separating out data most relevant to those involved with hardware or content design,
as well as perhaps some general design rules governing how they interrelate and
combine in the interfaces.

Chapter discussion

In the 'information' age, there is a real value placed upon knowledge. Knowledge has
been to a large extent reified in many discussions during the 1990s, both in academia
and in the private sector, regarding what it is, how it should be identified, stored, and
applied. But it remains that competitive advantage is being built on the ability to
constantly collect new information and process it meaningfully, i.e., to digest it and
create new knowledge.

This new knowledge must then be transmitted throughout the firm (Imai et al, 1985;
Takeuchi and Nonaka, 1986; Nonaka, 1991; Nonaka and Takeuchi, 1995; Leonard-
Barton, 1995; Spender and Grant, 1996; Sweeny, 1996; Teece et al., 1997). Amongst
the various kinds of knowledge flows within and without the firm, one remains of
clear value - the ability to know one's customer and what they might do with one's
product and service.

In proposing a set of principles – i.e. contextual usability and sociotechnical
constituencies – for conducting and evaluating interpretative field studies, proponents
of interpretive approaches may see a violation of the emergent nature of
interpretive research, while others may think just the opposite. Within this debate I
adopt a middle stance, what may be considered a pragmatic position. While I
agree that interpretive research does not subscribe to the idea that a pre-determined
set of criteria can be applied in a mechanistic way, it does not follow that there are no
standards at all by which interpretive research can be judged.

Research practice, considered as an act of communication, transfers and translates
relevant (or in some cases non-relevant) data and information regarding a
phenomenon, from the site of occurrence or perception, to interested 'others'. It is a
process of reification, where data and information are converted into knowledge: "by
transforming the recipient's state of knowledge, information turns into knowledge –
as matter may convert into energy." (Ingwersen, 1992: p.126)

Brookes (1981) also uses the physical metaphor when he proposes the field of
informatics as the fundamental social science. He regards knowledge as "information
that modifies a knowledge structure in any way," and defines a knowledge
structure as "a live, information-seeking entity always striving to modify itself to be
in a dynamic equilibrium with the information it is receiving." (p.21) Orna and
Stevens (1995: p.16) suggest that: "the essence of the searching and investigating
that goes on in research is transformation." They also indicate that this process
requires a blend of craft and science, in that two distinct modes of thinking must be
employed. This communication may manifest through interactions with the physical
product itself (i.e. arising from situations of use), the wider social systems which
enable it to deliver service, or through the myth-making and cultural connotations
that surround it (i.e. through advertising, or being attributed with a certain cultural
capital, as a subject of standards). As Neil Postman has it:
" . . . to a man with a pencil, everything looks like a list. To a man
with a camera, everything looks like an image. To a man with a
computer, everything looks like data." (Postman, 1993: p.14)

Sys-log data, except in the case of single-person households, can only provide reliable
usage information at the level of the household. While it is easy to capture such data, as it
is an artefact or by-product of system usage, its meaning is questionable and may be
the subject of conjecture by those who 'wish to know'.

Reducing 'noise' in the data – for instance, removing from the analysis inaccurate
button presses, or what may be a young child repeatedly pressing a button – can
allow some reasonable conclusions regarding choices to be drawn. But it remains
that what can be inferred is limited by what is shown and what there is available to
interact with, and so such data cannot properly serve to guide innovation of service
and system. In the cold light of day it remains that this is still aggregate statistical
information based upon the availability and selection of pre-defined choices. This is
crucial within the frame of technological experiments and marketing trials. The
system, as in the case example, may be incomplete and lack proper functioning,
which will always in the first instance constrain user behaviour and channel actions.
These are most definitely explicit examples of how technology can 'open' and
'close' options. To draw inferences from predominately quantitative studies of such
phenomena, with or without proper allusion to the limits of the system, will be biased
and of limited practical value.
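
A minimal sketch of one such noise-reduction step, under the purely illustrative assumption that rapid repeats of the same button within half a second can be treated as accidental or playful rather than deliberate:

    def remove_repeat_bursts(events, min_interval=0.5):
        """Drop presses of the same button that repeat within `min_interval`
        seconds. `events` is a time-ordered list of (timestamp, button) tuples;
        only the first press in a rapid burst is kept."""
        cleaned = []
        for ts, button in events:
            if cleaned and cleaned[-1][1] == button and ts - cleaned[-1][0] < min_interval:
                continue  # treat as part of the same burst
            cleaned.append((ts, button))
        return cleaned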

Considering such data will simply return managers and engineers back to relying
upon speculation. They will have to conceive what the outcomes would be if ‘things
were different’ – that is, if the system were fully functional. High frequencies
registering certain menu choices over others (very much like website 'hits' or TV
'ratings') provide no insight into why a choice is popular. It may be because of
usability or information architecture (making some choices easier to locate than
others), or it may be to do with the quality or relevance of the content. Only by
asking people does frequency of behaviour make any practical sense, or serve as a
seat for learning and innovation.

In the first chapters considerable attention was placed upon feedback loops as a
means to learning and innovation. It was felt on the Cambridge trial that the system,
a communication media system, could provide new ways through which to conduct
market-user research. It was believed that sys-log was critical to this new power, as
were other propensities such as online questionnaires. Each would reside within the
system, and be processed into knowledge within the system. That is, statistical
processing would become a sub-propensity of system management, much as a
website can automatically record and report upon 'hits'. But where interaction
possibilities are limited, and not infinite, it is impossible to deduce accurately why
something succeeds or fails. Deeper social research, outside the system, is required
for that.

The notion of innovation in content is a goal and challenge to the establishment of a
new medium. For instance, this enters into the argument put forward by the BBC
when they justify the licence fee through their production of advertisement-free
'quality' television. For anyone who has visited the US, comparisons between the
quality of the BBC and advertising-intensive commercial television in the US are
obvious. And no channel, regardless of the ratings, would choose to screen, even if it
were possible, non-stop versions of the soap operas which currently dominate
broadcast television simply in order to build ratings. Content, as with new
products and services, will always have a 'suck it and see' style of risk associated
with its development and diffusion. Content such as soaps, and products and
services dedicated to particular functions, have a relation with, and are to a greater or
lesser degree defined against, other programming types. It is from this that they also derive their
meaning. A new car competes based upon revised features and functions
contextualised against other cars in its price/quality category. But where there are
limited analogues there are fewer defined hooks for the user-consumer to link the new
with the old, and there are more 'grey areas' which need to be explored.

Very much analogous to the manufacturing sector, the notion of 'continuous
improvement' must allow for a free-market ethos of change, variety and choice, as
part of the environment which lends value, meaning and context to a product, and to the
act of viewing, using or consuming. But the dominant logic of a mass media mentality
applied to the new interactive forms still dictates that there is 'one best way' which
can be found out through the technologising of audience research. These
technologies are a form of husbandry of consumer-users, and this was very evident in
the approaches adopted by the working group responsible for user research on the
trial.

System-logging is potentially a powerful means by which to capture use and usage
data. It is gathered at a premium by web companies who, like the television
networks, wish to verify their 'hit rates' – the number of times web users come to
their sites. And as with television, such data translates easily into advertising
revenues. But how much time do people spend on their computers compared with
watching television? How many people are accessing the web compared with
watching television? Television's mass market potential is again raised, but this time
with respect to developing consumer-user intelligence. Encountering and tackling the
issues of exploiting sys-log data, provided the chief motivation for the private sector
social research agency to get involved in the trial.

The ubiquitous television set will have a connection to someone else's computer. Is
this an invasion of privacy? Burke (2000) suggests a dystopian view where i-Tv
creates 'experimental conditions' in the home:
" . . . machines that control your TV set will show you something, check to
see how you react, and then show you something different. That's not just
convenient. It's a loop of stimulus, response and measurement as carefully
designed as those boxes where rats hit buttons to get food and avoid electric
shocks."(p.6)

In this chapter I have argued that we need to move away from the prerogative of
categorising and labelling people, and to properly consider the relevance and place of
the types of knowledge and understanding firms are after, and the ends they will
serve. This inevitably implies that we need to re-consider our own role as
researchers. Biases and preferred ways of knowing should first be made explicit, so as
to begin to speak a new language in which overt subjectivity, rhetoric, argument and
metaphor play a central role.

Approaches such as critical phenomenology, naturalistic inquiry, and hermeneutics
begin to address some of the dilemmas that were raised in the present study. I have
suggested that these approaches maintain elements of action research and
phenomenology that use qualitative research methods (such as field research,
descriptive research, and ethnography), in which the role of observer/participant and
interpreter was played out. Contextual usability in this sense was the explicated
pre-understanding which guided the case study interviews. Research becomes more
than a data collecting activity, in that it actively seeks to understand as well as to
improve the product, its various social constituencies and organisations, through
simultaneous action and reflection with all parties involved. A researcher has the
moral obligation to work for and with the participants in his/her study in order to
justify his/her presence in an innovation project. Design evaluation research should
have a pedagogical end in the sense that the participants somehow benefit from the
research. If it does not, is it truly 'good' design? Research should not just be an
attempt to learn about people, but to come to know with them the reality of products
and services that supports and challenges them as they go about their daily
existences.

The methods employed in this study had to constantly adjust to the exigencies of the
trial. Of particular note here were the constantly evolving technical potentials and
problems of the system, as well as my relations to its stakeholders. The methodology also evolved
to cope with developments in the theoretical and epistemological aspects of the
study. There was constant appraisal of the most relevant way to extract knowledge
out of the dynamic conditions of the trial.

There are many problems which are bound to plague 'real world' research projects.
As research tends to 'smooth out' and make coherent aspects of what can appear as
the chaos of the 'real' world, likewise theory “is supposed to be a guiding light that
orders observations and imposes pattern on an overwhelmingly complex world”
(Poole and McPhee, 1985). Unlike the controlled conditions of the laboratory,
research within firms, or within people's homes, is subject to a variety of influences
and mitigating factors which can obfuscate and confuse important events and
occurrences. Many of these may have more importance to the design of a technology
than what may be hypothesised, or understood by the frames of traditional marketing
or usability research.

Pragmatically, a contextual inquiry of domestic technology use invariably entails
some form both of smoothing (the chaos of the real-world setting - gaining access,
commercial sensitivities, acquiring trust whilst maintaining an ethical position) and of
looking for patterns. It inevitably involves fieldwork: visiting people's homes,
conducting interviews on company premises, visiting i-Tv installations in
museums, and taking part in round-table discussions (all of which were undertaken in
the present study). While this can be privy to the widely understood problems of
implementing fieldwork generally - i.e. subject, time and resource management,
logistics, and other factors which can 'mess' things up, such as restrictions due to
the commercial sensitivity of the subject - it remained possible to explore some aspects of the
unexplored territory of domestic use of new media.

The underlying ethos of the present study was exploratory. Interpretative research
such as ethnography is holistic and contextual in that it seeks to place
observation and interview data into a larger perspective: "A central tenet of
ethnography is that people's behaviour can be understood only in context; that is, in
the process of analysis and abstraction, the ethnographer cannot separate elements of
human behaviour from their relevant contexts of meaning and purpose." (Boyle,
1994: p.162) But ethnographies can consist of a blend of both positivist and
interpretivist approaches, working in tandem to permit insights into the use process. Table 4.2
below outlines how experiential qualities may be mapped in a contextual
usability study.

Table 4.2 Components and attributes of CU

use
  Who: Gender; age; status; relation to other household members*; technical proficiency (VCR, computer etc.); level of technological/computer anxiety; IQ; personality.
  What: Current media tastes (print and electronic); attitude to technology (particularly domestic technologies); media technologies individually possessed; access to household media technologies; anticipations of the i-Tv system's features and functionality.
  When: Usage patterns of media consumption; overall leisure activities (focusing particularly on the position of media in the context of other activities).
  Where: Where current media technologies are used.

usability
  Who: Which type of person finds the system easiest/hardest to use?
  What: Is the system easy to access on initial confrontation? What role does attitude to technologies have on usage?
  When: Do lifestyle and leisure patterns affect usability?
  Where: What role does the location of use have on usability?

usage
  Who: Do individual differences relate to styles and patterns of use?
  What: How is usage of the system affected by domestic sociology (including social dynamics, censorship and rules concerning use)?
  When: What is the play-off between leisure activities, hobbies etc. and use of the i-Tv system?
  Where: Do certain locations lend themselves to the formation of particular usage patterns?

usefulness
  Who: Which type of person derives the greatest benefit, satisfaction and enjoyment from the system?
  What: What is the perceived value of the i-Tv system compared with other pursuits?
  When: How does the use of the i-Tv system develop and change over time? How does it develop with the addition of a new on-line service?
  Where: How easy is it to assert usefulness on initial confrontation with the system (i.e. at point of sale)? Will certain services be more highly valued in certain places (i.e. word processor in living room, VOD in bedroom etc.)?

* apart from using self-report methods to assert previous media exposure habits
** assuming, for instance, that the individual is a member of a household consisting of more than one member
In addition to the examples mentioned above, such as limitations of the
demonstration Set Top Box technology402, and desires to maintain commercial
secrecy, there are those particular human, social and cultural factors which dictate
their own variety of problems hindering the flow of constructive knowledge from
trial enterprises. The selection and use of a relevant subject sample for laboratory-
based usability testing, or indeed a user analysis, of a new spreadsheet package may
be meaningfully conducted with a random sample of spreadsheet users. They are
inherently a select and distinct category of users who can easily compare what they
know of a previous version, or similar product, with what is presented to them. Their
experience of using dictates their predilections to what is different, what may be
valuable, desirable, useful.

Nevertheless the testing of what may be considered more radical innovations often
lacks a suitable, clearly distinctive or even reliable existing user-consumer base. This
makes it much more difficult to incorporate 'average' users or consumers, in order to
suggest user- or consumer-centred opinion on perceived value, added advantages,
improved qualities, increased performance, and/or enhanced modes of operation and
attractiveness.403

Conversely, technologies enjoying wide diffusion and use, such as television, as well
as the attraction of manufacturers of consumer electronics towards new products for
the mass market, pose a particular problem for testing, particularly the notion of
choosing a sample.404 Should such technologies be dedicated and designed for use by
any television user? Or will separate functions of interactive television services, for
instance games, education programmes, be dedicated to a certain age group with
cooking programmes, DIY etc. dedicated to another? The testing of such
technologies becomes more problematic depending on who is chosen to use; which
particular functions; and which particular content material. Making television
402
I was given a demonstration set top box which was ‘free-standing’ - it produced its content from a
hard drive. It was capable of only offering limited links and avenues via its menus. It did, however, act
as a functioning prototype which could be used for the purposes of usability testing.
403
This aspect of innovation has produced the largest problems for processes such as QFD, where the
prescribed methods of attaining the ‘voice of the customer’ rely on them distinguishing qualitative
differences between new product lines and categories from those under development.
404
I have not discussed the process of sampling here but it is an integral part of many [experimental
and survey] research projects, and possesses its own set of methodological problems.

interactive may mean very different things to different people. Television as it
already stands, as will be further explicated in the following chapter, does
not necessarily have a fixed place, a fixed set of meanings in lives and homes, and
certainly not meanings directly predicated by demographics, or socio-economic
indicators.

Adopting a very broad outlook we may suggest that use orientation and function of
television is different within people's lives than, for instance, their use of computers.
For the most part, and relevant to the sort of discussions in this chapter, the use of
television is for entertainment, leisure and relaxation; its use is less focused, and it may
be watched in a non-specific way. Alternatively, the use of computers tends to be
very task orientated, very specific and very focused. The notion of what people 'do'
with media information is elusive, non-testable, difficult to qualify and quantify.

However, it remains that one may easily register use - in the form of the pushing of
buttons, navigation etc. - through a networked system. Such registration may show
which household switched on when and what they did in a session, but interpretation of
such data is incomplete without some perspective on what such use meant during the
session, and how it may relate to the contexts of the consumer-user's other tasks,
activities, pastimes, beliefs, tastes, attitudes etc. The on-going debate in computer
supported co-operative work (CSCW) tends, as do many HCI studies (including
Contextual Inquiry), towards how the system may genuinely enhance social tasks and
functioning, as well as how the system acts as a suitable medium for communication.
This is something that may perhaps be measured purely in productivity terms, but
ascertaining how 'entertaining' or 'absorbing' a programme is presents a much deeper
problem of interpretation.

The use process provides a crucial pivotal point in the analysis of the appropriation
and consumption rituals and habits which form through and around the use of
technologies. It provides a focus for analysing the meanings that a technology evokes
throughout its creation and domestication. While working on the premise that
technologies do not fall upon us from Mars - that they are the products of shared social,
cultural and economic contingencies - contextual usability should offer some
explanation of what drives and encourages their development and use. The use
process may be broken down into elements which define appropriation, consumption
(purchase), consumption (use), negotiation (shared use), and interpretation (of
content).

The use process is also a mosaic of developmental elements that define and shape
usage patterns, expertise and ability in use, and attitudes regarding the
usefulness of the product. A focus on the social, behavioural, cognitive and cultural
factors influencing (or equally, not influencing) use carries something of the notion
of the 'everyday'. It also carries over the idea of making the unusual (i.e. the new
product entering into the arena of a person's everyday life) into the usual (a product
which has successfully meshed into the affairs, routines, tasks, and requirements of
quotidian life). As such, CU encroaches upon a number of existing and emerging
research areas which have developed theories and research methods regarding
product use and consumption. In particular, recent developments in consumer
psychology, media and communications studies, audience research and cultural
studies can now offer relevant insights into use processes. These methods now often
feature the inclusion of more new paradigm social perspectives on product-user-
consumer integration and can extend knowledge of product over that generated by
the normative quantitative research paradigms of traditional marketing, consumer
research and usability engineering.

Conclusion

In an age when the user is becoming a greater focus for producers, creators and
manufacturers of content, technologies and services, it remains that there must be
some degree of innovation in methodologies aimed at enriching understanding of the
user-consumer. Brown et al. (1989) argue strongly that knowledge is situated, being
in part a product of the activity, context and culture in which it is developed and
used.

It seems reasonably clear that beliefs, attitudes and perceptions are best explored
through the more interpretative 'new paradigm' research approaches, whereas actual
consumption and use behaviours are best illustrated through the emerging
technologically enabled means of tracking use. Put together such a combined
approach should enable more accurate and broader understandings of what people
are actually doing with the systems, and what this use means within the broader
context of their everyday lives.

It is the basic premise of this book that such multi-method research is the obvious
answer in exploring the use of on-line interactive media. However, how these
methods are employed, and how they and their results may combine in an informative
way, still remains a problem. It has been recognised that: "Researchers in
management of technology often seek to frame studies that simultaneously address
practical issues and more general questions of interest to academic colleagues." (Von
Hippel and Tyre, 1996) This has long been recognised as a problem within the social
and human sciences generally. The splits between those advocating one particular
method over another confuse the potentialities of combining methods under a
broad theoretical 'pigeon-holing system'. CU focuses on the event and situation of
use. It aims to create a 'meeting place' where various data may be brought together,
where the case study may draw upon different modes of inquiry (i.e. sys-log,
evaluation research) to create 'thick' descriptions.

PART TWO
CASE STUDY AND CONCLUSION

Chapter 5 – Acorn and their technology

"Built by engineers, used by normal people"


(Hewlett-Packard advertising statement)

Introduction

The following chapters focus on a study of Online Media (Om), a division of
Acorn Computers Ltd., and the project which came to dominate their activities – The
Cambridge Interactive Television Trial. Several interviews were conducted over the
period from the firm's inception in July 1994 to the time of the completion of the trial
in April 1997, when the firm started to dissolve. Several visits were made to the
company's HQ in Cambridge Technopark, where managers and designers were
interviewed, to advise upon user research and also to take part in user research
meetings. These formalised interviews are augmented with various informal
communications, e-mails, and appropriation of company literature such as operating
manuals, press releases, and web site details.

The focus of this chapter is mainly upon the technology, in particular, software and
interface development. It also considers Om as an organisational entity that arose
from Acorn plc.

As such this chapter reiterates an important point about technological development - that
new innovations invariably arise out of previously existing components,
expertise and knowledge. That is, they emerge as configurations of historical
phenomena: they represent ordered principles deriving from an often chaotic
mix of elements that are consciously brought together, and which in the beginning
have only partial 'fit'.

Fit or lack of fit may be witnessed on many levels – hard technology and software
where incompatible components are made to operate. In this case there is much
‘learning by struggling’ as engineers problem solve how to not only make things
work but make them work well, make them robust and reliable enough for a
consumer product. Fit also acknowledges lack of, or gaps in, expertise. This is where
knowledge is needed to make components and interfaces fit and operate properly in
concert. It also concerns itself with organisational and governance issues. This
recognises that the technology, and the institutions and organisations which support
and create it, must link sympathetically with the other organisations who will use and
consume it. Fit in this case is always the product of prospects and propositions
within and without the firm.

Out of various kinds and levels of social and technical interactions, fit and lack of fit,
come iterations, paradigm shifts, innovations or 'bifurcations'. These in turn drive the
need for further technologies, social contingencies and/or knowledge. They highlight
and address ever new 'gaps' which must be tackled in order to realise robust,
satisfying and enticing systems for end consumer-users. Addressed, they can motivate
or focus design effort and financing; ignored, they significantly raise the risk of
creating an unusable product or one totally out of sync with market needs or trends.
Fostering knowledge of interaction and of various kinds of feedback is thus critical to
risk management in firms charting unexplored territory such as that
offered by the propensities of new communication networks. Failure in the following
example shows how even being self-critical regarding issues arising from interface
design was not enough to prevent it.

The case as illustrated in both this and the following chapter is poignant as it shows
clearly both the generation of a technical system with its discrete elements and
components, and the generation of various levels of supporting organisations, i.e. its
social dimensions. All large-scale technical systems possess intrinsic social
dimensions. A large-scale nuclear power plant is only built if it is forecast that
consumers of electricity will use the power it generates. A large-scale transport
system needs passengers to ensure its sustainability and return on investment. A
media and communication system relies upon people not only to make it operate
and work, but also to populate and animate the network. Not only do they
consume media from some central source or server, they also generate information,
in terms of their use of the system being logged, or by filling in online
questionnaires. This interaction, its nature and possibilities, is shaped by design, and
moreover by the organisations which denote this design. These are themselves subject to
influences from wider industry trends and the actions of competitors.

The example presented here also shows that, through their interaction, as well as the
influence of wider, external opportunities and constraints, the technology, its function
and purpose radically changed without coming to market, with little input from
end-users, and little from prospective commercial users of the system. It shows how
user input into this change was marginal, depressed under changing organisational
and commercial pressures.

This is perhaps most clearly shown in the development of the user interface, although
it manifestly is best understood through the radical functional attributes of the
system. The system changed functionally from one that would not incorporate
Internet access, to one whose sole functioning was Internet access.

Nevertheless, from a design perspective, the interface is of course the site that most
ostensibly straddles the realms of content and hardware. I have already suggested
that this is a site of intense activity, both in design and in use, and as we will witness
in this chapter represents one of the key areas that demanded attention when the
technical and creative elements were configured. It is interesting to note that here
new forms of expertise were deemed desirable, arising out of insecurities regarding
the efficiency and effectiveness of the initial, spontaneously developed interface. A
'graphic designer with HCI experience' for instance was seen as desirable for
revisions to the look and feel of the interface. While this is a hybrid expertise, and
which is only now being instituted through incorporation into traditional graphic
design and computer science style education, at that time - 1995 - finding such a
person posed significant problems. Motivations for this recruitment were based
entirely upon an internal decision by management and not based on any form of user-
consumer feedback on the performance or style of the interface.

The Cambridge Phenomenon: The environment of Acorn

The 'Cambridge Phenomenon' is the title for the explosion in small high-tech firms
starting up in and around Cambridge leading up to and throughout the 1980s. 405 The
405
The Cambridge Phenomenon is characterised by Segal, Quince and Wickstead (1985) thus;
 The presence in and around Cambridge of many high-technology companies (computing,
biotechnology, electronics & scientific instruments mainly).

 A very high proportion of young, small, independent and indigenous companies and a
corresponding low proportion of subsidiaries of large companies based elsewhere.

 A long record of high-technology company formation; Pye, Marshalls, Sinclair, Acorn.

growth of these companies was matched by the attraction to the area of
complementary business/financial services and specialist R&D operations of big
companies based elsewhere.

The area has proved a breeding ground for high technology entrepreneurs (Segal,
Quince and Wickstead, 1985). This led to the tag of "Silicon Fen", due to the
similarities with Silicon Valley, just as "Silicon Glen" represents the high-tech
industry in West Lothian, near Edinburgh. In each case a strong scientific
university community is on hand to supply ideas, personnel and expertise to
companies engaged in R&D.

Acorn

Chris Curry and Hermann Hauser founded Acorn Computers Ltd in 1978, originally
as a microcomputer consultancy. Among their very first products were 8-bit
microcomputers for scientific and technical applications and an electronic
pinball game. In 1980 Acorn introduced the Atom, one of the first home computers
aimed at a family market, originally sold via mail order. A notable feature of this
device was a video interface allowing it to use a TV set as a monitor.

The financial success of the Atom drove Curry and Hauser to seek further funding for
development work. The following year they won a contract to produce the BBC
Microcomputer. The BBC Micro is rated as one of the biggest success stories in
British microcomputing, selling more than a million units over its working life of
well over a decade. Its main market was education and it achieved considerable
penetration into UK schools and the higher education sector.

Production on the BBC Micro started in earnest in 1982, when it could boast a
processing speed twice as fast as its nearest rivals. It was also less expensive, and it
included, as standard, many features which were at the time expensive options on
other computers (including, again notably, an adapter for Prestel and teletext –
videotext precursors of i-Tv). In 1983, Acorn went public and entered the
pioneering early multimedia arena. It was one of three partners in the Domesday
Project – heralded as an epochal, large-scale multimedia project – alongside Philips
and the BBC Interactive Television Unit.406

The RISC PC and the RISC OS

By the mid-1980s, the computer industry was generally migrating from 8-bit
computer systems to 16-bit architectures; in 1985 Acorn produced its first 32-bit
RISC (reduced instruction set computer) microprocessor, skipping the 16-bit
processor generation entirely. When the Archimedes - the generic name for the first
ARM-based machines - appeared in 1987, it was without doubt the fastest
microcomputer of its time. It was followed in 1988 by the RISC OS operating
system, which featured a WIMP (windows, icons, menus and pointers) graphical
user interface, and later by the RISC PC, which continued this precedent.

In 1989 and 1991 respectively, the BBC A3000 and A5000 RISC micros were
released, the latter with performance equivalent to IBM-clone 486 PCs. In 1992
Acorn won its second Queen's Award for Technological Achievement for creating
and exploiting the ARM processor. This single development is what sustains the
company today.

406
It is interesting to note that the Domesday project was finished in 1986. The BBC Domesday
System comprises an extensive amount of both social and geographical information about the UK. Its
most valuable resources are the Community Disc and the National Disc. The Community Disc
provides information about places in the UK, such as city and environmental maps and detailed
descriptions of local facilities, and gives the local history of an area of interest; sometimes it even
includes interviews with local residents. The user can easily switch between functions and sometimes
call up photographs of landmarks found at these places. The National Disc holds socio-cultural,
economic and historical data about Great Britain. The overall concept of the disc is that of a gallery in
which one can stroll, with the alternative option of calling up specific data using the index and find
functions. The National Disc also provides a vast fund of photographs covering subjects as different
as the wedding of the Prince of Wales and Boris Becker's first victory at Wimbledon in 1985. The
BBC Master AIV (Advanced Interactive Video) system was a major component. The groundbreaking
multimedia content was supplied on two 12" double-sided laserdiscs and included images, text pages,
video clips, data sets and computer programs - the full range of multimedia material. Philips intended
to produce a mass-market machine on which the video discs could be viewed, but this never materialised.

Initially the ARM chip was developed by Acorn. It was further developed in
collaboration with other EC partner organisations under the aegis of the ESPRIT
MultiWorks project. This culminated in the delivery of the ARM3 chip. In November
1990, ARM Ltd. was formed by Acorn Computer Group, Apple Computer and VLSI
Technology.407

The RISC PC - had it a market?

While the Acorn RISC PC had significant advantages over its competitors in terms of
specification, Acorn could not achieve the economies of scale enjoyed by the PC
makers. This was reflected in its relatively high price.408

Molina, in his study of the development of the ‘Europrocessor’, describes how RISC
(reduced instruction set) chips were seen as viable competition to the dominant
CISC (complex instruction set) chips, whose manufacture was dominated by the
US firms Intel and Motorola (Molina, 1998). It was felt that, technically, RISC
processors had a performance edge over CISC. However, the failure to reach a concerted
407
As of 1998, ARM’s market focus was upon three main areas of development:
• Embedded Control - i.e. applications ranging from mass storage, security, automotive and
instrumentation to printers, Smart Cards, modems and communications systems.
• Consumer Multimedia - i.e. digital cameras, game machines, digital TVs, TV set-top
boxes, intelligent terminals, and GPS (global positioning satellite) devices. Examples of
multimedia products include Oracle’s Network Computer, Wyse’s Winterm400 series
network computers, Online Media's set-top box, Viewcall's Netsurfer and Teknema’s
Internet TV set-top box design and technology suite.
• Portable Systems - i.e. handheld PCs, PDAs, pagers, cellular phones and the emerging
crop of smart phones. Apple's MessagePad 2000 and eMate 300, the Psion Series 5
handheld computer, Geode's Cell-Phone Modem and Spyrus' LYNXS Security Card all
contain ARM chips. Also within this category is the OMI-NewsPAD, a collaborative
development which involves the Technology Management and Policy Programme
(TechMaPP) at The University of Edinburgh. ARM continues to work with projects
in the EU.
408
The mid-range RISC PC (4Mb RAM + 1Mb VRAM, 210Mb hard disc, 14" monitor), circa 1995, cost
about £1,400 plus VAT at 17.5%, which, compared with a similar-performance PC, was very
expensive (top-of-the-range non-Pentium PCs were then being offered for as little as £500). Acorn’s
US market was also little developed, if at all; both hardware and software were exceedingly difficult
to get hold of there, even though Acorn boasted that the RISC PC was launched with over 3,000
software applications already available (through backward compatibility with existing RISC OS software).

view on RISC features and characteristics meant that a wider, aligned European
constituency in support of a single chip was never built.
The true complexity of the early constituency building would lie in the
generation of a positive programmatic alignment which would follow the
following operational questions. How do we do it? How can we best achieve
the ultimate goal? In this regard, there was no guarantee of success. A lot
would depend on the ability to strike the right amount and quality of
ingredients at the right time and, in this, there were no off-shelf recipes. Soon,
the participants’ commitments were to be severely tested by time and/or other
pressures, complications and contradictions which might lead to a perception
of too much misalignment and, perhaps, frustrate the whole experience . . .
Such misalignment undermined the Europrocessor . . . the reason is that the
negotiating parties sought to agree on a single microprocessor architecture
when several of them were already beginning to support different
alternatives.

As such, the RISC constituency subsequently failed to build enough market strength
to compete with the widely diffused CISC processors, and the RISC PC can be seen
to have suffered partly as a result. A single ‘Europrocessor’ would at least have
guaranteed the prospect of more software development by third-party firms.

On the 29th July 1995, just over a month after Acorn issued a profits warning, the
company made a statement that its personal computer business was continuing to
experience “difficult trading conditions which will again adversely affect results”.409
That year, however, they had already produced some 80,000 machines, but their
market niche, predominantly the UK education sector, was beginning to dry up:
money for upgrading computers was low on the agenda in budget-strapped schools.
In short, the RISC PC was not selling anything like the numbers that had been
anticipated, and this placed further emphasis on diversifying Acorn's operations and
exploring new market opportunities such as i-Tv.

Sam Wauchope, the then Acorn managing director, made a statement to the effect
that, with the maturing of the traditional personal computer market, Acorn's strategy
would be to take its technology skills and develop them in faster-growing sectors of
the market; at the time this was envisaged as i-Tv. The intention was to invest

409
Source: Financial Times, 29 July 1995, ‘UK Company News: Acorn chief resigns’.

£13m, raised by a public flotation, over three years in Online Media (Om).410 One of
the conditions for the capital was that Om should conduct a large-scale technology
and marketing trial.

Wauchope envisaged participating not only in the trial in Cambridge, but in six more
tests world-wide over 1996. He was confident that demand for set-top boxes
would start to rise considerably from 1997, when sales were predicted to run into
the 'hundreds of thousands'. However, positive cash flow from the Om experiment,
that is from its commercial roll-out, was not expected before 1998. Wauchope was
aware that the group would probably not be able to ‘go it alone’ at these levels,
which would be driven mainly by expansion into the Far East or the USA; this would
demand that they find and align the right partners, since the expansion could not be
funded entirely by themselves. It is important to note here that failure in their
established market, due to extraneous influences (i.e. school budget cuts), hit Acorn
hard. Chip and PC makers, and others such as peripheral manufacturers, rely upon IT
budgets and the requirement for a succession of new, improved machines. Failure to
secure this perennial demand forces such firms to seek out new markets and to
‘repackage’ their expertise and machinery to suit them. Acorn came from a
background of entrepreneurial risk which had paid off, and this same trait drove them
on into the high-risk arena of STBs and i-Tv systems (as well as to explore, to a much
lesser extent, the development of a multimedia slate – Project Newspad).

The organisation of Acorn

Acorn at this time was an organisation in constant change: structurally, in terms of
its operating divisions and project groups, as well as in expertise and size. As already
suggested, it was also changing its strategic focus almost at a whim. It ran on
perceptions and visions of opportunity, and as such it was high-risk. However, as
indicated earlier, being located within the thriving Cambridge Phenomenon provided
a rich environment within which to develop and network with expertise, as well as to
recruit new personnel; from a human resources perspective the company was well
provided for. Incoming management expertise (as shall be
410
Source: Financial Times, 23 February 1995, ‘UK Company News: Acorn calls for £17.2m’.

more fully explored in the next chapter) included visionaries who could also
rationalise and formulate strategies and organisational structures which could get
things moving.

In Cambridge, ex-Acorn personnel and managers were directly involved in building
many of the local high-tech companies, and had developed strong links with other
organisations in the area, including the University of Cambridge.411 As of 1995, Acorn
had several operating divisions, employing approximately 130 people, which were
linked through their exploitation of RISC technology. Om was designated the role of
developing and managing products and services in the interactive television arena.
The Acorn Network Computing division was responsible for devising and producing
the reference designs for Network Computers (NCs) for the Oracle Corporation.
Acorn RISC Technologies designed and licensed technology to partner companies in
the USA, Korea, Japan and Europe.

411
In the December 1995 edition of Personal Computer World, the notion of cross-industry academic
links was emphasised by Alec Broers, Cambridge University's new vice-chancellor. He is reported to
have stressed that, in the face of massive government cuts, the university had to build new bridges
with industry and get its know-how out there earning good money (source: ‘Why all IT roads are
leading to Cambridge’, Personal Computer World, 1 December 1995). It is interesting to note that at
Acorn there was considerable ’recycling’ of personnel between Acorn, its start-up companies and
other firms in the area. Many people had worked at Acorn several times; in between these contracts
they had worked elsewhere and had since returned.

Fig. 5.1 The organisation of Acorn circa. 1995 (courtesy of the Acorn Computer Group plc)

The above diagram shows the organisational levels and entities associated with the
Acorn Computer Group plc circa 1995 which, from an organisational perspective,
may be considered the company’s peak.

Since the previous year - that is, autumn 1994 - Olivetti's stake in Acorn had come under
the umbrella of a new subsidiary, Olivetti Telemedia, which grouped together all the
Italian company's multimedia interests. As of 1995 Olivetti was looking to Acorn to
help it take advantage of forecast growth in multimedia systems and services, and
Elserino Piol, the vice-chairman of Olivetti, was appointed chairman of Acorn in
November 1995.412

Acorn Online Media - The genesis of the Cambridge Trial

Acorn's interest in interactive television was generated by an interaction with an
outside agency that carried considerable influence. The ‘initial spark’ occurred much
earlier than many of the people involved with the project were ever aware. Indeed, this
was a project that continued to be steeped in secrecy and confidentiality; the stakes
were very high indeed.

i-Tv trials were being conducted using many competing systems around the world.
More locally, in Ipswich, Acorn's closest competitor, British Telecom (BT), was also
staging a trial, which used an alternative network and an Apple-designed STB.413

There was much interest in the project of ‘next-generation’ or ‘enhanced’ television

412
As the diagram shows, Olivetti Telemedia is currently the largest investor in the Acorn Computer
Group, with a 31.2% shareholding. Acorn Computers Limited is part of the Acorn Computer Group
plc, which went public on the USM in 1983. The Acorn Computer Group has a 43% shareholding in
Advanced Risc Machines (ARM) Limited, a 44.5% shareholding in NChannel International Limited,
a 50% shareholding in Xemplar Education Limited, and 100% ownership of Acorn Computers
Limited. Acorn Computers Limited is currently made up of three operating divisions: Acorn Risc
Technologies (ART), Acorn Online Media (AOM) and Acorn Network Computing (ANC).
413
BT's trial of video-on-demand in the homes of 70 staff living near the BT Research Laboratory at
Martlesham was based on ADSL (video over the copper telephone line) technology. This trial ran
almost concurrently with the Cambridge Trial. What is less well publicised is that some of the homes
were connected by fibre, and that BT was as interested in testing conceptual approaches to service
development as it was in trialling the technology. BT's earlier fibre-to-the-home technology trial in
Bishop's Stortford ended in 1993.

from a multitude of players, including hardware and software firms, as well as a
plethora of potential service providers and market research agencies. There was so
much to learn about this new delivery mechanism and how it might best be employed.
This learning was a commodity that had to be facilitated, managed, packaged and
maintained, and this inevitably meant high levels of confidentiality.

During a trade show in 1993, representatives of News Datacom - a broadcasting
systems developer best known for its smartcard-based scrambling systems for Sky
TV - approached the Acorn stand. News Datacom is closely tied to Rupert Murdoch's
News Corporation and so represented a considerable proposition for Acorn.414 It
must be remembered here that Acorn had achieved its previous success (the BBC
Micro) through collaboration with another significant broadcasting organisation – the
BBC. As such, News Datacom carried considerable influence. As a direct result of
this meeting, senior Acorn personnel held a number of internal meetings and then
further meetings with representatives of News Datacom, discussing at length the
potential of Acorn's RISC technologies for i-Tv. These discussions came to act as a
focus for development trends inside Acorn:

". . . obviously Acorn is always looking for new machines . . . new markets
so this was a potential we wanted . . . we internally thought that our
technology was very well suited (to what News Datacom required), and had a
lot of strengths . . . our machines were very programmable in terms of a video
system so it was easy for us to generate very good TV signals . . . we had a
lot of software technology such as the RISC OS which could do presentation
on a standard television much better than the average computer." (Sonya
Tagert - Acorn Chief Scientist)

This relationship with News Datacom lasted for some time, perpetuating technology

414
At its height in January 1997, News Corporation made an offer of $450 million to purchase
PointCast. However, the offer was withdrawn in March. While there were rumours that it was
withdrawn due to issues with the price and revenue projections, James Murdoch said it was due to
PointCast's inaction.

Shortly after declining the purchase offer, PointCast's board of directors decided to replace
Christopher Hassett as CEO. Reasons included turning down the recent purchase offer, software
performance problems (using too much corporate bandwidth) and declining market share (lost to the
then-emerging Web portal sites). After five months, David Dorman was chosen as the new CEO. In an
effort to raise more capital, Dorman planned to take the company public; a filing was made in May
1998 with a valuation of $250 million. This plan was abandoned after two months in favour of looking
for a company with whom to partner or by whom to be acquired.

development at Acorn. However, no fixed contract between the firms arose. It
may best be viewed as a kind of ‘flirting’ relationship. Acorn remained quietly
confident that they were on the cutting edge of STB development, and had
competitive advantage over others in the field. News Datacom remained open
to alternative STB solutions, and were moving between the R&D departments of
companies around the world, evaluating technologies and quite possibly motivating
others to work on similar projects.

News Datacom's interest lay in the advent of digital satellite television, which was
clearly on the horizon. They were aware of the need to be ahead of the field, and
particularly of their competitors in the cable world. This relationship was the first in
a number of encounters that would shape both the strategic direction and the product
development of Om. It is clear, moreover, that this relationship sparked the effort to
build a socio-technical constituency. Most tangibly this would include, first, the
development of Om as a separate operating division and, latterly, the
Cambridge trial.

Towards the end of 1993, moving from ideation and discussion to concrete design,
Acorn started to put together a specific system and began investing in relevant
specialised technologies. These included emerging components such as external
digital video decompression hardware, which were very new technologies at the
time. This focus also reiterated one of the core capabilities of Acorn machines from
the very beginning - that of TV connectivity. Acorn already possessed considerable
experience with digital video software through previous work done on the RISC PC,
as well as on the BBC Domesday Project.

1994

By early 1994 - the year in which US Vice-President Al Gore was promoting the notion of
'information superhighways'415, and the WWW was really beginning to make its mark
in the public arena - the fledgling Om was busy recruiting more and more specialist
personnel. This was because they were now building very specific video hardware,
415
Financial Times, 19 September 1994, ‘Plugged into the world’s knowledge’.

truly focussing upon STBs rather than PCs. With an initial team of twelve, mainly
drawn from Acorn's technical expertise (including its technical manager, Dave
Swallow, and senior technical designers), they identified the actual engineering that
lay behind the creation of set-top box 1 (STB1) – the initial prototype. STB1 would
act as the blueprint that guided Swallow in his choice of expertise and technology.
Here is a theme that constantly recurred in the story of Om and the trial: technology,
or technical thinking, guided considerations of a social or human nature. In this case,
the propensities, features and functions of the emerging STB acted as a ‘departure
point’ which ‘communicated’ the need for new human expertise.

Swallow was clearly the major influence - the project champion or constituency
builder (see Molina, 1989; 1990) - co-ordinating relevant staff and expertise for
different features and functionalities of the box:
"[Dave Swallow] had been a technical director . . . makes sense regarding
why it actually happened . . . [Dave Swallow] basically hijacked people who
had been working at Acorn who'd been working on the RISC PC, or [Barry
Holstein] had been working on the Kryton project which is the replacement
for the low end machines and that's where the casing design came from . . it
was all kind of like everybody was brought in with their different bit, with
what they'd done." (Terry York - Interface Designer)

Progress on the development of the system was rapid. What was noticeable, from the
managerial perspective, was a definite and conscious blending of the social and
technical elements contributing to the creation of the product. Such an idea flags up
the actor-network concept of interchangeable human and technical elements, and
perhaps more specifically that of translation. Personnel were drafted in along with
their component expertise and with that particular component.

The box itself, then, comprised what was available 'off the shelf' from other
ongoing projects, both technically and socially. The specification of these parts
dominated the conception, design and physical shape of the initial prototype.
The heart of the system was of course the microprocessor – the ARM chip – which
had been developed for the RISC PC. STBs are essentially PCs with
enhanced video processing and infra-red control.

Only later were adaptations made which optimised components and made them more
compatible. This required considerable knowledge of the company and its
various projects, as well as their state of development.

The core group of people needed for the original development work
became the fledgling Om (or Spinner, as it was originally named) in
March/April 1994 - a team of around 15. This team included a couple of hardware
designer/engineers, a dozen software engineers, project managers, and technicians.
As the technology and business developed, others were recruited to fill noticeable
holes that were appearing. Some of this expertise had to be drafted in from outside
Acorn, particularly for specialist aspects of the technology such as the video systems.
At the time of the interviews (June 1995) Om personnel numbers had risen to around 40. This
represented a dramatic rise in personnel (from 12 to 40) within a year, and was even
described by the hardware director as 'frightening'. Certainly, each visit to the Om
headquarters witnessed new desks being added or new wiring being installed in the
walls.

RISC PC and the hardware of the STB

As previously indicated, prior to the inception of Om and the STB, Acorn had
already developed (but not fully realised or manufactured) the RISC PC. The RISC
PC was the dominant technical inspiration for the STB, forming the blueprint of the
system. It had a printed circuit board (PCB) which more or less met the required
specifications. Falcon was another project running concurrently at Acorn; it had a
case of roughly the right size and shape, so that was commandeered as the case for
STB1. A power supply unit (PSU), at an early stage of development and also
intended for Falcon, was completed and likewise incorporated into the STB. The most
significant difference was the inclusion of several cards which would permit MPEG
decoding, a network card for ATM, and another card for infra-red (IR) control. These
were its key ‘television’ elements.

Some of the designers were aware that the chief competition at this time – the Apple
STB (used on the BT trial in Colchester) – had used a stock case, and so they
understood that Apple too had cut corners by scavenging off-the-shelf components. At
this stage one of the biggest problems the team faced was designing an MPEG
card that would fit in a box which had not even been moulded yet, and that would
operate to the specifications of the machine. A Sunderland-based company -
Wild Vision - provided the essential video card technology, and Acorn software
engineers spent considerable time developing appropriate software to connect it to
the motherboard.

Evolution of STBs: configurations of technology - expertise and luck?

Barry Holstein one of the designers summed up his perception of the evolution of
STBs and their enabling technologies:
". . . STB's have a natural progression. They progress from things that have a
CD-ROM in them to things that have networks in them . . . what we're trying
to do is actually move material up and down, masses of information into
user's premises. So this is an excellent way of getting yourself started, getting
all your software in place, being able to mix audio, video text, graphic,
pictures, you name it via the CD drive while the networks aren't there yet,
because to support this level of technology you probably need something
around a two megabyte a second interface which is roughly what you get with
a CD-ROM. But of course this sort of communication network is just not
available and it just so happened that ATML was in their first network
strategy on ATM and it was based on a two megabyte per second link."

The functional definition here, then, was that what they were dealing with was
actually a multimedia platform which would be networked using a high-bandwidth -
broadband - communications link. This placed new needs on extraneous technologies
(the communications system and network card) which had been developed and were
available locally (through a sister company). The supply of a two megabit per
second signal was critical for video on demand (VoD) using the MPEG-2 standard.
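
A rough, illustrative calculation helps to fix the orders of magnitude at stake here. The figures in the sketch below are generic rules of thumb for the period rather than Acorn's or ATML's own specifications, and the comparison is offered only to show why a CD-ROM or hard-disk demo box could plausibly stand in for the networked service.

    # Illustrative back-of-the-envelope comparison of mid-1990s delivery rates
    # for compressed video. The figures are generic rules of thumb, not Acorn's.
    KB = 1024
    rates_bits_per_sec = {
        "1x CD-ROM drive (150 KB/s)":  150 * KB * 8,
        "2x CD-ROM drive (300 KB/s)":  300 * KB * 8,
        "2 Mbit/s network link":       2_000_000,
        "typical MPEG-1 video stream": 1_500_000,
    }
    for name, bps in rates_bits_per_sec.items():
        print(f"{name:30s} ~{bps / 1_000_000:.2f} Mbit/s")
    # All of these rates sit within the same order of magnitude (roughly
    # 1-2.5 Mbit/s), which is why a ~2 Mbit/s link was treated as the
    # threshold for delivering compressed video on demand.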

Barry Holstein saw that it "was all slipping slowly into place." This emphasised his
view that luck entered into the equation of design forces; however, he did recognise
that "lots of people were working towards the same end." In other words, ATML were
aware of the potential to exploit their technology towards such an end, and there had
existed some realisation and understanding (evidently at Bird's 'inventory' stage) that
the technology could match the specifications. Barry Holstein went on to link the
notion of convergence between companies to the technological capabilities of the STB:
" . . . STB is about three different groups of companies. It's talking about the
consumer market, the televisions and all the rest of it, its talking about
communications, the telephone networks and its talking about computers, and
they're all working towards the same ends really, and that's when these three
meet you get STB."

A further focus was communications. An obvious choice was a local company -
ATML - with strong connections to Acorn (Hermann Hauser, a founder of Acorn,
remained one of its directors), who were early proponents of ATM networks, at that
time considered the most likely network interfacing protocol on the market for
broadband communications such as that intended for the Cambridge Trial. They also
happened to have produced a network card that fitted into a RISC PC board, so this
was quickly commandeered and incorporated into the STB. ATM as the
communications link further focused the orientation of the project:
"ATM networks were talked about from the beginning because . . . one
reason would be ATML . . . because we had close links with ATML . . . the
buzz was that ATM was going to be the high speed mechanism needed for
[the high bandwidth traffic] . . . certainly we concentrated a lot in the early
days on the ATM technology . . . these days we don't think twice about demo
boxes with hard disks in that sort of thing, but in the early days we very much
concentrated on ATM, I think it was may time we actually set up a link, we
had a RISC PC server down at ATML, down at Mount Pleasance, and we got
Cambridge Cable to make up a fibre link up to Acorn House and fed it to the
boardroom, and the idea was we basically plugged it in . . . and put a little set
top in the boardroom and had it booted off this RISC PC the other side of
Cambridge . . . it was all very primitive stuff but it showed the technology
working . . . the STB had a hard disk in it as well though, we'd run all the
screens and the MPEG would be played off the set top HD, but the files
would come over the ATM, we hadn't got the sound working yet, we couldn't
stream sound we were more interested in video at this stage . . . nobody
actually commented that we didn't have any sound, they never actually
realised that we didn't have any sound coming over the ATM."

Demonstrating the box to members of Acorn's board illustrated to top management
that the project was technologically feasible. It represented a significant opportunity
which, regardless of its functional pitfalls, was a sure example not only of Acorn's
ingenuity but also of the suitability of their technology for this particular purpose.
Support for the project at this level was imperative, given the high risk involved.
Barry Holstein alluded to the fact that, overall, Acorn did not have clear ideas about
where it was going, suggesting that the company was perhaps more opportunist than
entrepreneurial:
"You've got Acorn who doesn't quite know where its going, by that I mean its
always been very strong in the education market. Its tried other opportunities
both overseas and consumer business and never really succeeded in either. It's
education department, or its education market, but that's going nowhere
because the education budgets keep getting stripped. So you've got Acorn
looking for somewhere to go."

The early prototyping and development of the STB appeared to be fuelled partly by
chance: the right time, the right person and the right parts. However, it took real
industry on the part of the design and management team to get the various hardware
elements to work together. It was also going to take money. Of critical note at this
point, however, is that the first demonstration box was built not only to impress outside
agencies such as News Datacom but, moreover, to serve as a ‘Trojan horse’ by which
the project could be properly kick-started within the company. It was built initially to
capture the imaginations of Acorn senior management.416

It was clear that what was being advanced was the commercial impression from the
technologist's viewpoint, which took television sets as symbolic of mass consumer
markets; it considered television first as a technology, and not as a medium. Like
television, the STB, the demonstration machines and the Cambridge trial all
demanded prototype content and interfaces through which the technology could be
appraised and operated by whoever needed to be impressed. This laid the foundation,
and built the need, for recruiting further management and technical expertise. It also
laid the foundation for further organisational development, commercial strategy and
business modelling.

416
This process of using demonstrations parallels the stories of Alexander Graham Bell when he went
‘on-the road’ with his telephone apparatus, of Edison with the Pearl Street generating station, and of
Baird with his various demonstrations of early television in Selfridges department store. Each of these
were essentially demonstration pieces designed to create the constituency, or to propagate interest and
funding for further development work. As in the case of these early projects, this first demonstration
piece set its own trajectory of development within Om, one which would lead to an almost
independent avenue of demo box development, apart from and separate from development of ‘actual’
boxes for the Cambridge Trial and other uses.

The development of the interface

The same team responsible for producing the system software that would link the
various cards and other technology to the motherboard had the concurrent task of
developing software and screens which would provide some comprehensive
indication of what an i-Tv service would look like. This was the first excursion into
the development of content and what was to become a distinctive semi-autonomous
development thread.

This was the generation of a succession of stand-alone demonstration boxes.417 These
were STBs that were not network-based, but instead had menus and content stored
in, and accessed from, an internal hard disk. These units would connect to a television
and were supposed to demonstrate the idealised operation and functioning of a fully
networked device. As such they were to serve as the embodiment of a fully
operational i-Tv system. They could be taken to trade shows and to prospective
clients' offices, and they could also be used for exploring the usability of the remote
control and other hardware and software interfaces.

In many respects they were the very antithesis of what has come to be understood as
a ‘thin client’ or Network Computer (NC), where one has a terminal which
contains no software itself but relies upon connection to a network and to servers that
store, maintain and offer software. The business model for such devices was a
‘pay-for-use’ scenario. The advantages of such a system were that users would avoid the
high costs of software and licences, and could benefit from automatic upgrading of
software as releases were made. This was viewed at the time (perhaps in a similar
antagonistic frame to the development of alternative RISC-based chips) as a means of
breaking the market domination of Wintel (Windows and CISC, or Microsoft and Intel) and
opening up the prospect of low-cost computing for the masses.

Prior to making demo boxes, Acorn had been writing very small programs to
show specific aspects of the STB system, drawn and adapted from previous software
projects. An Acorn marketing manager had previously worked
417
These were basically STBs whose content was provided by an internal hard drive instead of a network.

with a company called CADSOFT Ltd. that created presentation materials on Acorn
computers. Under tight conditions of secrecy, Om commissioned them to produce the
first presentation demo of the STB. They put together a few screens and, while the
result was considered positive, the resulting demo looked much more like a computer
than interactive television. CADSOFT Ltd. had been chosen to design the screen
layouts, and they had considerable experience of using TV screens for their
presentation work; however, owing to the high level of secrecy surrounding the
project, some degree of misinterpretation had taken place.

The demo was essentially to show the STB working: to present the opening screen
and the various menus. The problem was that it was mouse-operated, and the screens
had been produced with an authoring system which was still under development
and which was extremely demanding in terms of computing power - it needed in the
region of 16MB of DRAM and 2MB of VRAM. Specifications of this kind far
outstripped the capabilities of the Om box, which was only going to possess 4MB
of DRAM and no VRAM. Some contingency was needed to deal with these obvious
technical limitations, and the problem was passed to two of the designers, Terry
York and Chris Marshall. They worked on the IR control and the memory problem,
and eventually came up with a software design that displayed a series of JPEGs in
response to a series of key presses.
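
The approach described can be thought of as a simple state machine: each screen is a pre-rendered image held on local storage, and each remote-control key press selects the next image to display. The sketch below is a hypothetical reconstruction in Python for illustration only - the original was written for RISC OS, and the screen names, key labels and file paths here are invented.

    # Hypothetical sketch of the 'pre-rendered screens' technique: instead of
    # running a heavyweight authoring runtime, the box shows a stored JPEG for
    # the current screen and swaps it when a remote-control key is pressed.
    SCREENS = {
        # (current screen, key pressed) -> next screen; names are illustrative
        ("main_menu", "1"):    "movies",
        ("main_menu", "2"):    "games",
        ("movies",    "back"): "main_menu",
        ("games",     "back"): "main_menu",
    }

    def image_for(screen):
        """Each screen corresponds to one pre-rendered JPEG on the hard disk."""
        return f"screens/{screen}.jpg"

    def handle_key(current, key):
        """Return the screen to display after a key press (unknown keys ignored)."""
        next_screen = SCREENS.get((current, key), current)
        print(f"key '{key}': display {image_for(next_screen)}")
        return next_screen

    screen = "main_menu"
    for key in ["1", "back", "2", "back"]:   # a simulated sequence of presses
        screen = handle_key(screen, key)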

Meanwhile another engineer worked on the MPEG hardware (the card from Wild
Vision), as well as writing the software necessary to play MPEG. The real problem at
this time was linking the functions together, all operated by an infra-red remote control.
Owing to its unsatisfactory nature, the CADSOFT demo was never completed, as Acorn
withdrew its funding. However, building on the work that had already been done,
Terry York developed the user interface which characterised the first STB design.

The way they worked was as interpreters and implementers of ideas that derived
from other engineers, managers and others, feeding back continually into how the
interface was taking shape. They had the CADSOFT demo as an initial working template,
and this provided the main idea for the on-screen buttons and some of the backdrop

patterns:

" . . . we thought right the first idea was to get their demo working in our new
environment we've created, the idea was to get their demo and stick it in, and
we did that within a week and we had it working, we had the IR working it
was all there...the idea was we had to sort of develop from there trying to
improve . . ."

A blue backdrop (which featured on many of the screens) was a CADSOFT
development; however, the original buttons were yellow, and they used an icon-based
system to represent the service. The buttons were also dead (they did not
function), and the demo showed lists for movies rather than the small picture icons that
came to characterise the final, more graphical product. A further problem was the
slowness of shifting from one screen to another. They saw that they needed a click,
and some action on the screen, to provide some sort of visual indication that the
system was responding. This required a sound player which would play any sampled
sound and which had to work on the RISC PC.

They eventually found one, but could not find many samples for it. By default they
ended up with the one sample that they had - a sound like a foot loudly crunching on
fresh snow. This innovation acts as a prime example of how usability parameters are
not fixed, but are dynamic and can change and evolve over time. The team worked
with this sound satisfactorily for some time before they found it too irritating. Its
purpose was to indicate a button press, but the sound came to dominate its use (it was
not domesticated, or sublimated into familiar use) and had to be changed. They
changed it to a much less irritating and more subtle typewriter click. This also gave
rise to the idea of visually highlighting the button: you pressed the button and it
depressed slightly. This made it more visible and, again, added to the user's
experience of acknowledged action.

Another standard usability ‘trick’ was employed - a fade between screens when
shifting between pages, which created the illusion that the transition between screens
was quicker. All such iterations were done on the bench, where engineers tested
ideas out on each other. One of the designers developed the design of nine buttons
indicating the nine menu options. Each ‘button’ became a picture representing the
menu item.
"He went round and found some nice pictures and stuff and scanned them
in...he did that and it kind of stuck...and we modified the other screens to kind
of look similar...we started putting drop shadows on things and certainly any
numbers we had we then put drop shadows on."

Much of the interface work was done using the RISC PC's proprietary multimedia
authoring program. This was later to have a crippling impact on the Cambridge Trial,
owing to the inability of such a system to accommodate other file formats in which
material had been created in-house by service providers. During the early days of
development a number of Acorn titles, such as the game Technodream, which had
been developed for the RISC PC, were included on the menu of the demonstration STB.

Providing Technodream as the sole functioning game, alongside other ‘dead’ games
menu choices, lent the impression that a whole library of games was available
online. Games, then as now, were intended to be a strong selling point. It
was understood that one of the shortcomings of the Apple STB was that it could not
run games. BT, who were conducting their own trial at the time, were concerned
about this, as they very much wanted to test i-Tv as a consumer market and
realised only too well the commercial potential of games. Om wished to create the
impression that a catalogue of games stood ready for online play.
This was misleading, however, because many of these games were nowhere near
running on the system. With work they could be made to run, but Om wished to
hold back precious development time until some concrete proposal came forward
which would merit the work.
" . . . To be honest its just a big con, the RISC PC hadn't been released and we
had about 200 games in Acorn the consumer unit which had copies of all the
games...and basically I sat down for a couple of days just testing them and
getting them to run on the RISC PC, because if it runs on the RISC PC it will
run on the STB we found about half a dozen which would actually run
without crashing . . ."

Some modifications were necessary. The games were designed to be played via a
keyboard or a mouse and had menu options at the beginning which were
inappropriate for STB presentation and use. Another problem was that a lot of games
do not quit properly, making it difficult to get back to the desktop; often the only
way to get out of them is to reset or reboot. This would be a very
negative option for STB use, either on the trial or in a demo.

By default they ended up with Technodream, the only game which would work
with relative ease and reliability. A further problem, obvious in the operation of
games designed for keyboard use, was that they carried keyboard instructions - 'press z for
left' and so on. These had to be converted into mappings onto the remote's keys: on the
main menu you press escape to get out of the programme and back to the desktop;
when the game is running you press a different key, and the key 'normally' used for
returning to the top screen takes on a different function.
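
The kind of remapping described can be pictured as a small translation table between remote-control buttons and the keyboard keys a game expects. The sketch below is purely illustrative: the button names and key bindings are invented, not the ones actually used for Technodream.

    # Hypothetical illustration of remapping a keyboard-oriented game's
    # controls onto a television remote control. Bindings are invented.
    REMOTE_TO_KEYBOARD = {
        "left_arrow":  "z",        # e.g. the game's 'press z for left'
        "right_arrow": "x",
        "select":      "return",
        "stop":        "escape",   # escape returns to the top-level menu
    }

    def translate(remote_button):
        """Return the keyboard key the game expects, or None if unmapped."""
        return REMOTE_TO_KEYBOARD.get(remote_button)

    for button in ["left_arrow", "select", "volume_up"]:
        key = translate(button)
        print(f"{button!r} -> {key!r}" if key else f"{button!r} is not mapped")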

York felt that clients would be impressed when they saw the relatively large
selection and choice of games, but would understand little of the amount of 'hacking' it
took to get even one game working. There were similar problems with the educational
material, which had been designed for use with pointers and mouse. Being a small
team which had to produce things under intense time pressure meant that they did not
have the time to do the rounds of companies, sign agreements and pay them to rewrite
source code. It was more realistic to hack something together and get it working the
next day. Using third-party developers for material in the early stages did not in
any way merit paying for development time. This was another crucial realisation of the
project: producers of educational material (or indeed any kind of material) were
hardly enticed to change the format of their material (say, from CD-ROM to STB
compatibility) when there was no immediate financial incentive. While it may be
advantageous, and even necessary, to show off technology to third-party developers, it is
difficult to sell them the idea of producing material unless there is a revenue-ready
customer base.

1995

On the 30th March 1995, Peter Bonfield, the then chairman and chief executive of
ICL (who went on to become chief executive of BT), switched on the server for Phase 2 of
the Cambridge trial. The intention was that Phase 3 would develop out of this
towards the end of the year, when the trial would go commercial. STB2 was
finished around March.

Most significant was that the 1995 business plan was built not on orders for ATM-
switched interactive television STBs, but on STBs with modems (Viewcall) and STBs
with CD-ROMs (Lightspan). These contracts presented themselves via the publicity
generated for the trial and the customers' impressions on encountering the STB demos.
The deals significantly altered the entire development dynamics and strategy within Om.
They clearly showed opportunities outwith the domain of expensive ATM-switched
i-Tv networks: a much more easily attainable market and a more modest
development trajectory, manufacturing stand-alone CD-ROM players and dial-up
Internet STBs which used a TV as a monitor. They placed great pressure on the
Cambridge trial – that is, the networked trial - to become an autonomous entity,
almost a subsidiary of the Om development process (this is addressed more fully in
the next chapter).

At the same time, however, the design team - consisting of three technician-
engineers and a project manager - was turning its sights from the problems of
building an operational system towards a much greater consciousness of usability. Terry York,
for instance, had originally been given a job at Acorn concentrating on evaluating the
RISC OS interface and ways to improve it for future versions. This included looking at
the interface to see not only what could be done to improve it, but also how to
incorporate new ideas into the system. This work was to influence a release planned
for 1995, so they had about a year to conduct their project. Much of it was based on
comparing RISC OS with other systems - Windows, Macintosh, UNIX, the
various X Windows environments, and so on - cataloguing what was good and worked, what
was bad and inefficient, and what RISC OS was missing or lacking. It was a
benchmarking type of exercise, used to generate ideas.

There were many ideas being generated within the company. The design team also
talked with the repairs department, DBU (the educational unit) and the customer
services people who dealt directly with users, as well as going out to places where
users were situated, particularly educational centres around the country such as
schools. Some local schools were directly involved in testing the product. They also
spoke to third parties, such as software developers and others developing hardware
and software for the machines.

This wide and varied means of deriving product feedback brought into focus
some generic problems that were emerging from the conversations. They realised
quite quickly during their field visits, for instance, the great need for printing options;
supporting print had been one of Acorn's biggest technical problems for a
number of years. One member of the team was to look at printers, understand the
problems people were having with them, and improve matters. Another, more technical
member of the team was to write the necessary programs; he had been the main
developer of the RISC OS WIMP, and his job was to develop windows and graphical
interfaces. The other technician and the manager filled in any gaps by doing 'bits and
pieces'. This experience formed much of the early usability thinking
regarding STBs.

One issue that arose, and one which is a tenet of usability engineering, is consistency
(e.g. Wiklund, 1994; Nielsen, 1993). The interface designer took it upon himself to be
responsible for ensuring consistency in the interface:
"I was really quite keen [on usability] because I knew nobody was
interested...I had been working on this thing for a year, it hadn't...it had gone
somewhere but we hadn't produced anything...I thought we've got all these
things we've learned, we know what we can do lets make sure...I'll make my
job to keep it all consistent. . ."

Whereas during the RISC OS project the usability team would report 'well, it's not that
easy to use', Acorn managers and other creative staff would retort 'it's always been
easy to use' and 'make it easier'. The atmosphere was described as one of 'fighting all
the time'. At Om, however, there existed opportunities to voice opinions and also to
suggest entirely new ways of doing things, 'that nobody's done before'. Rather than
this climate being symptomatic of a 'permissive' and 'creative' atmosphere, it was
suggested that it often came about by default: everybody at Om was so
involved that there were periods when feedback was sparse. No one had
time to criticise, and this left room for artistic licence on the part of the usability-design team.

Having one person or unit 'in charge' of design opened up the possibility of implementing
usability knowledge and laying down certain criteria which could be kept consistent.
"I could actually go in and bear in mind at all times that every scheme that we
did there was some reason why we were doing it...I could actually spend
hours going through all the screens and looking at each one and picking even
little things out which weren't quite making for a more consistent flow."

Terry York picked the font and style of the screen text and maintained this across all
the screen headings. The positioning of certain pieces of text, such as that indicating
navigation, was kept consistent. The main usability focus was on consistency;
however, a large degree of attention was also paid to complexity, particularly making sure
that the navigation was balanced and natural. Navigation is balanced when there are
not too many links and avenues down one side of the 'tree' – the set of potential
avenues of navigation. The idea is to spread out evenly from the link the user has
chosen, and only features that are accessed regularly should sit at the higher levels.
Also, when the user encounters an unfamiliar screen or screen function, it is
important that, on pressing a button, it is clear what is going to happen.
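
The balance heuristic can be expressed as a simple check over a menu tree: report how many key presses each item needs, and look for branches that are much deeper than their siblings. The menu contents in the sketch below are invented for illustration and are not the trial's actual menu structure.

    # Illustrative check of menu 'balance': list the depth of every leaf so
    # that lopsided branches stand out. The menu contents are invented.
    MENU = {
        "Movies":    {"New releases": {}, "Catalogue": {}},
        "Games":     {},
        "Education": {"Primary": {}, "Secondary": {"Maths": {}, "Science": {}}},
    }

    def leaf_depths(tree, depth=1):
        """Yield (item, depth) for every leaf in a nested-dict menu tree."""
        for name, children in tree.items():
            if children:
                yield from leaf_depths(children, depth + 1)
            else:
                yield name, depth

    depths = dict(leaf_depths(MENU))
    for item, depth in depths.items():
        print(f"{item:15s} reachable in {depth} key press(es)")

    # A large spread between shallowest and deepest items suggests an
    # unbalanced tree: regularly used features should sit at the shallow end.
    print("spread:", max(depths.values()) - min(depths.values()))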

When the original screens were created, Dave Swallow approved the design and
endorsed the use of this particular style across all the screens. New features or content
were added as ideas emerged from all members of Om. The usability-design team
collected ideas from what everybody in Om was trying to do, or from the feedback they
gave through discussion. While an influx of ideas and comments can make for a
useful exercise in innovation, too many comments coming from too many diverse
sources can amount to chaos:
"I mean things happening all over the place and you really didn't know where
something would come up next...and you just had to...[Dave Swallow] would
come along and say 'well, look we need a game or we need . . .' Anglia came
long and said 'here's some of our CDs put some of these on' and things like
that...sort of keep things on there and add things as they come up. . ."

The task of the usability-design team was to marshal these ideas, implementing them
in terms of a consistent presentation. More than being simply aesthetic and
ergonomic in nature, this also required the technical skill of building in top-level
compatibility with the system as it developed. The work paid off: such in-company
feedback as was attainable was positive. People were saying that it looked good
and felt really easy to use. The team felt they had done a good job, but also
felt that: “it was kind of like you really, don't really, know why”. This doubt
appears strange in the light of the conscious application of usability knowledge, but
suggests that it arose from the concentration on aesthetic concerns and the use of
‘found’ materials (i.e. off-the-shelf components).

The 'social constructivist' approach to screen development was a problem. The
insular attitude of the design team - loyal to one another in their struggle with
'outside' criticism (i.e. management) of their work - perhaps also worked against them
criticising each other:
"Each person was doing their own thing, they felt like they could put their bit
in and I think it became difficult because we never really sat down and said
what we were going to do...and I was kind of taking control of it and so when
we had these screens appearing it was 'yeah, but its not quite right'...I don't
mind, its difficult because you can't tell somebody you don't want their input
because there's nothing wrong with their backdrops there actually quite nice
we'd rather use them but it's the different style."

There is some suggestion that, towards the aesthetic side of the design process, the
certitude of 'proven usability research findings' fell away to the more subjective 'free-
for-all' of the artistic and creative domain. This highlights the ‘muddling through’
that characterises much design practice. It is learning by doing, but also learning by
struggling (Arrow, 1964; more recently Rosenberg, 1982, with 'learning by
using'; in relation to problem solving, von Hippel and Tyre, 1995;
'learning by trying', Fleck, 1994; 'learning by interacting', Andersen & Lundvall,
1988). This undermined any claims to an overarching and dominant intelligence,
focus and logic governing the look and feel of a final product. It became a
matter of expressing opinion rather than making concrete suggestions about any definitive
way of doing design. On reflection and analysis, the final product, and indeed its
success, were reported as somewhat arbitrary. This in itself created insecurities
regarding the integrity of certain decisions. However, through learning from people
(i.e. Om personnel) actually using the system, some important lessons were also
learned:
" . . . we've got no reason to say that these blue backdrops were the best

026463 574
backdrops to have...those 1 to 9 labels best to use, but that's kind of well that's
what we're going with therefore we stick with them...when we were
developing at the time we had situations where we have a screen with menu 1
to 9, and behind that we'd have games at 1.0.6, and we only have 6 now. . .
for some reason the games logo seemed to be long and thin, it was stretched
to go on there...basically we thought we'll have them 1,2,3,4,5,6 . . . and we
got feedback saying 'look hang on, it looks really strange looking down the
left hand margin and going 1,3,5, and its not really quite clear...so we
swapped them round and went 1,2,3,4,5,6, and it was still not very clear
because it should have been 1,2,3, - 4,5,6, but even so it was still felt that it
wasn't quite right, it didn't feel right and that's when we eventually went to
1,2,3,4,5,6, basically its all to do with the button layout on the remote...this
was eventually designed for people to move round with the arrow keys, but
nobody used the arrow keys because it was too difficult...it was more of a sort
of that sort of thing, if you want something you just press the number . . .so
therefore, we basically found that if you're going to do 1 to 9 you got to have
them in order 1 to 3, 5, 6, . . .we kind of learned from people using it, which
was the approach we did."

This clearly shows how the logics of design operated here and, more than this, how
necessary they are in initial product formation. The people using the system were
company personnel as well as personnel of the technology partners. For instance,
when the CT began with Phase 1, its ‘trialists’ comprised almost
entirely Om staff, including Terry York. Major iterations were also made
during this period. For instance, on returning home one day and switching on the
system, York realised that, from his sitting position in his living room, he could not
read the text.
" . . . the text wasn't big enough, I was sitting on my sofa and you couldn't
read the typeface...partly it was to do with the dark I mean all these backdrops
on here are faded out slightly as well...because it was too bright there's too
much pasture in the white in the backdrop...which you won't notice if we
watch it here [points to screen] but as soon as you're sitting in your front
room you couldn't actually read it...that's an effort that we made...maybe I just
noticed and I just changed it...and nobody said 'oh, you've changed the
backdrop' because they never even noticed but it's those kinds of little things
that go on in the evolution of the product, something that nobody even
notices."

This is directly representative of a mainly technical contextual usability problem
realised through situated use of the system in the home as opposed to on the
workbench, even though it was not made by a 'typical' end-user. Nevertheless, it
highlighted how a change in context, in this case the geographical movement from
workbench to home, provided major clues as to major inadequacies of the intended
system. This realisation led them to reconsider their development of an American
NTSC version of the STB, due to the inferior quality of the display, and moreover, to
consider how the size and resolution of text was a critical factor in the legibility of
content in domestic locations.

Management realisation of such design problems, as well as other issues such as
copyright for graphics, brought to the fore an emerging need for new interface
expertise - a person who had graphics expertise combined with HCI knowledge. While
it was possible for the existing staff to grab screens and text from ITN news or other
broadcast information, creating icons and other original material remained an issue.
This was again a 'gap' considered outside their present skill base. The interface as it
stood had plagiarised visual information from a variety of sources, any of which could
give rise to a copyright claim against the company. They needed someone who
could design buttons with original icons.

With the addition of such expertise, Terry York's role evolved so that he became entirely
responsible for the creation of demo STBs. However, the incorporation of an 'expert'
in interface design was not entirely welcomed by those who had previously been
responsible for interface issues.
" . . . you sort of think 'what's going on'...Nigel did some screens in a week, it
was kind of like, yes they were fine but also loads of rules were broken...the
rules which had been set up for our demo...nobody had real written...I hadn't
written them down...but all of a sudden the screens were altered....he done
one where, the news screen and there was going to be BBC and CH4 and
ITV...while we go into different types here there was an intermediate screen
which had different producer news...well there was a small logo down the
bottom there and I thought why that?...because it was never written down
there were no style guides, no standards...how its worked over the last year
and half I've put something together virtually therefore I've been in the
background all the time saying there's got to be a reason why its
different...OK if you're going to have different shopping screens and there's
going to be a style or when Acorn do their production lines over the system
'well that's you're area, you leave that and you do you're thing there' well its a
different backdrop."

The mismatch of icons and screen styles led to an inconsistency in the interface
between pages. This is widely recognised as a usability problem, especially for
novice users.

The development of demos

What eventually emerged was a tension between demo development and design, and
the design of the interface and other technologies intended for the actual [networked]
trial. Design and development for ever more elaborate demo STBs required its own
resources; it focussed mainly upon interface and navigation. The actual trial system
posed its own unique problems, which were mainly network-based and of a
technological nature. Both these development trajectories demanded and split
resources. In some quarters of Om primacy was given to the main screens, the
visually generic part of the system which should remain consistent and stable. It was
further felt that this should be so even in the light of the threat that future commercial
users of the system might wish to bring their own trademark identities, styles and
operating principles. These would need to be managed properly and stylistically
integrated within the pre-defined structures.

Much of the learning regarding the interface remained tacit knowledge. It was clear
that some aspects of the interface were developed through default and nothing had
been formalised or written down. No official style guide was developed of the kind
produced, for instance, for applications written for Windows or the Apple Mac.
More than this, there had been no formalised user-interface meetings so far. This
created deep insecurities with respect to how dominant the layout would be in
'closing' some creative options for service providers, while at the same time
'opening' others. Service producers were already 'closed' in their choice of
authorware. They would have to use a RISC PC to create content. They would also,
however, have to obey the style rules for layout laid down by the interface. Om staff
became very aware of how 'good' screen design influences sales:

" . . . the good that happened was that the user-interface was strong and you
don't know that what you've done sells those boxes...I mean we took a demo
out to the states, the standard sort of demo and we got this big order how
much of that was influenced by the fact that they really liked what we had
done on the screens...you can't evaluate it there's no other way."

In 1996 Acorn developed the networked computer out of the STB design. This was
an Internet access device that used the television as a screen. In September 1998,
Acorn announced that they were withdrawing altogether from the desktop PC
market, and concentrating on set-top boxes for digital television.

Conclusion

As a tool of analysis, sociotechnical constituencies considers why a technology 'is as
it is' at a certain time within its development cycle, and how it reached this stage, and
relies on some understanding of where its origins lie, from both technological and
social perspectives. The acceptance and domestication of such products are also
conditioned by a complex of influences, including such elements as the symbolic
attributes of the product group. These symbolic elements are afforded sense through
the propagation of myths - utopian or dystopian discourses, marketing and PR, and
moral panics (via marketing and/or press reports). These always accompany products
in their diffusion through to consumption and use (see Fig. 5.2 below). Further, there
are the tensions and resolutions regarding how these are situated within the pre-existing
networks of social and technical relations and use functions within the
household, as well as the lives, lifestyles and everyday practices of potential consumer-users.

[Diagram: the sociotechnical constituency at the genesis of a project, spanning the industrial/market level and the intra-organisational level, set within the existing technological environment and markets. Elements shown include pre-existing ('off-the-shelf') technologies; technology developments; new forms of expertise; wider cultural, social, market and consumer trends; projections of markets/users/consumers; organisational expertise; potential partner organisations for alliances; and the new idea or product concept.]
Fig. 5.2 The sociotechnical constituency at the genesis of a project

Applying the above model to the Om case described earlier, the pre-existing
technologies were the RISC PC developments (motherboard, ARM chip, various
pieces of software), the Kryton project (casing and power supply), the ATM network card
etc. The technological developments included software linking the various
component parts together, work on the infra-red controller, the interface, and the
memory. New forms of expertise included specialist software and hardware technical
personnel, who were added as the project gathered momentum. Partner organisations
included the companies who were involved with the system technologies, and those
who later came in as part of the 'service nursery' – a term coined by Om managers to
refer to the 'safe environment' where content and service providers could learn about
the potentials of the new technology. The product concept was fuelled by the
companies' involvement in other technologies. This drew attention to the
technological potentials of their equipment that led to the genesis of the project (new
product concept). And last, it was felt that the time was right for such a technology,
and that there was indeed consumer demand for such systems on a very large scale.

It should be stressed here that these events or components of the constituency were
not only dynamic, but also interactive and interrelated. For instance, the need for
technological development (understood by the 'gaps' in the system when it initially
ran) drew in partners (SJ Research, to augment the technology of ATML). Pre-existing
Acorn technologies, such as their presentation software, fuelled the initial interest of
News Datacom, which in turn was the initial spark which directed the entire project.
News Datacom obviously had interests in such technologies because of their feeling
that there were to be radical changes in the media marketplace, which would demand
new technologies and expertise and so on. Each element links in a logical way with
the other in the analysis of a given technology development situation.

There are several points or themes of particular interest drawn from the story
of Om's development out of Acorn. Summarised, these are:
 The obvious configurational nature of the Om product and the
development process. At times it was difficult from the outside
perspective to say what the Om product actually 'was'. Its apparent
'openness' to interpretability was something which became recognised as
a strength rather than a weakness - particularly when Om made their 1995
business plan selling non-ATM, non-MPEG decoding boxes to Viewcall
and Lightspan. Such events destabilised internal interest in the trial, and
in general company orientations.

 The distinct design environment in which Om designers worked,
including something of the particular design logics (as opposed to user
logics - suggested earlier) which were apparent in Om.

 The compromise which lies between creating a product which has
industry-wide compatibility, while capitalising on and maintaining the
innovatory identity of the product. Om wished to create a
product whose technical specifications concurred with the maximum
number of standards and conventions. However, they wished to maintain
aspects of the system (i.e. Acorn proprietary authoring software and OS)
which would require either the purchase of RISC PCs and/or consultancy
for any serious content and system architecture development work.

 The use of demonstration technologies as catalysts for funding
opportunities, for project acceptance at the level of the firm, and for
acceptance in the wider market of potential client organisations, service
providers, and third-party developers.

 The sort of relationships which are happening within the sector - i.e. an
unpacking of the so-called converging alliances of technology partners.

 A striking example of how design processes which are divorced from
end-user-consumer input can lead to changes being made by the firm
which produce less effective and systematic results (i.e. the value placed
on attracting a 'hybrid' expertise to create new interfaces).

 The spontaneous and more intuitive way in which results were found, and
how these came to be seen as lacking value and in need of a fix (such as
importing hybrid forms of expertise to upgrade the interface).

 The ruptures between marketing and product development within a
company culture dedicated to maximising growth, manifested partly in
numbers of personnel (i.e. more marketing personnel than technology
developers) and partly in a lack of understanding of competencies and
capabilities (of marketers with respect to technology development).

 There was some contention with respect to what the Om product
actually was: the STB was continually evolving with respect to user
needs and requirements.

An important point which I would like to draw out was the separation of
development activities between technologies for the Cambridge Trial, the production
of demo STBs for business visits and trade shows etc., and the alteration of boxes for
clients' needs. Each of these required, and received, portions of the development
activity at Om.

Each strand of development work carried with it different communication potentials.
The Cambridge Trial evolved from its inception, quite obviously, into a much more
explicit PR exercise. Its operation, and the publicity drawn from showing the demo
boxes, had driven interest from a number of companies towards Om's STB, and particularly
towards its attraction as a cheap alternative to a PC. As Gregory (1965: p.16) noted:
"Design is concerned with making things that people want; with building up patterns
which have value." The original intended features and functionalities (i.e. MPEG
decoding for VoD, and ATM networking) which differentiated it from the PC, and
which were accented within the Cambridge Trial publicity, probably epitomised the kind of
elements these companies wished to cut back on. Also, the interactive industry as a
whole was moving towards (cheaper?) alternatives for the delivery of services - such as
digital satellite (ADSL was also not being discounted).

It would appear that neither the design team, nor anyone else for that matter, manifested
knowledge of other interactive television trials, or even of competitors' STBs. They
were strong in knowledge, however, regarding more immediate innovations in
component technologies, and in terms of more general innovations such as digital
television. One of the few direct comparisons that was conducted was with satellite
decoder boxes, as benchmarks for the robustness of consumer electronics. This was
important for realising STB manufacturing and production specs. They had also
bought a couple of CD-i players, and a 3DO machine to see what components they
used with a view to keeping costs down. Much of this sort of benchmarking and
reverse engineering was the designer's responsibility.

The interface layout and design was entrusted to a third-party developer, CADSOFT
Ltd. Some of the failure in communicating the functionality of the i-Tv
concept cast light on the potential problems of communicating the ingenuity and
potential of the system to other players and organisations. Also, varying
commitment, self-interest, and potential benefits influenced the relationships within
inter-company alliances, as well as coloured the visions and perceptions of the
successes and failures of trials. Cambridge Cable, perhaps through being an active
operation, already with a subscriber base and a revenue stream, placed an altogether
different premium on i-Tv experiments than did ICL, ATML and Om.

However, there were also problems within Om itself regarding the agile and
responsive nature of the Om product development process, responsive as it was to feeds
coming from the marketing division. The product development manager expressed
his reservations about the sales and marketing team making claims to
potential and actual clients that were difficult to deliver in terms of technological
development capabilities:
"One of the flaws in this organisation, which I've expressed at several
instances is that we have a commercial group which decides to go off and try
and sell product, no matter what. And I understand that, especially in the lack
of number of sites where trials are taking place and all the rest of it and you
have to commit to things that you normally wouldn't commit to, or things that
aren't quite what you're doing. But what there never is then is a reality or a
feedback to somebody in engineering about what it is they say they're going
to be able to do. There's more a commitment on the commercial business to
say 'we'll do it' and then it gets thrown into the pile with the rest of it in the
belief that everybody can turn circuits and actually make things happen.
That's one thing that I'm actually disappointed with in this place." (Dick Francis -
the Product Development Manager)

Regardless of this view that there was some non-alignment of sales with
development, others felt that the sort of relationships that existed, and the way in
which business was done, was the right way:
" . . . Its all very well launching a company and then saying year later here's
our first product . . . what we wanted to do was launch on July the fourth . . .
launch with a product which not only looks like something but it actually
does something as well . . . so that the aim from the very early stages, from
about April, was to produce something we could launch with as Om . . . so
they said start putting something together for that . . . there were a lot of
problems going on . . . for example we had problems with the box taking two
or three minutes to actually start up and things like that, there were all sorts of
configuration things going on we didn't really quite know how it would all
work, we were learning as we were going along."

This mode of working - learning as you go along - has been variously referred to as
'learning by doing' (cf. the earlier references to Arrow, 1964; Rosenberg, 1982; Von
Hippel and Tyre, 1995; Fleck, 1994; and Andersen & Lundvall, 1988). MacKenzie
and Wajcman (1985: p.10) described this as "feedback from experience of use into
both the design and way of operating things."

One aspect of the technology development which could not be learned by trying was
the development of the chipset for the second Om box, whose development
began before the trial, and essentially before Om itself began. Other problems lay in the particular
configurations of the STB components themselves, and the STB itself as a
component technology of the whole system.

Another important point was the way in which changes were made to the interface.
The interface development was viewed by management as an arbitrary affair as
compared to the technical specification of the box components. Perhaps this was due
to, or a symptom of, the artistic or creative content that shapes how interfaces look,
feel, sound, and operate.

Lab-based usability testing 'freezes' a situation of use in time, in order to evaluate the
features and functions of a technology. However, the design process, looked upon as an
iterative process - where iterations can take seconds, weeks, months and even years -
is also a process of developing usability from the designer's perspective. The
sound feature, initially deemed useful and latterly deemed a 'nuisance', is a
sure example of how, even from a designer's point of view, the usability, utility,
appropriateness, or even affability of a designed feature can evolve over time. This
raises an immediate question: is this the same for users?

Within the technical environment there seems a tendency to rationalise
developments. This may be due to the way in which increased specification (often
symbolised by numbers) is placed against economics (symbolised by figures) before
any creative work is attempted. When work is done, it is to accommodate the
physical, electronic, or functional aspects of the increased specification into a
technology which is built of existing parts, and potentially of parts that physically,
electronically, or functionally do not fit. In many ways such an outlook seemed to be
applied to the drafting of personnel, and to the appraisal of the creative/artistic elements
of the interface. Management rationalised that if a 'techy' could 'knock up' decent
interfaces, someone with a hybrid experience of HCI and graphics would be able to
create something even better. Such an attitude was similar in respect of the
development and building of the content and services capability outlined in the
following chapter.

A final point, which is perhaps not emphasised in the interviews but on which the
hardware designer made a comment (the 'alarming' rise in the number of staff), is worth
mentioning. On my first interview visit to Om's HQ there was a large concourse which
looked somewhat empty, apart from a large number of packing cases and several
desks. Over the next year there was a continuous expansion in the number of desks
and personnel, and there seemed to be continuous installation work being conducted in
the background, such as the fitting of wiring and cable. At its peak, the building was
almost full to capacity, and then throughout 1996 occupancy began to dwindle. By the official
end of the trial 'Om' had already moved back to Acorn's main HQ, and the last
contact phone call was with a single individual who had the responsibility for tidying
up the remnants of the trial. As of 1997, Om still existed. However it was not a separate
operating 'division' but rather a different 'hat' worn by Acorn. The development work, as
well as the propagation of visions, and the build-up of expertise, and its demise, all
happened within a three-year time frame.

Chapter 6 – The Cambridge Trial and Service Nursery

"This instrument, television, it can entertain, it can inform, yes, and it can even
inspire. But it all depends on the will of the humans who operate it. Otherwise it is
just lights and wires in a box." Edward R. Murrow

The Chinese sage Chuang-tzu, who lived in the fourth century B.C., relates the
following story:
As Tzu-gung was traveling through the regions north of river Han, he saw an old
man working in his vegetable garden. He had dug an irrigation ditch. The man
would descend into the well, fetch up a vessel of water in his arms, and pour it out
into the ditch. While his efforts were tremendous, the results appeared to be very
meager. Tzu-gung said, “There is a way whereby you can irrigate a hundred ditches
in one day, and whereby you can do much with little effort. Would you not like to
hear of it?” Then the gardener stood up, looked at him, and said, “And what would
that be?” Tzu-gung replied, “You take a wooden lever, weighted at the back and
light in front. In this way you can bring up water so quickly that it just gushes out.
This is called a draw-well.”

Then anger rose up in the old man’s face, and he said, “I have heard my teacher say
that whoever uses machines does all his work like a machine. He who does his work
like a machine grows a heart like a machine, and he who carries the heart of a
machine in his breast loses his simplicity. He who has lost his simplicity becomes
unsure in the strivings of his soul. “Uncertainty in the strivings of the soul is
something which does not agree with honest sense. It is not that I do not know of
such things; I am ashamed to use them.”
Introduction

The previous chapter chiefly addressed how Om emerged out of Acorn, and how, in
turn, the STB, its interface, and initial attempts at content, emerged from Om. This
chapter addresses more specifically the development of the trial itself, and looks at the
way in which Om approached the development of content. It then focuses
particularly on the perceptions of the senior manager responsible for the development
of services. He was the product champion or, at this stage in the development, the
main constituency builder.

The Cambridge Trial - the technology

The Trial's original objective was to explore the "enormous new opportunities for
lifestyle management offered by interactive multimedia, or more specifically, by
interactive TV."418 These opportunities would be facilitated by the provision of a
range of services aimed at addressing specific aspects of everyday routines, activities
and relationships.

As will be dealt with later in the chapter, this aim cast many visions and
visualisations among those involved in the project. These drove and shaped the
development of the technology and services. A simple example, offered by the
product development manager at Om, was home-based shopping through offering a
'generic shopping basket'. A standing order could be placed for essential items -
washing-up liquid, dog food, cereal etc. - and this would free up leisure time for other,
perhaps more enjoyable activities.

While such a proposition may now be familiar with the rise of Internet-based
supermarket brands, it should be noted that at this period the graphical WWW was
only beginning to manifest the first commercial websites. These were little more than
advertising sites rather than fully functional e-commerce sites. As such, this was one of
a large number of ideas put forward regarding potential uses of the system. Its
'openness' as a system elicited considerable brainstorming regarding potential
applications. The result of such brainstorming also, as was documented in the
previous chapter, gave rise to ideas on potential service provision partners – it
showed 'gaps' in an emerging idea of a full service network.419 In this case there was
the need for a supermarket.

418
Derived from Acorn's web site http://www.acorn.co.uk/aom/trial/phase3.html (updated 28th
October 1996). A press release of the time shows how others (in this case ICL) envisaged the lifestyle
impact of this new technology.

Linked to this [consumer] vision was yet another vision addressing matters of a more
logistical and supply nature. It was viewed that supermarkets, already distributed to
suit urban areas, would reconfigure their ratio of retail space to warehouse/storage
space. In effect, rather than being places where people travelled to shop, they would act as
warehouse/distribution centres for the home delivery of i-Tv-ordered goods. They
would incrementally adopt this role as i-Tv diffused into homes and its use became
more widespread.

The Cambridge trial was the showcase of Acorn's movement into the area of 'TV-
centric™ technologies'. It ran parallel to the organisational development of Om. It
was intended to show the possibilities and potentialities that existed for i-Tv in domestic
homes. The trial successfully illustrated both the competencies and limitations of Om
and Acorn, their technology, and their ability to foster and maintain essential
partnerships.

As will be illustrated in this chapter and the next, the trial placed its own pressures on
the firm, eventually calling for a need to make the trial autonomous, in both a
development sense (particularly with regard to content provision) and an economic
sense. The trial moved from being a concerted effort on behalf of the entire
operation, to one that was the responsibility of a small group. There was a split at
management level, with some quarters of Om staff clearly in favour of the
Cambridge Trial as a mechanism for generating attention and prestige for Acorn.
They viewed it as valuable in drawing attention to the firm and its competencies: "it's
nothing to do with the fact that we have a STB, it's because we have a trial where we
can show it working in a real trial." (Terry York - Interface Designer)

419
This was the title of a US-based i-Tv trial run in Orlando. However I use 'full service network' to
suggest a scenario of a suite of services which are complementary and mutually exclusive – if you want,
a microcosm of the high street – bank, supermarket, post office, shops etc.

However, there were others who held the opposite view – that it was commanding far
too much expenditure and development effort. This latter group believed that it could
damage the entire company financially, and preferred instead to
concentrate on more immediate and less ambitious development opportunities which
had arisen (partly out of publicity arising from the Cambridge trial). This was a
powerfully felt rift, with those in favour of the trial expressing the view that without the trial, Om had
"nothing."

While the original planned broadband services placed a strong emphasis on (the
technically sophisticated) video presentation of on-screen data and information, what
emerged by phase three was a (much less sophisticated) system and service. This was
operationally and presentationally more reminiscent of using the World Wide Web
made available via a television screen. This change in technical and operational
orientation was related to a number of different organisational pressures on the
business and technology to which Om were subjected.

CITVIC and the technology of the trial

As mentioned in the last chapter, Om received venture capital in spring 1995. One of
the criteria in making the money available to Om was that they trial their technology.
As entry into the BT trial was impossible, and discussions with other companies who
would stage trials were in an elementary stage, it became clear that the only way in
which to fulfil the criteria for funding was to propagate their own trial.

By early 1995, the consortium (Cambridge Interactive Television Infrastructure
Consortium - CITVIC) of technology partners comprised Acorn Online Media, ICL,
Cambridge Cable and Advanced Telecommunications Modules Ltd (ATM Ltd):
 Acorn Online Media - set-top boxes and systems
integration
 ICL - large-scale media servers and network management
 Cambridge Cable - the network provider
 ATM Ltd (ATML) - medium-scale video servers and
ATM technologies

The Cambridge trial system configuration (at the time before deployment of what
was eventually termed phase 3 - the NC) comprised a fully switched digital overlay
Asynchronous Transfer Mode (ATM) broadband network designed around
Cambridge Cable's existing fibre-optic infrastructure. This provided a bi-directional
link to each user using optical fibre to the kerb and then standard coaxial cable - the
type of wire used for television aerials - for the last few metres into the home. In the
home itself, Om's intelligent digital set-top box provided the interactive interface via
the customer's TV set. ATML's 155 Mb/sec ATM network Virata Switches route
data to ATM access switches housed in existing kerbside cabinets in a fibre-to-the-
kerb (FTTK) configuration.

The video server used in the Trial was ICL's large-scale Parallel Interactive Media
Server (PimSERVER - which could support a user population of 7000 connected
homes, with around 2000 homes accessing the service continually). The server is
fully scaleable up to a multi-terabyte storage capacity, and serves all current users of
the Trial with the same content or service simultaneously. Video servers and other
content servers need not be located centrally. High demand or local interest content
and services are distributed at various points around Cambridge Cable's ATM
network - and beyond - providing a highly responsive and manageable on-demand
system. All the Technology Partners are actively involved in contributing to evolving
standards through international bodies, such as DAVIC, DVB and the ATM
Forum.420 The Trial infrastructure was based entirely on European-developed
technology and much of the equipment that was used was based on the low-cost,
low-power consumption ARM 32-bit RISC processor, including Acorn Online
Media's digital intelligent STB. Later SJ Research, another Acorn-linked company,
entered CITVIC with particular expertise to augment ATML in the switching
technology.

420
The Digital Audio-Visual Council (DAVIC) is a non-profit Association based in Geneva,
Switzerland, aimed at promoting the success of digital audio-visual applications and services based on
specifications that maximise interoperability across countries and applications/services. DAVIC is a
strong endorsement of its vision of a digital audio-visual world, where producers of digital audio-visual
content can reach the widest possible audience, users have seamless access to information,
carriers can offer effective transport, and manufacturers can provide hardware and software to support
unrestricted production, flow and use of information. Established in August 1994, the DAVIC
membership currently includes more than 200 companies from more than 25 countries around the
world and representing all sectors of the audio-visual industries: manufacturing (computer, consumer
electronics, telecommunication equipment) and service (broadcasting, telecommunications, CATV) as
well as a number of government agencies and research organisations.

The Digital Video Broadcasting Project (DVB) also includes over 200 well known organisations in
more than 30 countries worldwide. Members include broadcasters, manufacturers, network operators
and regulatory bodies, committed to designing a global family of standards for the delivery of digital
television. DVB standards are developed for each delivery system; a set of User Requirements is
compiled by the Commercial Module. These are used as constraints on the specification. For example,
in the case of DVB-T, user requirements outlined broad market parameters for a DVB-T system
(price-band, user functions, etc.). The Technical Module then develops the specification, following
these user requirements. The approval process within DVB requires that the Commercial Module
supports the specification before it is finally approved by the Steering Board.

DVB-compliant digital broadcasting and reception equipment for professional, commercial and
consumer applications is widely available on the market, distinguished by the now instantly
recognisable DVB Logo. Numerous broadcast services using DVB standards are now operational, in
Europe, North and South America, Africa, Asia, and Australasia. DVB standards are open and
interoperable. Following approval by the Steering Board, DVB specifications are offered for
standardisation to the relevant standards body (ETSI or CENELEC), through the
EBU/ETSI/CENELEC JTC (Joint Technical Committee), the ITU-R, ITU-T and DAVIC.

The ATM Forum is an international non-profit organisation formed with the objective of accelerating
the use of ATM (Asynchronous Transfer Mode) products and services through a rapid convergence of
interoperability specifications. In addition, the Forum promotes industry cooperation and awareness.
The ATM Forum consists of a worldwide Technical Committee, three Marketing Committees for
North America, Europe and Asia-Pacific as well as the Enterprise Network Roundtable, through
which ATM end-users participate.

Cambridge i-Tv Trial Services

As detailed in the previous chapter, having eventually proven that the technology
could be made to function reliably and consistently, the Cambridge i-Tv Trial
technology partners looked towards providing real content and services on the
network. This was achieved by attracting major media, finance, retail and
distribution companies with interests in interactive multimedia. These firms were
interested in the potential of networked multimedia, especially how it may affect,
enhance and impact their business.

The enticement to participate was that the trial represented a safe, relatively low-cost
environment for rapid interactive service development. The 'service nursery'
comprised a framework of systems and procedures established to make the
process of interactive service development and evaluation (according to Om
paraphernalia) as "painless as possible." It was further claimed that the members of
the service nursery have derived extremely useful experience, data, relationships and
international PR from their participation.

As of June 1995, the service nursery participants or Principal Service Providers
(PSPs) included:
 Acorn Computers Ltd.
 Anglia Multimedia - part of the media group MAI.
 BBC Education - broadcaster.
 BMP DDB Needham - advertising agency.
 Tesco Stores Limited - the UK grocery retailer.
 The Post Office - world-class deliverer of goods and services.
 IPC Magazines - the UK consumer magazine publisher and part of Reed
International.
 ITC - regulator and strategic technology developer for UK commercial
TV.

As of 1995, the Cambridge Trial could then be roughly split into two separate
constituencies: the technology partners - CITVIC - and the content and service
providers - known as the principal service providers (PSPs) - who formed the service
nursery (see below).
Fig. 6.1 The member firms of CITVIC (as of 1995) - Cambridge Interactive Television
Infrastructure Consortium (inner circle) & The Service Nursery (outer circle) (picture courtesy
of Acorn Om).

The development and thinking behind the Service Nursery

The person who was chiefly responsible for building the services and content of the
system was Marcus Penny. He officially joined Om when it first formed in July, and
moved into the Cambridge Technopark headquarters on the 1st of Aug, 1994. He had
previously worked for Acorn in 1983 where he ran the personal computing
development group. After joining PA Consulting he was involved in a number of
domestic multimedia system projects in the late 1980s. One of the projects involved
Shell, who placed one million CD-i players with telecommunications links into
people's homes. Multimedia players with telecom links convinced him that such
systems represented viable business propositions. This was confirmed by further
work with Philips Corporate and the BBC Board of Management.

He was also aware of the technological capabilities of the Acorn group (having
previously worked with Acorn):
". . . being aware of the technology that Acorn had and particularly the ARM
processor that was a very low cost intelligent set of technology building
bricks for constructing things . . . and being involved over the years in
looking at ways in which that might be used . . . and I guess three, four years
ago I became very aware that this opportunity [for domestic networked
multimedia] was on the cards and it could happen particularly due to the
conditions in the UK being quite good . . ."

His feeling was that it was not only a technologically feasible project, but that there
was now a market ripe for i-Tv-style services. However, he was aware that it would
take much more than technical capabilities to drive a successful project. For instance,
the BBC project showed that they were already developing interactive media through
a dedicated department. They also had a number of other departments that were
investigating or producing interactive material.

It was clear to him that multimedia was going to become much more mainstream for
them as a medium, and there was strong evidence of a number of different ways in
which this would directly impact on existing core BBC activities. One of the aspects
that PA was exploring was: Who exactly were the disparate groups who were
working with multimedia within the BBC? The PA work with the BBC spawned a
number of responses including the production of The Net (a BBC2 programme on
digital culture and technology) and the BBC Networking Club (the BBC's Internet
provision):
" . . . people like the BBC had seen the effect of digitalisation on the
production part of the supply chain what we're not talking about is the effect
on the delivery end which is actually even more profound, far more profound
and it leads to a fundamental restructuring of the industry . . ."

In addition to the work at the BBC was Penny's involvement with the Federation of
the Electronics Industry (FEI) in the UK. He had been leading the multimedia
working group since 1992. This provided him with a rich variety of industrial
perspectives on how i-Tv could develop.
"I became aware quite early on it would not happen through any one
organisation, but through a combination of things . . . and you really need a
whole set of skills all the way down the value chain from the entertainment
and service providers . . ."

Already clear from the Shell project were the significant business opportunities that
lay in the provision of services. However the wider industry perspectives were
suggesting entertainment was the way to stimulate needs and demand. They also
suggested the considerable technical problems involved in building a robust and
reliable system.

Business Case for Content and Services

The Cambridge Trial required not just one piece of technology but a whole suite of
technologies, from the STB to the network and server - a technological system (cf.
Hughes, 1983) - which not only all have to work at an individual level, but must also
work adequately in concert. Penny's view was that the technology was only one
'slice' of the entire project, "actually only the starting point." He was aware that there
had been a number of exercises over the last ten years which had struggled to reach
the point where it actually does work, and had then "fallen down in a heap."
"OK we've got a technology to work, you know, why doesn't it go as a
business . . . and I've reached the conclusion seeing several of these things set
up and then fail that actually the key to this was taking the service perspective
and getting that right . . . and that actually you have to take the step of having the
bits of technology to make it possible at all, but then once you've got that the
way you've got to think about it to drive it is from the customer point of view
what do they want, that you can deliver, at a price they're willing to pay.
That's sufficiently different and new and I think that given the newness of
what we're doing is actually a rather significant step forward . . . just more
convenience of something we already do . . . "

A question remained concerning how you could make that work and then manage all
the logistics of delivering it as a sustainable business. Penny viewed this as
something where you have got to get the whole value chain working, so that
everyone [each partner and consumer-users] could envisage a suitable reason for
taking part. It would be a 'full service network' where all participants would interact
and provide complementary services. This required, first and foremost, desirable
content, as well as the creation of content supply chains and all the
different bits of technology slotted together into an end-to-end solution. On top of
this was to sit the whole service chain, which included supply chains for the delivery of
goods and services. Penny was adamant that it was the big picture - the entire
constituency - which needed forming:
" . . .what you are really talking about is the birth of an industry . . . unless it
all works together none of it works . . . take what we've seen over the 10
years, maybe 20 years is various false starts of bits of it . . . I think we're now
just at the point where enough work's been done in all the different areas to
stand a chance of the whole thing flying . . . probably where the least work's
been done, that's where I've been concentrating on, the service aspect of it . . .
taking that particular perspective what does the customer want how can we
deliver it regarding everything else including content and infrastructure and
technology as a means to provide it . . ."

Over the previous couple of years, he had engaged in some conversations with
Acorn regarding new avenues that they could be exploring. He viewed this as a
further influence on what culminated in Om and the Cambridge Trial. These earlier
discussions were with the then technical director at Acorn, and revolved around the
opportunity he envisaged for a number of digital technologies which could be
developed, particularly within the UK, regarding networked multimedia, personal
digital assistants (PDAs) and other portable devices.

At this time Penny himself was looking for a development environment that was
prepared to be innovative, prepared to be forward-looking, and was used to the idea
of making investment in technology development. What was clear to him at this time
was that the scale of changes with respect to the 'digital revolution' was such that you
could not make an impression without making investment. There was not going to be
any benefit without a company being prepared to put in considerable time, effort
and money.

From his consulting perspective he could see the changes that were happening, but
intrinsically, a consulting organisation is not geared to investment. Meanwhile Om
appeared to be looking for someone who had industry experience combined with a
service perspective. He was enrolled into Om on that basis.
". . . since then I've been exploring, we've all been exploring what we do with
the market how we make it work and understanding this complex questions
of how the services part relates to the technology product part and how to
make that work and what in a sense this example is . . . because that is
perfectly understood in industry . . . the general industry perspective is this
infrastructure and bits of technology and there's content . . . and to my mind
there's a long history of people who actually succeeded in fighting all the
barriers and all the difficulties that there are . . . there's many difficulties in
bringing technology and content together and it hasn't worked or its hasn't
worked to the extent that people thought . . . Prestel sort of worked and found
some niches like travel agents . . . but it was nowhere near the potential they
thought it was . . ."

While Dave Swallow may be regarded as the product champion for the STB (detailed in
the previous chapter), Marcus Penny may be considered the champion of content and
services. Pragmatically, he would draw on his consulting expertise to work out how
content and service development could fit into the general Om business case. This
was problematic as it concerned what aspects of the product - services and
technology - could be viable at this time, a stage at which any market for i-Tv was
"very, very embryonic." Om were witnessing the opportunity to take their technology
and produce a product - the STB - that would meet the market need for i-Tv, if such a
need existed or could be developed. His job was to anticipate, develop and forecast an
imaginary market.
Impressions of the demo

Penny had originally started these initial discussions regarding services and content
provision with the fledgling Om group in early 1994. This was when he first
encountered the early STB demo outlined previously in Chapter 5. Unaware of it
being a demonstration machine, he was, as was the Acorn management board before
him, deeply impressed by its appearance: "I thought, yes, that's actually what's
needed to make this fly from the technological viewpoint." The key features he
identified were the system's ease of use, its good usability. This, naturally, was an
apprehension of the user interface – that most public aspect of the system which
represented the services, presented the content material and governed the way that
people interacted with the system (and the major focus of HCI and usability studies).
" . . . I think my subsequent perception of the people who had created it was
'here's a product, a piece of hardware . . . and this . . . what we've got here in
the user interface is just a demonstration of what the hardware can do' . . . I
think my feeling at the time, and I think it's been borne out through the
conversations we've had . . . a key part of the value of what Om has, is the
thinking that went into that user interaction . . . that's in a sense quite difficult
to see because in order to see that actually has value, you've got to look at it
from a services perspective . . . rather than a throw away piece of software."

The demonstration box convinced Penny that Om had indeed created something of
real value. Within it he saw a technological manifestation of his personal vision of
what networked multimedia should be (this relates very strongly to how the actual
features and functionality of a technology, congruent with anticipations, may
enhance its value or symbolic worth).

Interface

Marcus Penny considered the original user interface developed by Terry York to be
the "right expression" of the potential of the technology:
" . . . my experience of innovation is often the things that somebody does . . .
throws together in a couple of weeks because 'oh, that's sensible to do' . . .
and that has been better than some of the leading players in the world has
been recognised as the best example of a user interface that they've seen . . .
and it was just, you know, somebody or a couple of people just put it together
. . . part of the reason is that you are escaping from the mind set of previous
development . . . and part of the key thing is you've got to make this work
you've got to take the technology that's in PCs and bring a completely and
different mind set to it . . . and almost everyone else . . . everyone else in the
industry has got too much of the PC mind set or maybe its the TV mind set to
actually see with a fresh look, oh this is actually what's needed' . . . I think
that sometimes it's a case that things are done and it's not fully understood why
they're done and once you've got there you may not do the right things to
exploit it because you don't understand why its right . . . you need the fresh
creativity side to create it and then you see around it the understanding from
the business perspective is well the key elements in that is this, this , this and
this for these reasons . . . and that may need to be from a different perspective
than the perspective that created it . . . I think you could draw a map of how it
happened I think in most cases it was not consciously managed . . . a few
things that happened having in a sense happened by accident . . . I'm not sure
I particularly believe that either . . . when you look at it from a full enough
perspective . . . these were the things that were needed and different things
and different people contributed and the whole thing added up and had a
momentum of its own even though lot of the people involved didn't
understand at the time . . ."

Regardless of Penny's sophisticated philosophies regarding the innovation and
reality of what had been created, this was not something which was consolidated
through management meetings, as he found it "quite difficult to communicate
because it is a different mindset." He was referring here to the mindset that is
particular to different company functions:
" . . . there is also a view I find in technical environments . . . I used to be a
technical person . . . and now I think I'm more of marketing and service
person . . . from a technical perspective things are not valued unless they took
significant effort to achieve . . . and actually the things that are valuable may
not be things that appeared to take significant effort and technical skill and
pushing the technical boundaries . . . what an engineer says is 'I've done this
really difficult thing and that's it' and actually from the point of view of the
need the valuable thing may be the thing that was just thrown together in five
minutes of spare time . . . actually the thing which was thrown together in the
five minutes of spare time drew on all the accumulated experience, but
because actually doing it, it's often not valued by the people who do it . . ."

After Penny's arrival at Om he took stock of what there was in terms of technology,
people and content. He compared this with perceptions of what was valuable
according to his consultancy projects at Shell and the BBC. This served to guide his
strategies of taking content and services forward and to develop them within Om and
the trial.

However, there were distractions due to the handling and relating of what was a very
dynamic technological phenomenon. As touched upon in the previous chapter, Om at
this time, through its marketing and the PR had drawn an enormous amount of
interest from a very heterogeneous blend of people and companies. Just simply
managing this interest in some systematic way, 'surviving' day to day constituted
major effort. It drew considerable resources simply managing the level of input and
interest. This impact was by no means one-way. Marcus Penny would see that people
would come to Om and leave with their minds 'profoundly changed'. Om had in
effect, clearly created an extremely charismatic product, fuelled by their visions and
rhetoric of what it could and would do. However, such a profile, cultivated and
handled properly, would lay the conditions for the success of the project. This was
certainly the view held at this time.

The content and services

Marcus Penny was optimistic that the development of the STB by mid-1995, and the
Cambridge Trial, had unfolded in the best way possible:

"I think the key to why I think that we're got something that's valuable here is
putting the STB together, identifying what the need was, creating a
demonstrator and lining the sights up to let it free run . . . and allowing the
right creative inputs to flow, not to be dominated by any particular
perspective . . . so something that's created that worked . . . the second thing
was setting up the Cambridge Trial . . . I think there had been a lot of
ambivalence concerning the Cambridge Trial and some understanding of why
is it here and is it valuable . . . the Cambridge Trial was one of these things
which was absolutely the right thing to do."

However, Penny came to be more than the product champion of content and services.
When the climate within Om turned against the Cambridge Trial (due to the business
success of adapted boxes for Lightspan and Viewcall) – essentially splitting Om
management into two camps, one pro-trial and the other anti-trial – he also found
himself champion of the Trial as an entity in itself. The main question regarding
the Cambridge Trial was the expenditure of a major part of Om resources on a
project from which they could not envisage any immediate return. Not only was this
felt within Om; this had already been the rationale held by several of the technology
partners for some time.

There had always been, from the beginning of the project, some degree of imbalance
in the commitment of the various consortium members –
both in the service nursery and in CITVIC. Each had different levels of investment
and motivation. This was reflected in scepticism from some quarters, and lack of
drive in others. Om clearly had the most to gain from the success of the trial, but
nevertheless had to invest the most time, money and resources to make it happen,
and to drive it along.

Marcus Penny, however, argued that the two greatest assets that Om possessed with
respect to a global industry perspective were the STB and the Cambridge Trial itself.
He felt that these were unique and valuable elements which, together, differentiated
the company from others in the market for STBs. Indeed his very existence within
the project (as was that of the entire team involved with the trial's maintenance)
was validated only by a sustained focus on content and services.

Throughout the period marked by adversary at Om towards the Cambridge trial it


was defended by the Penny although at times he admitted to being somewhat isolated
and "sort of friendless." However he continually argued a pragmatic line that crucial
'gaps' in the system, or in expertise, would only ever be realised through the
experience of gleaned by conducting trials.421 This is a distinctive problem when
nobody has an individual prerogative ownership of the system. When it is a
consortium and a group activity learning can only arise when individual companies
execute their competencies and find (and can report to the other consortium
members) that something is lacking. Developing STBs without complementary
system components is rather like developing television sets without adherence, or
knowing the specifications of broadcasting equipment. By Om asserting 'ownership'
421
Interesting to note here is how the pragmatic arguments for keeping the trial going were based on
its ability to test the technical system. This contrasts with Marcus Penny’s feelings that the trial served
a much less tangible and symbolic role with the STB, such as distinguishing Om from other hardware
manufacturers making STBs.
or at least responsibility for the trial it was in a unique opportunity to capitalise on
the learning deriving from the entire implementation. He felt it was Om's strong
interest to be sole owner of the trial and its vast potential for generating knowledge:
"That meant that broadly no one was owning it and when I joined it became
clear to me that there was a commitment to launch on Sept. 30th . . . the way
things were going there was no way we were going to get there so I actually
stepped in that point and did three or four things that needed to be done both
by us and stimulating and pushing several of the partners to do things to make
it happen. In that we've had to take on some things that in the initial scheme
of things that were not to be what Om were supposed to be doing."

However, the 'learning by doing' aspect of the trial had clearly dragged Om into a
number of different tasks of a predominantly technical nature which they did not see
themselves as being responsible for. He stressed that the trial consortium was not a
conventional organisation. He saw a further task for those initiating such
organisations as being to create something of an understanding of how to learn to
work together:
". . . [these kinds of business relations] I think, are going to be a key part of
this new industrial era we are entering . . . I think you have to learn how to do
it and its based on the situation where no organisation can possibly do things
alone . . . you've got to learn how to work with other organisations and
particularly work with an influence - people - over which you have no direct
power . . . it proceeds by practical evolution because this situation is so
complex that it is impossible to predict . . . you cannot gain enough inputs
and hold them in the same mental picture . . . so you perceive by gaining
some insights putting out some feelers, trying something getting the
feedback, tuning it . . . and then out of that you discover things that work . . .
and as you discover things that work you sort of look at them a bit more and
you understand them a bit more about them, and then more consciously apply
to that and elsewhere . . . It is a huge challenge to conventional corporate
culture because it just collides with traditional bureaucracies . . . it is re-
engineering, but the conventional approach to re-engineering is that you look
at it, you analyse it, you stand outside of it, and you re-design, and then you
impose a new design it doesn't work . . . as we used to say in PA phase 2 of
re-engineering is the rubble . . . it is re-engineering but it is re-engineering by
practical doing and exploration and testing a bit at a time . . . because you
can't throw away the organisation . . . because that's the only thing you've
got . . . you've got to evolve it . . . hence the concept that I think we're
founding and we've been developing here about how to move forward with
services . . . the idea of a service nursery."

If, indeed, one of the problems was mapping out influences on the development, then
perhaps here is the place where a framework such as sociotechnical constituencies
could find value. It would certainly help as a 'cognitive tool' in building the
necessary mental picture of the complex of influences impacting development.

The Service Nursery

This brings us to what was perhaps the most entrepreneurial organisational
innovation which arose from the trial: the service nursery. The underlying concept of
the service nursery was Marcus Penny's, although Dave Swallow originally coined
the actual name. He picked it from within the discussions that he and Penny had
engaged in with respect to the strategies that were being formed at the time to
involve content and service providers. They felt that 'service nursery' really did
capture what they (the content and service providers) were consciously doing – their
learning process and the set of conditions that made it work.

The 'nursery's' prime objective was the provision of a 'safe' environment in which
to undertake the restructuring, re-engineering and reorientation necessary for expanding
existing business and business practices into digital media. They were now
consciously selling the PSPs - NatWest, Tesco and so on - the notion that there
was about to be a huge change in the way people sold and accessed goods and
services. This was a time, of course, when it appeared that there was something of a
competition between i-Tv-type services and Internet-based services to provide online
access to goods.422 This was going to impact, not only their business, but all business
on many levels, and across many, if not all, industry sectors. The nursery presented a
real opportunity to get a feel for what would be involved in this new world. Participants
would have a head start on their competitors, developing new kinds of competency,
and building early alliances with others who were ahead of the field. There was the
suggestion that those who lagged behind in this revolution would find joining at a
later date more costly and more resource-intensive. This is very reminiscent
of the utopian forms of discourse discussed in Chapter 1.

422
The first ‘e-commerce’ web site is cited as being Pizza Hut in 1994. In 1996 the US Government
allowed the suffix “.com” to be released to the public domain for sale. This produced an exponential
rise in commercial web sites.
One of the early service relationships was with ITN. Om received permission from
ITN to use their material to demonstrate a news service. They also explored some
ways in which they might work together longer term. Marcus Penny saw an
opportunity to turn such relationships into ones where Om could sell consultancy in new
media development. ITN at this time was searching for digital alternatives through
which to distribute its news content, and had already set up an on-line news link
where people could download news to their computers. However, the Om/ITN
relationship was a mutual venture and did not generate revenue.
"I don't think that was a surprise I think I'd expected and we'd expected to
make business out of that and I think that still is . . . I think the new
realisation from dealing with ITN for example was that an end result that was
simply advice or knowledge was not sufficient that what was needed for the
service nursery proposition was actually providing engagement . . . and
participation."

Om was trying to sell ITN consultancy at a time when others, such as IBM, were
prepared to provide equipment and help for free. Om would have had some difficulty
competing against that.

However, the real problem for Om was turning the vast amount of interest that the
trial attracted into actual business relationships. There were a lot of people at the
time who laid claim to some involvement in digital media, but in Marcus Penny's
perception they lacked knowledge of the specifics of technology and strategy. He
realised that if the objective was to sell STBs, servers and knowledge, Om would have
to enter business relationships in which they would have to educate people - which
represents a genuine investment of time and money:
". . .[it was] rather hard work . . . I mean the service nursery actually when
you look at it supplies quite a lot of needs its quite a complex set of needs . . .
you need a trial as a reason to do it because until you've got actually
something specific to work with . . . either a service you can offer through an
existing delivery mechanism or something in which a group has been
participating as an exercise which is consciously designed for trialing . . . its
hard to turn an interest into an actual working partnership."

Penny had been wrestling with the issue of content, and how to use it to build the
business case for the trial. He realised that the only way of stimulating content
provision was through engaging service providers which would be willing to pay for
the opportunity to learn how to provide services in this new environment. Offering
pure consultancy did not work because the market was not sufficiently educated to
understand the possibilities, pitfalls and potentials of i-Tv. They would not be
prepared to pay money for 'pure advice'. The service nursery was Penny's direct
response to early thoughts on how to entice paying service providers into the trial.

The service nursery has analogies with product and consumer tests with actual
working prototypes, albeit at a more systemic level. The more one gets towards a
viable, working, fully configured system, the more reliable, viable and realistic the
results of any feedback one gets from partners, managers, trialists and so forth.
The notion of the trial as a 'living environment' for testing and trialing services arose.
The living environment alludes to the real world situation in which the technology
and content/services are consumed and used – what I have referred to as a kind of
experiential 'biosphere' (Nicoll, 1999). The intention is that it will provide
naturalistic data above and beyond that which may be gathered by other methods
(focus groups, market surveys etc.). This was in keeping with the underlying
principles of the service nursery concept as seen by Marcus Penny. Such a way of
trialing technology and its market potential is also a kind of grounded theory (i.e.
Glaser and Strauss, 1967; Glaser, 1978) approach, but applied to technology
dissemination. Issues, including user perceptions, feed back into a wider constituency
of learning. It can suggest not only how a company's business may be 'impacted
upon' or even 'impact' this 'new media age', but, more immediately, how the business
may be reinvented in the light of these sociotechnical changes:
". . . in this service perspective the idea that there such as thing as service,
and it is an issue . . . because we're used to, we know what television is, we
know what books are . . . the idea that you have to consciously think about
and shape and direct the nature of the service rather than just supply content
to it is a fairly new one . . . its a story that seems to touch a chord . . . because
I think there are a lot of people out there who realise that the existing pattern
of services that we're used to and have been established for and remain
unchanged for decades are inevitably going to change . . . and what do you do
about it . . . and I think what we're doing through participation in the service
nursery is giving service providers some control over their destiny . . . in that
they understand things about service provision which they probably didn't
understand before because though they did them they weren't conscious . . .
and they can then . . . they are also developing skills and working experience
and so on the basis of that knowledge they can actually take their business
forward . . . there are lots of people out there in the service industry across the
whole sector whether its banks or retailers or entertainment . . . we used to
providing this the whole ground is shaking under our feet and we can't do
anything about it . . ."

This was perhaps an idealised vision of what the service nursery was capable of
providing. There were, of course, many problems that were purely technology matters.
Non-operational technology had primacy as a matter of concern over matters of
content provision. For instance, Om had been through this system learning experience
the previous year [1994] when it realised there was a 'big hole' in the switching/
delivery mechanism, particularly the network switches - the 'squids' - which were a
significant part of the system. This was something which, in theory, ATM were
responsible for; however, in practice it did not sufficiently fit their business interests
for them to actually solve the problem, and this led to the inclusion of SJ Research
(switching experts) as a further member of CITVIC.

User Research

On the subject of user research and, in particular, the issue of on-line questionnaires,
Marcus Penny was well aware of the potential of the system to produce
knowledge and learning. He was conversant with notions of how both explicit and covert
ways of realising what people were doing with the system had value. However, his
preference was for inference from 'what they do' rather than 'asking them questions'.
Of course, this was something which was just becoming possible for the first time,
and was an inherent aspect of digital system use. The sys-log data produced by the
system would reveal which households were using which service/programme, and
what the interaction style was. It would detail how they responded to, and chose,
items presented on the screen. His opinion was that this approach would lead to
'real answers' (as opposed to questionnaires, where people could 'lie').

This was recognised as a powerful feature and was, indeed, the main enticement for
National Opinion Polls' (NOP) interest in exploring the potential of the system.
Marcus Penny claimed that Om would be working with them to make the correct
inferences from the sys-log data, which required considerable software development
of the analytical tools. With this data, Om and the PSPs would then increasingly
tune the choices that they presented, working in an iterative fashion until the right
inferences were made.
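To give a concrete (and purely illustrative) sense of the kind of inference involved, the short sketch below – written in Python, with an invented log format, since neither the trial's actual sys-log schema nor NOP's analytical tools are documented here – aggregates paired 'select'/'stop' events into minutes of use per household per service, the sort of 'what they do' measure from which preferences might then be inferred:

from collections import defaultdict
from datetime import datetime

# Hypothetical sys-log records: (household, timestamp, service, action).
# The real trial's log schema is not documented; these values only
# illustrate the 'infer from what they do' approach Penny described.
LOG = [
    ("HH-012", "1995-06-01 19:02", "VoD", "select"),
    ("HH-012", "1995-06-01 19:40", "VoD", "stop"),
    ("HH-031", "1995-06-01 20:15", "Home Shopping", "select"),
    ("HH-031", "1995-06-01 20:22", "Home Shopping", "exit"),
]

def usage_minutes(log):
    """Aggregate minutes of use per household per service from paired
    start/stop events, rather than asking users what they watched."""
    sessions = defaultdict(float)
    open_starts = {}
    for household, ts, service, action in log:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        key = (household, service)
        if action == "select":
            open_starts[key] = t
        elif key in open_starts:  # "stop", "exit", etc. close the session
            sessions[key] += (t - open_starts.pop(key)).seconds / 60
    return dict(sessions)

if __name__ == "__main__":
    for (household, service), minutes in usage_minutes(LOG).items():
        print(f"{household} used {service} for {minutes:.0f} minutes")

In practice the analytical tools would have had to cope with far messier records (missing stop events, boxes switched off mid-session, and so on), which is presumably part of the 'considerable software development' referred to above.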

Om and NOP would also provide questionnaires on screen for people to complete, but
Marcus Penny felt that there were stages beyond this which they wanted to reach.
These stages were characterised not by providing questions, but rather by providing
experiences or experiential choices, and monitoring reactions. He viewed this as a
"whole new approach", an entirely new way of doing market research, which surpassed
the problems of interpretation inherent in questionnaire use; this included the
incorporation of vignette-style programming where people would respond to
alternatives.

But these visions of innovative market research were not yet realised in the real
system. User-centred research/design is made more problematic when you do not have
a fully operating system with all its branches and avenues open. And this was the
status of the trial system at this time.423

Such a view bears a strong relation to social shaping theories of innovation, standing
opposed to more simplistic models of linear innovation by recognising and bringing to
the fore the feedback loops occurring at all stages of the innovation-diffusion
continuum. Returning to the theme of order arising from chaos, he viewed standards
formation as arising from such crises: 'if you believe in the approach, natural
standards emerge out of a dynamic process and are stable because the system keeps
them in place'. He saw the crucial role of the 'new manager' (one in keeping with the
new style of organisation) as essentially the management of processes
of crisis and chaos:
". . . its something which you cannot plan and direct in the way that your used
to it in a mechanical view of the world nevertheless there are structures and if
423
Indeed, as will be more fully illustrated in the next chapter, the trial never had all its menu branches
and programming fully operational, nor did it properly refresh content to a level which would
stimulate trialists to regularly access and use the system.
you understand the behaviour particularly in the moving from stability to
another . . . you can encourage that process . . . if you understand what lies
behind that stability you can encourage or interact with it . . . but you've
actually got to observe quite closely what's happening."

The services offered on the trial

The nature of services is something much more fixed as a social phenomenon
than multimedia content is as a technical one. The system architecture and general
underlying concept of interactive on-line services are, to a large extent, analogous to
existing real world institutions and practices. Ordering goods through the i-Tv system
relates to the already existing practice of using a credit-card hotline; ordering a video
on-line relates very much to going to the video shop, choosing and renting a video,
and so on.

There is a clear relation, from the user's point of view, between the method of
appropriating on-line goods and services and how they appropriate goods and services
in the real world. The ways in which the goods are paid for, called up from stock, and
finally delivered mimic, in an electronic fashion, the processes and relationships
which exist between the retail organisation, its bankers and its distribution networks.
On-line services bear a direct relation to their real world counterparts - they are virtual
representations of real world practices.

Content, on the other hand, is a much more ephemeral product or phenomenon than
services. Content relates to the style, interactive elements and aesthetic components
of what is presented or represented on-screen. It may represent a company, product
or service and consists of variations in screen layout, buttons, graphics,
video, etc. - essentially it is the interface between the user-consumer and the
company or institution which is providing the service or selling a product. It has both
functional and aesthetic properties, and as such requires a high level of creativity as
well as innovation in its production.

The services offered on the trial (as of June 1995) included:


 Video On Demand (VoD): including TV programmes, films, music
videos, documentaries and features.
 News Service: national TV and radio news from BBC, local TV indexed
news and weather from Anglia TV.
 Audio On Demand: BBC radio programmes including drama, comedy,
sport, current affairs and education.
 Home Shopping: well-known high street names, including Tesco and
The Post Office.
 Home Banking: NatWest customers can access their bank accounts and
perform transactions.
 Education: interactive learning programmes from Open University, BBC,
Sherston and Anglia Multimedia.
 Information: leisure information, superfast teletext service, World Wide
Web access.
 Consumer Research: online questionnaires and other forms of direct user
feedback and involvement.
 Programming: interactive guide with timings to broadcast TV
programmes.

Marcus Penny believed that they understood the essential differences between
content and services "more clearly than anyone else in the world." His solution to the
content problem was to stimulate the creation of services. From previous interactions
with service providers there was the belief that they would understand the services
perspective sufficiently well (as opposed to the technical perspective, which they did
not understand) to latch on to it and be persuaded to invest both time and money.

Making the Trial Autonomous

As previously mentioned, some quarters within Om viewed the Cambridge Trial as
having served its purpose in highlighting Acorn's role within the emerging TV-centric
information market. In many respects it had served a similar purpose to the 'concept
car', or a conceptual architectural plan. It had served as a showcase to the world, but
now it was a drain on valuable development resources. There were now those who felt
that the Cambridge Trial, and the orientation of Om, should remain true to the Om
mission statement - that of creating and developing consumer-orientated network
technologies - but who were not convinced about the network infrastructure which
would deliver this. They felt pragmatic and responsive to the real business opportunities
that the trial had brought, and recognised that valuable development resource should now
be concentrated on non-cable/ATM applications.424

However, there were those who, along with Marcus Penny, remained convinced that
there was more mileage in the cable/ATM infrastructure, and who wished to follow a
more content-orientated approach to financing development. Further, there were also
those who were looking ahead and could see that terrestrial digital television was a
further opportunity on the horizon, and who felt that development effort should pre-empt
this. On top of these problems of a more strategic nature were more immediate
problems within the trial itself, most of which centred on the provision of content.

However, at this time the trial was basically content-starved – the trialists were
simply not using it, as there was nothing to use. There was a desperate lack of content,
as well as a lack of turning over (i.e. updating) what little was available. While
this was due largely to copyright problems, it could also be attributed to content
development problems. Om orientated itself to be an STB manufacturer; after all,
Acorn's competencies were considered to lie first and foremost in the field of computer
hardware development. However, it was known early on that it was essential to
generate content, for no other reason than the fact that the STB was a medium. In order
for the users to operate it, and to showcase the technology at trade shows, it had to
carry attractive content material. Users or potential clients could hardly judge the box
on its own merits, or produce any sort of meaningful feedback, unless it carried
content.

The accumulation of content problems, and the pressures within the company to kill
the project, resulted in those responsible for the Trial team developing their own
business plan. What emerged was a strategy that had been embryonic for some time
and whose underlying assumption was that [potential] content and service providers
would pay to 'learn by doing' or rather, 'learn by participation'. The trial would
become the learning environment of the service nursery. Economically, the
Cambridge Trial was to become financially autonomous, the 'children' paying an

424
Making the generic STB suitable for specific purposes, whether for CD-ROM use or for modem
use, presented its own unique development problems. These required time and resources which had
to be redirected from trial development.
entry fee which would keep it afloat, financing the system's development and
architectures. In practice the service nursery comprised a number of working
groups and a management board (see Fig. 6.2 below).

[Fig. 6.2 here. The diagram shows a central Service Nursery Management Board surrounded by
four working groups – on System Architecture, on Copyright and Legislation, on User Interface,
and on User/Marketing Research – with Secondary Service Providers (SSPs) supplying services
(e.g. interface design) to the PSPs.]

Fig. 6.2 The service nursery: the PSPs are on the outer ring, with the Management Board
central, surrounded by the working groups

My main purpose for being involved with the trial was to conduct usability studies
and more general user research. As such I was invited to follow developments in the
user research/marketing group. This group would be responsible for:
". . . looking at the plan for sending questionnaires and conducting user
interviews . . . The intention would be for this working group to define a
number of questionnaires and points at which users are contacted and to
ensure that this is coordinated so as not to overload them with too many
interviews and questionnaires." (Cambridge Trial project manager)

The working groups were to be facilitated and chaired by Om, and each would
comprise those staff of the PSPs who had an interest in, or whose jobs were
concerned with, market and consumer research. Also, a number of secondary service
providers (SSPs) would act to provide PSPs with necessary expertise or skills that
they might be lacking – multimedia authoring, data and image libraries, etc.

Chapter discussion

What I have tried to outline in this chapter are the organisational elements that shaped
the trial, and led to the development of the Service Nursery in the Cambridge Trial - a
trial which stands as testament to the complexities of technical and social
management in the new media age. To a large extent I have concentrated on the
individual view of Marcus Penny, the 'constituency builder' and champion for the
development of content and services, who developed the concept of the service
nursery.

The promise of a trial means that new cutting-edge technologies are to be tried out in
situ - in the naturalised settings of their intended physical, social and cognitive
locations. For that to happen, it is clear that, as far as possible, the technology must
maintain a high degree of its projected use characteristics if user-consumers are to be
relied upon to generate useful and meaningful data. Perhaps these are the same problems
Donald Norman identified at Apple with respect to effecting HCI strategies in
product design: "10% of the problem involves the science and engineering
knowledge of HCI, 90% reflects the social and managerial side. The real problem is
one of attitude, which then gets reflected into organisational practice." 425

Also, Erlandson et al. (1993: p.16) see that naturalistic modes of inquiry are very
much dependent upon context; they are "bound together by a complex web of unique
interrelationships that results in the mutual simultaneous shaping [process]." Such

425
Donald Norman ‘Where HCI Design Fails: The hard problems are social and political, not
technical’ BAYCHI February, 1993.
processes represent well the constituency building processes in trials. The results of
the trial, the learnings arising from the trial, were understood (at least by Om) to be
fundamentally explorative and experimental types of process:
"The original partners knew that they could only predict a certain proportion
of the outcomes of their involvement. But all knew that they were committing
to an experiment and as such nobody could state with 100% confidence what
the results would be." (Om promotional literature)

However, any notion of the users being important 'co-researchers' or 'joint explorers'
of the potential of this technology was lost under the weight and complexity, the
sheer attention demand, of the organisational elements. It is clear from the discussion
with Marcus Penny that, although he championed the services and content aspect of
the project, there seemed to be a kind of engineering/technical feel and logic applied to
the social construction of the service nursery. It was very much a 'suck it and see'
mentality, more commonly associated with a scientific or engineering type of problem
solving and logic.

Lust for results, desperation to get value for investment, coming to grips with the
contrivances and the unique problems of conducting trial research in the new media
age, as well as a lack of real investment in trial organisation, could all be cited as
reasons why there was no proper management of the user research project.

Understanding of these problems has motivated recent work at Edinburgh (Molina
and Nicoll, 1996). Contexts of use, particularly the demo STB's ability to represent
and symbolise the potentials of the system, convinced not only members of the
Acorn board that this was a good idea worth funding, but inspired members of the
design team itself. Marcus Penny had his visions of the domestic networked future
legitimised by the symbolic prowess of the demo box. Many others, at trade shows or
on visits to Om's premises, were also charmed by what it projected. The demo box as
symbol represented a considerable constituency-building force.

Trials in this sense are distinctive constituency-building processes which shape the
way technology develops, how users are enrolled and what is expected from them, as
well as how technology and financial partners are found and eventually impact
development. Given the problems that may arise in the management and evaluation
of trials, such an approach may function pragmatically as a generalised approach for
the conduct of user research programmes (scaled to the given circumstances), aiding
product and service development. Alternatively, it may guide trials, integrating user
research as part of the more generalised and macro-level strategies and within wider
and more external shaping forces.

From simply ensuring that specifications were competitive with other products, the
Cambridge Trial came under an increasingly complex array of forces which shaped
the technology as well as the way in which the trial was managed. While this was
mainly due to the needs generated by potential and actual clients, it also included
feedback from trade shows and demonstrations of the technology to potential
clients. There was also informal feedback from the wide media coverage which the
trial had drawn. These created an ever-increasing set of demands, both upon the
technical specifications of the system and upon the management contingencies
which administered the trial. The stakeholding theories of the service nursery
concept, which was designed to make the Cambridge Trial economically autonomous
from Om, eventually produced a number of tensions impacting the governance of the
trial, as well as specific development activities including implementing the user
research (more fully illustrated in the next chapter).

The Cambridge Trial was an ambitious project fuelled by a number of different
influences. The suggestion was that it was staged as a condition for funding money
from the city. However, it came to be widely understood as a PR opportunity, as well
as a way by which Om could profile itself as testing the waters of the suspected
consumer market for i-Tv. As a result of the continuing development and
evaluation of the technologies involved, and the learnings arising from the
development of the services, new opportunities arose for sales totally independent of
the trial. It gave Acorn credibility in the arena of advanced media products intended
for the consumer market. Some of their more recent developments came on the back
of technological development and business synergies which arose as a result of
conducting the Cambridge Trial.
The fact that the Cambridge i-Tv Trial continued to be a test-bed for new and
developing technologies meant that the participants (both technology and service)
were collectively able to lay claim to a number of 'world firsts'. This was supposedly
in terms of the technological achievements and the services that had been
successfully developed, delivered and evaluated over its infrastructure. However, it
was not all roses: there were vast gaps in the social and technical constituencies which
opened as the trial and the service nursery progressed. This was far from the claim
that, as new technology was developed, "the Trial moved closer to its original and
central objective: the full testing and evaluation of the technical and commercial
viability of supplying interactive services across the information superhighway
directly into homes, schools and businesses."

One of the major problems that had serious implications for content development
was Om's decision to use a proprietary software system to create content and pages.
Terry York, the interface designer, felt that this was a hangover from that part of
Acorn culture which had a predilection for individuality and for differentiating itself
from other companies426. However, this meant investment from companies which
expressed an interest and intention to create content - i.e. the PSPs - both in terms of
acquiring a Risc PC, and in the training necessary to operate the system and
accompanying software. It was mainly due to this that BMP DBB Needham's test
advertisement for Walker's Crisps, created with Macromedia Director software (a de
facto industry-wide multimedia authoring tool), was unable to be broadcast through
the system.427

The eventual product, more identifiable as a domestic NC, provided inferior
access to the internet compared with a PC/modem set-up. Moreover, it did not meet the vision
426
It was felt that it was only in recent years that they had tried to make technology and
software more industry-standard. The Risc PC, for instance, was the first Acorn machine to have a
standard PC keyboard. This had immediate benefits in terms of cutting manufacturing and production
costs. At the time of Om's development of the STB, Acorn was only just in the process of supporting
JPEGs as standard in the OS. There was a cultural reaction to this at Om, which held that everything
they created would be industry standard; they were not going to do anything proprietary, as it was
realised it would cause problems.
427
Om and Olivetti in fact signed a joint development and marketing agreement with Macromedia in
Jan 1995. However, in mid-1995 they still had standards problems for the production of services.
which was sold to trial participants some time before - that of full-blown video-on-
demand, home shopping and banking (as will be illustrated in chapter 10, which
addresses the user impressions of the trial). The Cambridge i-Tv Trial was
nevertheless an example of a vision which saw:
" . . . a partnership not only between the companies who jointly run and
administer the operational side, but between them and the companies who
have decided to develop services to run over the Cambridge network, and
more importantly between the service companies themselves. By pooling
their resources, working methods and knowledge, this powerful union
generates many benefits. Equally important are the Trial's users, whose active
co-operation is a vital resource for research and feedback." (Om promotional
material)

In the light of my own experiences (which are more fully explored in the next
chapter) within the social mechanism of the service nursery, such a view strikes me
as utopian, and perhaps overly optimistic. The service nursery was a concept which
relied on the development of self-organising groups, which would somehow gel out
of their intrinsic interest in learning.

While some aspects of the larger constituency may have reaped benefits, some of the
working groups obviously failed in their missions to deliver material, research, or
content. Many of the service nursery's constituents were never represented on the
trial. Those that were only offered dramatically reduced or demonstration services,
which provided little to feed back upon. Also, recruitment to the various phases of the
trial was severely plagued by poor response. When the trial went 'public' in phase
two, the target number was never met, nor was the blend of participants what one
could call a 'representative' sample, nor even, for some service nursery members, a
'relative' sample428.

Conclusion

This chapter has demonstrated:


 That technology trials of new media are indeed prime examples of sociotechnical
constituencies - they comprise complex and sophisticated system technological
elements (state-of-the-art network, video and storage technologies), and equally
428
I refer in this case particularly to NatWest. They had very few customers among the trialists.
sophisticated social and business elements, which included a strong emphasis on
the selling of visions made incarnate (i.e. demos), alliance building (converging
under a promise of continual learning, shared interests and concurrent
development), agility (i.e. fast response to client needs and market opportunities),
and the development of business practice (constantly evolving strategies and
orientations).

 The tension (and need for symbiosis) which exists in the development of media
technologies and media content. Viewed as a system technology, content is a
major component (as for Edison it was electricity, for Bell it was business
subscribers and for Baird and the BBC it was programmes and schedules).

 How Om were forced to consider the content and services position almost from
the start of the project, and something of how this was viewed as foreign to their
business, driving the trial to become financially autonomous, and largely creating
the need for the service nursery concept. Another major factor driving the
need for autonomous content provision was the circular problem of getting media
companies to commit content to a trial - primarily due to economies of scale (i.e.
it is not economically viable to release a brand new film to 100 people) - and how
this drives the need for partnerships.

 The sort of relationships which are happening within the sector - i.e. an
unpacking of the so-called converging alliances of service and content providers.
Institutions tend to have varying motivations and run under their own inertia.

 The impact of the trial and the concept of the technology as a catalyst for
innovations, and how these were recognised and viewed as part of a value-added
package for potential and actual partners. Other parts included access to user
data and an opportunity to 'learn by participation' (i.e. Om supplies the
technology, PSPs learn how to exploit it, as well as developing ways of working
together).

 How the new media sector in some cases leads, and in others acts as a prime
example of how the nature of business is changing.

 The partnership of Acorn with Oracle to produce network computers (NCs).


"The original partners knew that they could only predict a certain proportion
of the outcomes of their involvement. But all knew that they were committing
to an experiment and as such nobody could state with 100% confidence what
the results would be."429

The Cambridge i-Tv Trial evolved beyond the rather open-ended initial objectives set
by the partners. Indeed, its success was attributed largely to the fact that the partners
developed their own objectives and aspirations alongside the common goals of the
429
Om web site.
[technology] infrastructure consortium. It was suggested that all were committed to a
long term strategic future in this market. Much has been written about interactive TV
in the meantime, and everyone would agree that much has been learned. What was
realised was that the interactive TV 'big bang' was in fact still some way off - though
everyone left the trial 'more positive than ever that it is going to happen' (Om
promotional material).
Chapter 7 – Access
"If every product is really a service, then every contact or communication
with customers is also the product." (Kantor, 1992: p.10)

"The world is not what I think but what I live through." (Merleu-Ponty,
1962: p.xvii)

No man's knowledge can go beyond his experience. - John Locke

In a sense, the mechanical intelligence provided by computers is the
quintessential phenomenon of capitalism. To replace human judgment with
mechanical judgment - to record and codify the logic by which the rational,
profit-maximizing decisions are made - manifests the process that
distinguishes capitalism: the rationalization and mechanization of
productive processes in the pursuit of profit ... The modern world has
reached a point where industrialization is being pointed squarely at the
human intellect. (Kennedy, 1989: p. 6)430

430
Noah Kennedy, The Industrialization of Intelligence: Mind and Machine in
the Modern Age (London: Unwin Hyman, 1989), p. 6.
Introduction

Earlier, in chapter 4, I outlined that the 1990s witnessed the adoption of ethnographic
styles of research across a number of distinct fields and disciplines. I have also
indicated that the present study moved from one which had a preoccupation with
users (i.e. their perceptions of the system) to one which focussed more upon the
organisational situating of the consumer-user research projects within consortium
organisations. Any attempt to perform worthwhile user research and evaluation of
the system relies upon the availability of a robust delivery mechanism, as well as a
constantly refreshed and substantial series of enticing content material. The prototype
must be a good working representation of what it is that will be sold in the public
domain. If it is not, as was the case with both the Om stand-alone demonstration
STB and the Cambridge trial itself, the relevance and veracity of any user
feedback is limited. For instance, the system's nearest comparisons in users' minds were
of course broadcast television, video recorders and video games. Broadcast television
is the epitome of refresh – it provides a constant stream of new material. Lack of
refresh is akin to having a library which is only ever stocked with a few books,
themselves attractive to only a select band of people, and with many of the pages
missing. This was similar to the performance of the Cambridge system. Due to
weaknesses in the system's performance and a lack of content, the trialists simply lost
interest in using the system.

The reason for this predicament, I cited earlier, arose chiefly from the
complexities of governance and organisation. This chapter aims to illustrate
something of this complexity through a first-hand account of my experience of 'getting
involved' with the Cambridge trial. If one's focus, as it is within the present research, is
upon the notion of collapsing 'cultures of production' and 'cultures of use', it remains
equally relevant whether one studies this phenomenon from the organisational or the
user perspective. This was indeed the case here, where both perspectives were
sampled and considered. But it came to be realised at the analysis stage of the
research that the process of constructing and approaching the user research from the
organisational perspective provided a much richer and, I feel, more pertinent and
relevant account of the process, from both an academic and an industry perspective.
It was, according to McMaster's (1997) account of chaos theory, "the area of most
information."

Minimising the constraints of preconceptions

A challenge in ethnographic research, following phenomenological approaches, is
either to minimise the constraints of preconceptions, or at least to acknowledge one's
preconceptions, so that one allows another's experience to be communicated in a
relatively undistorted fashion. This is the stance of the social sciences, and in particular
of interpretivist social science wishing to become recognised as rigorous. Moores (1993)
draws attention to the notion of reflexivity (or lack of it) in ethnography, citing the
work of James Clifford (1986):
"Clifford talks of the necessity for a 'specification of discourses' in
ethnography. Anthropologists should be willing, he says, to specify who is
writing, about whom, from what relative position, and in what material
circumstances." (Moores, 1993: p.64)

Moores (p.65) also notes that: "As a stranger in the living room, his [Morley's]
presence would surely have been a significant factor in how the conversations were
organized and what people were prepared to tell him." 431 Hence, putting a study into
any kind of theoretical framework is best postponed until after the researcher has
reflected on his or her own guiding assumptions and metaphors, in order to: a) obtain
better insight into what he or she is bringing to the research setting and participants,
and b) obtain an understanding of how that would influence the research
experience itself.

As previously explained, the original intention in this study was to conduct usability
testing of Om's trial technology, the results of which would serve to feed back to the
company relevant data that could be used in its product development process.

431
An example of how 'unnaturalistic' it can be to have a researcher in the private space of the home is
graphically depicted by Walkerdine (1990) in her strongly self-reflexive ethnography. Interviewing
teenagers at home, she was announced by one girl's father with: "Joanne, here's your psychiatrist!" This
highlights something of the preconceptions people have about being interviewed and researched, and
how this can produce a particular mind-set which may interfere with the aims of the research. Another
example is given by Ellen Seiter (in Seiter et al., 1989), who expressed considerable frustration when
things did not go her way when conducting an interview.
This would also serve as a basis for this book, and the 'grounding' for the theoretical
aspect of contextual usability. The intention was to explore the symmetry or
asymmetry between design and use, production and consumption. It was envisaged
that I would track the development of the product and its CAFFs, and explore how
they would be interpreted [or re-interpreted] by consumer-users.

This original intention was based largely on good faith in press reports, early
communications with the firm, and my own preconceptions of the issues which
would arise as the trial unfolded. As time progressed, the original intention was
thwarted by a series of obstacles, due mainly at first to hitches in the roll-out of the
first and second phases of the trial (and commercial sensitivities regarding this).

Marcus Penny, Om's senior manager in charge of services, felt that there was a "very
important public interest perspective from a very fundamental aspect . . . if [i-Tv] is
not seen to be broadly in the public interest we won't be allowed to do it." This
perhaps hinted at the way in which my study was perceived. However, as the social
relations between outside agencies and the Cambridge Trial became more complex,
so my role came under new pressures, partly due to politics within the service
nursery working group on user research. As will be illustrated by this 'process'
chapter, my position as a researcher, as well as the research itself, was compromised
by the real-time, real-life and 'chaotic' dynamics of a highly pressurised technology
and marketing development process.

The Cambridge Trial bred a complex development environment where casual and
unexpected interactions with outside agencies appeared to create quite radical
changes in strategy and orientation. Conditional access to the trialists became a
matter of further negotiation as the trial evolved. Nevertheless, the research, which
was conducted over a period of years, developed an interesting secondary frame of
analysis - logging my movements or my 'navigation' towards an ultimate goal of
conducting the user research.

This 'navigation' presented unique insights into the governance and social structures
of the trial and, as a product of opportunity, guided the ethnography, along with the
theoretical contributions of sociotechnical constituencies and contextual usability.
This was a study which could not have been contrived. It could not have arisen in the
normal course of research funding and implementation, since it could not have been
planned.

I have already drawn attention to the macro-level influences - standards, protocols,
public feeling etc. - in which the trial, as a social and cultural phenomenon, was
situated. Both theories - sociotechnical constituencies and contextual usability -
sensitised my awareness of contextual and environmental issues regarding where the
project was going and why. They influenced the types and kinds of questions asked.
For instance, many individual aspects of the trial constantly evolved, and evolved
concurrently. One of the few constants that held up over the trial seemed to be the
perceptions of the users - whether they were consortium members or general public,
they remained largely unimpressed by the system's actual performance. However,
uncovering their true feelings and understandings regarding the technology and
content always appeared low in priority on Om's agenda. This was most notable
against the background of PR and publicity opportunities that the trial presented.

In the first chapter, I drew attention to the way in which 'visions' can mix with
truths and non-truths in terms of actual or real development. Sometimes this can go
as far as firms reflexively 'taking in their own propaganda.' This questions the
veracity (or even the choice) of academic research on commercially sensitive subjects.
Many studies of firms and technology development report as if total access were a
given. Very few discuss at length the difficulties experienced regarding access, or the
poor reporting of actual events by company personnel. One of the problems of
relying on what 'one is told' is that the real dynamics of the processes remain
unknown, and one simply cannot, in the early stages, develop contingencies for pre-
empting questions or even proper approaches432. It is only through involvement at
some depth, using 'lived', 'action' or ethnographic styles of approach, that real

432
This, of course, is not a situation which is peculiar to company research and commercially sensitive
subjects. Any self-report, particularly that which relies on memory of events, is open to distortion due
to re-interpretation and other influences such as social acceptability.
processes may be understood. It was in such a style that this research was conducted.

There was an explicit social construction of the user research within the service
nursery. A working group was dedicated solely to its development. However, Om
delegated most matters of governance and management to the group itself. If indeed
the service nursery were to be a learning environment, it is difficult to imagine it
becoming simultaneously autonomous in its governance. The relations between the
working groups and the management board seemed unclear; to my knowledge no
minutes were ever taken in the user research group. Expanding on its own metaphor,
it is akin to having a crèche run by its children. As Marcus Penny had it, the service
nursery existed ". . . because there is a common interest that it should exist." As this
chapter will explore, such a 'common interest' was challenged by the lack of a clear
approach to how the user research should be conducted. Some commonality of vision
had been reached in the technology. The technical needs of the system - i.e. getting
the system components to work in concert - were problems of a reasonably tangible
nature. Engineers have the common vocabulary of specifications, requirements and
standards through which they can communicate and reach consensus - within and
across companies. However, the user research and, as detailed in the previous
chapters, other matters such as the development of content were perpetually
hindered and confused by technical, organisational or recruitment complexities.

This chapter includes original e-mail material, and although I have anonymised the
various people involved, where e-mails are cited matters of spelling and format
remain as in the original document.

First contact

On the 3rd of July 1994 an article appeared in the Sunday Times 'Innovation and
Technology' page. The headline read "Acorn grows interactive TV".433 I approached a
colleague who had been carrying out research on Acorn and was provided with a
contact name and number. First contact with Om was made in early August 1994,
about a month before the publicised launch of the trial. A positive reply came from
433
"Acorn grows interactive TV", Innovation and Technology, The Sunday Times, 3rd July 1994, p.3/10.
An advert for a 60MHz Pentium appeared on the same page. This serves as an index of the state of PC
development at the time, and of what was on the market.
Om, via Gary Nelson, who was the technical project manager at Om and who was to
serve as my link person with the company. He expressed interest and asked for a
proposal. In this, I detailed two intentions. The first was to conduct user research on
participants in the trial - to ascertain something of their interpretations of the
technology. I would also test the system for usability.

I had already become aware through the literature review that usability testing as a
practice was expanding its horizons to encompass a wider set of prerogatives
regarding the relationship of the user to the act of using (as detailed in chapter 3). I
had already done some thinking on interpolating this expanded notion of usability
with work drawn from cultural and media studies. Contexts of use were deemed
particularly important in cases of use of domestic media (Silverstone, Morley and
Hirsch, 1992). I had also become aware that producers of new media products were
apparently at a loss for methods of predicting the success and failure of their products434.
I had a basic working concept of CU at this stage and sent an initial proposal to
Gary Nelson on the 19th of August 1994 (two days after initial contact).

However, it appeared that there was little time in which to prepare my research plan,
as the official launch at this time was ten days off. I was concerned to implement a
research programme before trial participants had been exposed to the system and its
capabilities. Keeping in line with the spirit of evaluation research, the original
proposal had two main thrusts:
 A study of the forces shaping the innovation process itself, from the company
perspective.

434
For instance, Scott McDonald, Director of Research at Time Warner Entertainment, discussed the
impact of the merging of computers with communications channels and information and entertainment
providers. In a talk at CHI '95 entitled 'Learning from Diversity: Interactive TV, Computers and the
Frontiers of the Cognitive Sciences,' he spoke of how the knowledge of computer human interaction
derived from conventional computer systems and their users is not necessarily applicable to the new,
larger combinations of computers, communications and entertainment. He sees users of new systems,
such as i-Tv, having a considerably different 'mindset' from traditional computer users. He sees that
the very broad choice of material available will have to be presented in new ways to them. Also,
Logan (1994) points to the fact that consumer- and entertainment- oriented products (including i-Tv)
demand an expanded definition of usability, while recognising the fact that this expanded definition is
'in-the making':

"At TCE [Thomson Consumer Electronics] we are committed to going beyond [traditional
usability], although what beyond is, is not completely clear." (p.61)
 A series of quantitative and qualitative interviews with trial participants.

My initial request was to interview Gary Nelson in order to set some context for the
project, and then conduct a preliminary guided interview with participants before the
installation of the sets. This was to be subsequently followed up with semi-structured
interviews after two to three weeks (locating initial impressions and problems in
use), and a final interview at the end of the trial, or after six months (after the
technology 'domesticates').

I included a draft transcript of the sort of questionnaire that I had developed, which
was based on existing media use and consumption as well as usage patterns, and I
envisaged that this would provide initial data on which to base comparisons in
subsequent follow-up interviews. I welcomed feedback or questions concerning the
structure of the proposed project in general, or the questionnaire in particular. I also
suggested that he keep me informed of any additions or points of emphasis that he or
the company would consider of value for their own purposes.

Gary Nelson replied the following day with a confirmation that he had received
the questionnaire and proposal, and had received the OK on the project from Dave
Swallow of Online Media, provided it did not cost them too much time. He was to go
on holiday from the 26th of August till the 2nd of September, when he would provide
feedback on the questionnaire schedule. I would use the time remaining before the
launch to conduct some pilot testing of questionnaire schedules. It was also clear that
the limited time meant that any preliminary study would have to be on a small and
manageable sample (<10).

Second contact

My next contact with Om was on the 6th of September, to prompt some arrangement
for me to visit their HQ, to which I received a reply on the 9th, stating that Gary
Nelson would be happy for me to visit. They had chosen the phase 1 subscribers, and
he asked if it would be a good idea to start the initial interviews with these. They
were still targeting the end of September to begin the trial, but it was 'looking tight'.
They were also in the process of initiating an education 'sub-project' to the trial, with
the intention in phase 2 (planned for early '95) of including a number of local
primary and secondary schools.

During this time I had pilot-tested my questionnaire, and was now developing a feel
for how people generally conceived of television and media technologies. I was
experimenting with a number of different approaches regarding questionnaire
implementation - whether it was more effective presented verbally; whether it was
better presented to each household member individually, at different times; whether
it was important to note any transactions between members, etc. The pilot sample of
15 households was recruited through informal networks. They came from a wide
range of backgrounds and demographics. This was purposeful, as I (erroneously)
assumed that the trial participants were to be an eclectic blend of average television
consumers, and I wished to emulate something of this in the pilot study. I also passed
the questionnaire round a number of the academic staff at Edinburgh in order to elicit
further advice and feedback.

Outcome of pilots

The pilot study and feedback from academic staff drove several iterations on the
questionnaire content (both structure and individual items). A number of points
emerged regarding the elicitation of people's knowledge of everyday activities. What some respondents had answered in the questionnaire contradicted information which came out in the informal chat that followed its completion. This prompted me to ask for their reflections on the questionnaire process itself. There seemed to be some consensus among respondents regarding the artificiality of the process - sitting filling in a relatively large questionnaire survey while I sat observing, and other activities (such as kids playing) interrupting the otherwise silent period of completion. Further, some respondents had extremely strong points of view regarding the use of television, and one (a BBC television producer) had apparent knowledge of the area of i-Tv; he was antagonistic towards any claims regarding the implementation of a trial. On most of these occasions the discursive aspect of the research - talking - was very rich.
I noted a tendency in those interviewed to offer contradictory accounts of their media
practices. In one case a mother of a young child was adamant that she did not allow
excessive viewing of the television, preferring instead to do 'constructive' activities
with her young daughter. Later in the conversation she related occurrences in her
everyday life to events in various soap operas, and remarked on her daughter's
references to various television commercials. These were quite obvious instances where self-report was influenced by social acceptability - in this case, a feeling that viewing television was somehow detrimental to the image of a family or parent who pursues more 'constructive', learning activities during their leisure time. In many of
the more affluent 'middle class' households which were interviewed it was felt that
television viewing was a 'wasteful' experience, absorbing time which could be spent
on more creative or healthy activities.

In those families which completed the questionnaire it was further noticed that for some questions it seemed natural for people to seek clarification or reinforcement when answering. In particular, questions concerning amounts, quantities and locations of media technologies prompted discussion or affirmation, as did questions addressing each other's consumption or use behaviour.

The pilot sessions raised several initial questions:

The process
I became uncertain that a questionnaire survey was the best way in which to proceed with ascertaining current media use and consumption patterns. Remote administration of the questionnaires also seemed a problem, as people might compare and adjust answers.

The objective and outcome


What was the veracity of the answers drawn from the questionnaire? There seemed to be a problem, as different household members offered different explanations when discussing the same phenomena.

The best way to approach

By appearing in person so much more could be learned regarding participants' lives and lifestyles, as well as their outlook and what was important to them. Remote administration relied too much on the research instrument.
It was obvious, with limited real knowledge of the trial at this stage, that there would
have to be a 'learning by doing' aspect to the research. My feeling at the time
however was to minimise the possibility of contamination - i.e. I wished to approach
trial participants before they were exposed to either the technology or any
paraphernalia/media exposure that would help create preconceptions regarding the
capabilities and potentialities of the system. My attitude was one of learning the finer
points of the technological functionality of the system, as well as developing
knowledge of how they had selected their phase 1 participants:
"Obviously, it is crucial to the investigation to have some [objective?] idea of
the transactions that have occurred between yourselves and the households,
which may 'colour' expectations about the qualities and operational
parameters of the technology. When precisely the trial actually begins is no
problem from this side of the fence (more time the better!), apart from the
point of 'colouring' (i.e., if there is a massive pro or anti-interactive campaign
by the media, viewed, of course, by participants)" e-mail to Om Fri. Sep 09
1994

My attitude therefore was experimental in style, with people's existing lifestyles, media use and consumption treated as an 'independent variable', and the i-Tv technology viewed as a 'dependent variable'. Confounding variables to look out for would be extraneous influences on the participants' perceptions or anticipations of the technology (such as press reports, the comments of the installation engineer, etc.).

At this time I was also considering the logistics of the project. I was of the opinion
that the first phase of my project should involve the administration of my
questionnaire schedule at a time which would be convenient for the participants
(those aged 16+). Two things seemed to be involved in performing this: one, making sure that there was a convenient time to come round for an hour or so, and two, being able to do two households a night (the presumption being that that was the best time to catch people at home). I was further interested in the characteristics of trial participants in terms of demographics, socio-economics etc. I wished to make a preliminary arrangement to go to Cambridge on the 21st Sept., and to stay till about Oct. 1st or 2nd (leaving enough time to interview the families).
Three days later, I made a further request for some indication of the trial participants; in particular, what was important (above and beyond demographic information) was household composition. This would affect the logistics of the research. Also important in this respect was their location in proximity to the centre of Cambridge, and relative to one another. With respect to this I requested any maps which would show participant distribution. I further requested details of the selection procedure (i.e. what they had been told about the product and the questioning procedures). I was also interested in what the procedure would be regarding the deployment of the systems (i.e. by an engineer, who would show them what to do etc.).

Lastly, I enquired regarding the anticipated date and time that the units would be distributed. As it stood, the battery of questions would take up to an hour and a half. This might take some logistical working-out concerning the arrangement of a suitable time when all adult family members could be present. I would like to take them through the
questionnaire together. The making of appointments to do this would be the best
idea. Could Om handle this, or should it be done from Edinburgh? Also at this time, I
was thinking of video-taping the arrival, and initial explorations of the family with
the unit. Could this be negotiated with the installer and those households concerned?

Breakdown of initial assumptions

On receiving the responses to the recruitment form one of my main initial assumptions was overturned. This was the discovery that participants on the trial were not 'average' consumers - i.e. anything like a random sample of people drawn from the population of Cambridge. Most were employees of Om, and indeed the technical designers of the system. This constituted a crisis regarding the planned research procedure. It showed me that I had developed considerable pre-conceptions of the trial, and had become rather rigid in my approach. The approach had been designed with 'naive' users in mind - i.e. persons with little or no exposure to the features and functionalities of the system.435 The group that was picked was indeed far from naive.
435
Much of the inspiration of the original line of thinking came from usability engineering, which
stressed stringency on recruiting representative samples of users (for instance Dumas and Redish,
1994).
Indeed, they were extremely familiar with the product. This sample would also
exhibit other relevant influences which would distinctly bear on evaluations and
interpretations of the system.436

Phase 1, even though recruiting from the Om 'ranks', only ever managed to recruit
eight trial participants. I sent out a request to those households that were on e-mail.
Replies were not immediately forthcoming. I was going to Cambridge the following
week to conduct my initial interview with Gary Nelson who was the Project Manager
of the Cambridge Trial. It was becoming clear that there had been considerable
technical problems involved in implementing their trial. Not all the homes were
connected. The launch of the trial did not entail anything like the simultaneous
connection and transmission of interactive services to 10 houses. The 8 homes that
eventually comprised Phase 1, would be connected over a period of several months.

This extended roll-out of the trial was a further blow to the rigidity of my approach. I
conducted the in-company interview but had to leave Cambridge without
approaching any of the trial participants. However, I understood fully at this time that
the protracted roll-out was due to technical problems of the system. Phase 1 was always intended to be predominantly a technical trial - i.e. a trial to basically check the system's potential to transmit anything at all. It was pointless at this stage, from Om's perspective, to explore things such as participants' existing media habits, or to track these against preconceptions of the system and the way in which the technology was deployed. This phase did, however, illustrate something of the way trials are structured. It was the genesis of the study adopting a wider perspective towards the trial, as a study of process rather than simply of the technology and users. These developments also suggested something of a dichotomy opening between an 'anticipated', 'planned' trial and one which was more 'actual' and 'unfolding'.

The anticipated Trial and the actual Trial

436
For instance, being computer enthusiasts. This manifested in the language used in the recruitment form, where one respondent described watching not "no" television a week but "zero"; he spent his entire spare time using computers.

As mentioned, the roll-out to a full market version of the Cambridge Trial technology and content was to occur over a series of Phases. Each Phase was to be indicative of how the system as a whole was developing - the robustness, reliability and functional characteristics of the technology, as well as the sophistication and scope of the content. The anticipated Phases comprised the following elements and features:

 Phase one. This was to be populated mainly with Om designers. Very much in
the technology Trial end of the spectrum, it practically represented a period of
intense technological development and 'tweaking' of the system components. Om
designers as trialists on this Phase meant that rapid development was possible.

 Phase two. The population here was a more heterogeneous array of personnel
drawn from the members of the CITVIC consortium. Here, the delivery system
was expected to be reasonably stabilised, and attention would be concentrated
upon content provision, and the more 'experiential' aspects of the system.

 Phase three. Was the first instance of the Trial going public. The major issues to
be addressed here were modes of payment and packaging of various service
options, style and scope of content, as well as how often programmes would have
to be refreshed. Success in this Phase would lend credence to the mass market
potential of the Cambridge system.

The Phases were to be indicative of the scale of participation in the Trial, with each successive Phase enrolling ever larger and more heterogeneous populations. Phase one
(starting Sept. 1994), for instance, drew most participants from members of the Om
design team and had 10 members. Phase two (starting March 1995), was drawn
mainly from the CITVIC partners in the Trial, and was intended to draw 100
members. Phase three (starting at the end of 1995), was the stage where the Trial
would first go public, drawing the wider public in Cambridge (at least 250
households). As such, Phase three would serve as the testing ground upon which
understanding of 'real' situations of use would be gained. Trialists would serve as surrogate members of the general public, and as such, Phase 3 would serve as the 'launch pad' for the mass market, where system, content, and packaging would be finally tweaked (see figure below).

[Figure: plotted against Time, the Trial moves from 'Start of the trial' (low level of participation; the user; a localised and homogenous population, i.e. designers and company personnel; technology not stable; content and services experimental, basic, and based on anticipated need) towards 'Full commercial roll-out' (large scale participation; the consumer; a public and heterogeneous population, i.e. indicative of the mass market and 'real' customers; technology and system robust; content and services developed, sophisticated, and based on demand and requirement).]

Fig. 7.1 Chart outlining how the robustness of the technology, the 'sophistication' of content, and the heterogeneity of trialists were anticipated to evolve over successive trial phases.

In this model, each successive system and content iteration would entail more targeted development - the technology would become more robust, the content more satisfying and richer, and the market (and new ways of attaining market intelligence) would become more fully established. As the Trial developed, so the
various types of user data would come into focus, dependent on the role of the user
(moving from their role as 'user' to a new role as 'consumer').

However, like many of the more technologically sophisticated trials, such as Time
Warner's Orlando Trial (see appendix 2), the unfolding of the Cambridge Trial
suggested quite a different picture from the anticipated sequence of Phases. The
development of broadband interactive television is not only technically sophisticated
and socially and organisationally complex - it is extremely expensive.

I felt as if the best plan was to consolidate my revised understanding of what was actually going on in Cambridge, and aim to implement the user research at a time which would reap more 'naive' users (Phase 2).
2nd Visit to Om

A month or so later (15th of Oct), after hearing nothing from Om, I received a call.
Acorn had a consultant that they had used for various projects, and on viewing my
communications and plan of user-research, he had recommended that they offer me a
secondment. He was an advocate of QFD, and was aware of the importance of
consumer-user input into the project and recognised the lack of this within the
organisation. He also understood that i-Tv, being a radical innovation, raised
particular issues for design. There were also issues related to the provision of content
material and its evaluation. These represented new kinds of problems for methods
such as QFD. A comprehensive research approach was required which would not
only consider the technology, but the interrelations between technology and media
content. This meeting in particular had considerable influence in shaping my
thinking regarding the research.

I was unable to take up the secondment offer due to Om's failure to provide funding. It was simply not budgeted in the business plan, and I was unable for personal reasons to move to Cambridge. For vetting participants for phase two of the trial I cut the large media research questionnaire down to a one-page version. The sample for this phase was to be chosen from a wider range of employees from the consortium companies (i.e. those within and outwith Acorn/Om), and it was initially considered that it
would be quite large. At this time I was focusing on phase 3 (the public) phase of the
trial which was to be the full market test involving paying members of the public.
Research on phase 2 was then to serve as a pilot for testing phase 3. I requested a stand-alone demo unit to begin usability testing at Edinburgh.

3rd Visit to Om

I visited Om again on Thursday morning, 1 Dec. '94. I considered that I should combine this with some interviews with trial participants from phase 1. I relied on Gary Nelson to check on the feasibility of this.

The meeting was attended by a selection of the Om management as well as their consultant, David Byron. Within the meeting it was very clear just how technically orientated Om's perspective on the trial was. Both the consultant and the product development manager were adamant that the real potential for Om lay in the creation of the STB. They viewed the development of services as necessary, but believed that Cambridge Cable would eventually be responsible for these. However, it
emerged in the meeting that Om was currently experiencing problems recruiting
participants for Phase 2 of the trial. They wished to make sense of the reticence of
employees to participate in the trial. They seemed to dwell on issues such as Om's
trial objectives, and making attempts to distance themselves from providing content.

I presented an outline of my evolving ideas of contextual usability. During the meeting the product development manager offered a good example of how usability and usefulness relate to the notion of usage. He consistently took notes on paper, when he (and all the other Om managers) carried Acorn PDAs.
bought all the peripheral devices for it, and carried it everywhere but had never used
it, preferring to use paper. With respect to the trial the general feeling was that people
should pay for access to the trial, as previous i-Tv trials I had cited (such as the
Warner Communications QUBE trial in Columbus Ohio) had been non-subscribing
experimental marketing and technology trials.

It was obvious that there was no knowledge of the user consciously incorporated into the design, or even into consideration of the marketing potential of the system. It was clear to a certain extent that those implementing the trial had little concept of the image that the technology possessed - even within their own staff.437 It was my own
and the consultant's opinion that research into consumer-users should be started as
soon as possible. Possibly starting with research on consortium members to
understand their reticence in participating in the trial.

437
Apparently people had heard that the service was poor and offered little in the way of entertainment.

I put forward that I could begin by conducting basic usability testing on the stand-alone prototype, and in particular the remote control. I also envisaged a series of focus group evenings, advertised in conjunction with Cambridge Cable, where interested subscribers could come in as a group, and where we could run through some experiences. I suggested that I would be willing to run these, perhaps with one or two company observers, video taping etc. While the meeting illustrated the scope and extent of the problems which were ongoing at this time, it also provided insights into some of the internal tensions of the project, particularly regarding the passage of documents to myself. These included the first draft of the user manual. The product development manager was not pleased that I had received a copy of this.

Request for a stand-alone unit

My request for a stand-alone unit created further problems between various management functions in Om. The problem was mainly regarding copyright of the branding on the stand-alone model. Apparently there had been some problems with one of the potential content providers regarding the presentations: they had withdrawn due to a lack of satisfaction with the presentation of their branding on the stand-alone system, and wished this content removed from the demo. This created a hiatus in their sending out a demo to test.

Early in the new year of 1995, I received word indirectly from Om that there were
apparently about 60 volunteers for phase two of the trial. However, not all of these
were necessarily geographically viable (i.e. in areas suitable for connection). There
were limitations to who could be connected and who could not due to the 'squids' -
the kerbside switches which would only allow a number of homes to be connected
within a given radius. Cambridge Cable were evaluating which of the volunteers
were suitable to be connected.

The volunteers received an acknowledgement, but no questionnaires. I received a communication from Gary Nelson a few days later (16th Jan) which indicated that they were now able to modify a STB for me to use. He also indicated that with respect to recruitment they had now reached around 80 applications. Having said this, they were also likely to be going out to the general public (or at least existing Cambridge Cable customers) to economise on the connection costs. I received the
demo STB on the 23rd Jan. and commenced designing the pilot usability testing.

Phase 1 User Research Meeting

At the end of January 1995 I attended a meeting of the Phase 1 users. Much of the feedback concerned the inability of the system to provide a full movie, and other technical problems, which were noted by the Om design staff in attendance. Many of the major problems encountered so far mirrored issues which were being raised throughout the pilot testing (mainly concerning the remote control: lack of positive switching, redundant functionality, difficulty of use for games control, desire for some on-screen visual cue of operation, and dislike of the double-handed design, to name a few). Most notable were people's general reactions to the system as not being a particularly radical innovation. They seemed to expect more from i-Tv, such as some facility to control characters within programmes.

Pilot Test with Set Top Box: Feb. - March 1995

In February 1995 I began conducting my lab-based pilot study (on a convenience sample of students and staff). Throughout the pilot testing I was tailoring the procedure as issues and avenues arose. I conducted the lab-based tests with around 12 subjects, drawn from the student population of the psychology department.

One major problem that was tackled was the time the entire test took. With the administration of a battery of questionnaires and inventories the test was taking in the region of 3 hours. This was clearly unacceptable for members of the public (who might attend the sessions within their work or leisure time). I finally reduced this to one and a half hours.

I was also at this time beginning to consider larger scale testing of the stand-alone unit with a stratified sample drawn from Edinburgh's population. I was very surprised that there appeared to be no facility within the university for producing a random sample of Edinburgh's population. Initially, I considered the problem of developing an algorithm that would denote a random sample of the general population, the sample being as geographically differentiated as possible. This eventually gave way to a sample based on household constitution and socio-economic factors. The purpose of this was a second leg of usability studies which would be conducted with members of the general public. I did this in collaboration with the data library within the university, and intended to send out one thousand letters requesting participation in the study.
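To make the sampling logic concrete, the following is a minimal sketch in Python of the kind of stratified draw involved. The input file, the field names ('composition', 'ses_group') and the strata themselves are hypothetical illustrations, not the actual classification or data used by the university data library.

import csv
import random

# Hypothetical input: one row per household, with 'composition' and
# 'ses_group' (socio-economic group) columns. All names are illustrative.
STRATA = [
    ("single_adult", "AB"), ("single_adult", "C1C2"), ("single_adult", "DE"),
    ("couple_no_children", "AB"), ("couple_no_children", "C1C2"), ("couple_no_children", "DE"),
    ("family_with_children", "AB"), ("family_with_children", "C1C2"), ("family_with_children", "DE"),
]

def draw_stratified_sample(path, per_stratum, seed=1994):
    """Group households by (composition, ses_group) and draw an equal
    number at random from each stratum."""
    random.seed(seed)
    by_stratum = {key: [] for key in STRATA}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["composition"], row["ses_group"])
            if key in by_stratum:
                by_stratum[key].append(row)
    sample = []
    for key, households in by_stratum.items():
        k = min(per_stratum, len(households))
        sample.extend(random.sample(households, k))
    return sample

# e.g. nine strata at roughly 112 households each gives the
# one thousand or so invitation letters mentioned above:
# selected = draw_stratified_sample("edinburgh_households.csv", per_stratum=112)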

A report based on the pilot sessions was sent to Om around the 23rd of March. It detailed the outcomes of the pilot test, and gave indications of how I wished to proceed. I had heard nothing lately regarding how the phase two trial recruitment was proceeding. This signalled a period of relative silence, and a further crisis in the flow of the work. This was the second time I had planned to implement a research programme which had amounted to very little. However, it transpired that there had been significant changes in Om as well. During this time conditional access to the users had transferred from Om to the consortium responsible for content and services.

4th Visit to Om: Eric Donaldson

On the 16th May I took up an invitation to go down to interview Gary Nelson and a
new project manager dedicated to service provision - Eric Donaldson. He was to
oversee the development of content and services, while Gary Nelson was to manage the technical dimensions of the Trial. It was suggested that Eric Donaldson would be my main point of contact from now on, as he was to be responsible for setting up the user groups of the service nursery. In the meeting he ran through the overall structure of
what had been going on over the last few months.

On the day of the meeting, a senior member of the management board of BMP DBB Needham, an advertising agency which had joined the service nursery, had come to Om's HQ. Founded in 1968, BMP was now the UK's 4th largest advertising agency. An international company, they had been exploring the potentials of new media, both in their UK operations and their overseas offices in the US. The BMP DBB manager expressed his keenness that I be involved, and would pass on my details to his colleagues who were to work on
the i-Tv project.

BMP DDB

In early May I received e-mail from Gregg Rymes of BMP. He explained his particular interest in the trial as gaining an understanding of how - or if - interactive television could be used as a communication medium to meet clients' marketing needs (in the way that, for example, they currently used television or newspaper advertising to meet those needs).

As an account planner, he described his 'bread and butter' work as developing advertising which applied learning from consumer research. This included the tasks
of identifying the role for advertising within a given project, the appropriate message
and ultimately evaluation of consumer response. He indicated that such work is
necessarily very task-focused: specific clients have specific projects which in turn
require specific research. He envisaged that the same would apply in the case of the
Cambridge Trial: they would need to investigate the consumer reaction to each of their clients' advertising in a focused way, so as to ensure that they learned what made a good interactive TV ad, and what to avoid.

He also had a wider agenda: to understand what place this new medium could have in people's lives, and how they would use it. He was of the opinion that BMP knew quite a lot about how people watch and 'use' conventional TV, but were obviously in the dark with i-Tv. From a purely business perspective, it was
important for BMP DBB to get to grips with a medium which could potentially
supplant one of their major sources of income (i.e. conventional advertising). He
stated that at that moment, they were not "100% sure" of what they were going to do
within the trial. They had, however, committed a team to work on the project.

First service nursery management group meeting

There was to be an initial meeting of the management group later that week (May
19th). Eric Donaldson would propose my research to the group.
This meeting apparently went very much in favour of my participation in the user research programme, and in the user-research working group. The representatives of both BMP DDB Needham and NOP were described by Gary Nelson as being "very grateful" for the offer of help and input from my study. The main objective of the user-research working group would be to develop a number of questionnaires and to define the points at which users were to be contacted. Moreover, the group would be responsible for ensuring that the approach was co-ordinated so as not to overload participants with too many interviews and questionnaires.

First user/marketing research working group meeting

The intention was that the user research/marketing group would have its inaugural
meeting somewhere between May 30th and June 9th, 1995. He suggested that I come
down and present a proposal to the group. The next I heard from him was on May 30
1995, when he phoned to say that I should not bother to come to the meeting, which
was the very next day. He would propose my work again to this group, and requested that I fax a one-page outline. The result, however, was ambiguous. He had presented the proposal, but had failed to get formal agreement with the consortium regarding access to the users - "this secures your position, once we have minuted agreement." He expressed that this was something he wanted to get resolved and formally pinned down on all fronts 'ASAP'. At this time, it looked very much as though I might be excluded not only from attending the meetings, but also from conducting trialist research.

More positive was the hint from Gary Nelson that it might be helpful if I and BMP DBB could start thinking about the ways in which our research interests and methods might be co-ordinated. I was particularly interested in whether they wished to run some test advertisements. I wondered if they had any ideas of how these ads would look, sound and interact, and what their preconceptions were regarding who would interact, with what, and when. Their plan for the immediate future was indeed to run
some test ads, in fact this was "one of the main purposes" of their involvement in the
trial. At this time (May 22nd 1995) they only had a couple of rough ideas for specific
brands at present, because in many cases they were waiting to get approval from
clients to get involved in (and hence pay for) the development of interactive ads.

They claimed to possess several research surveys showing that people's
willingness (and ability) to interact was heavily biased towards the younger, techno-
literate generation, indicating that confidence with, and usage of, technology is
greatest amongst the young. Their interests in the research seemed genuinely open,
and the notion of basic research into how people were making sense of the system
greatly appealed to them.

Gregg Rymes of BMP DBB seemed aware of the limitations of conducting early
research using demo and trial systems, he was aware particularly that "any research
at this stage may not give an accurate read of the longer term perspective." However,
he saw this as "unavoidable." He understood that "whether this is good or bad for the
effectiveness of the medium in communicating commercial messages would depend
entirely on the particular advertising objectives in each case." His outlook was very
much that;
"If you are trying to communicate a detailed, rational message, such as the
benefits of a particular life assurance company, then I would agree that
this message might well get lost as people are caught up in the sheer
novelty of clicking between screens and selecting icons. However, if you
were simply trying to create a leading edge, exciting brand personality for,
say, a soft drink, I'd argue that it almost wouldn't matter if every detail
didn't get through. The communication would be by association: 'this brand
is talking to me in an exciting and novel way, which makes me feel that it's
an exciting and novel brand'. In a case such as this, you could say that
the medium is the message."

My belief at the time was in the need for basic research, viewing i-Tv as a fundamentally new medium and domestic phenomenon, and getting a handle on how trialists would make sense of it. This would serve as a suitable background on which to base assumptions about the effectiveness of i-Tv system components (in this case interactive ads). Some commentators have suggested that it was precisely the lack of basic user understanding that delivered the death knell for the domestic use of Prestel.
Changing orientation of the user research

This discussion with Gregg Rymes of BMP DBB was significant for the research. It now became apparent that working within this environment of PSPs suggested a greater shift towards the interpretation and relevance of the system's content material, over and above any evaluation of its technical functionality. This changed the orientation of the research, shifting the focus from usability parameters to a more general consideration of how usability was situated within a complex of interpretability and attractiveness of services, advertisements and interfaces. This
emphasised the contextual aspect of using the service - the core of the approach I was
developing during the 'silences' between Om and myself. During these periods I had
more contact with BMP and NOP. Their concerns were more focussed on
interpretation of content, and more general methodological issues than the usability
of the interfaces and hardware. However, Om did remain interested in usability
issues, and Eric Donaldson suggested that it may be in order for CITVIC to
commission the more lab based work.

I sent a communication to Gary Nelson and Eric Donaldson, requesting whether it would be possible to have a contact e-mail, phone number or address for the other members of the nursery group. I mentioned that I had had e-mail dialogue with Gregg Rymes of BMP DBB, and would very much like to introduce myself to the other interested parties. While not receiving a direct reply to my request, I did receive an invitation to the next user research meeting, scheduled for the 14th June. I was enthusiastic to attend the meeting as my interest was now in following the development of group attitudes and perceptions towards the user research. I was interested in any minutes from the first meeting, held at the end of May. However, it transpired that minutes were left to the group's own discretion; it was emphasised that they (Om) did not run the meetings, they merely chaired them. Approval for any movement regarding research implementation had to be confirmed by the group, and the only way to do this was at the meeting. The reply was that I should get in touch with the Cambridge Trial co-ordinator.
Science Museum

There was also an offer around this time to conduct research at an installation of the
latest stand alone STB in the Science Museum, as part of a BT sponsored exhibition
of new media. It was suggested that I visit a representative of the Science Museum in
order to co-ordinate some user research. This was difficult to organise within the
time frame which was given (I had about two weeks to do this), partly due to
excessive bureaucracy on behalf of the Science Museum (they had a considerable
protocol regarding research conducted within the premises). It led to a participant
observation of people using the system within the exhibition.

One of the most interesting observations was an addition to the demo on display. Nat
West had developed some content and this was included. It appeared that there was a
fault in the program, however, as when one accessed the service it became an
automated carousel possessing no functionality by which to stop or otherwise
navigate. It worked rather like switching on a promotional video. This
'disenfranchised' the user from any interaction, with the result that people's reactions were simply to move on and away from the demonstration. Another distinct problem arising from the particular idiosyncrasies of this display demo was the use of an alternative keyboard controller. This did not directly relate to the screen-based
instructions, whose design was optimised for the Om remote control. This also
presented users with severe problems. This was an important commentary on how
the technical functioning of the system could most definitely impact more symbolic
aspects such as brand and company identity. Recalling the earlier problem when
certain potential PSPs were less than satisfied with the presentational qualities of the
system to showcase their goods (creating a content problem for stand alone
demonstration boxes), here was a case where functionality impacted image. People
would perhaps not blame Om or even 'interactive television' for the problems in use, but rather Nat West or other service and content providers.

A final relevant point regarding the Science Museum exhibition: to my knowledge no one from either Om or the service nursery user research group went to see people use the box in situ. It presented a unique opportunity to view people's reactions to it en masse. It was the first truly public demonstration of the machine and could have reaped relevant and interesting data for the service nursery and the technology partners. The failure to capitalise on this opportunity made me aware of the different motivations regarding developing understanding of the technology and content, and also of the rigidity in how firms were considering the research.

The autonomy of the Cambridge Trial

Around June 1995, Gary Nelson was charged with the responsibility of producing a business plan for the Cambridge Trial. I received a request from him for data and/or articles useful for supporting his case for the growth of i-Tv. This seemed strange at the time, as I thought that Om would have possessed such information. For instance, they had an extensive press cutting service, and I had seen several reports on i-Tv around. But apparently this was something they were missing. This was the first indication of the breaking off of the Cambridge Trial from Om as an autonomous development and economic entity. I sent some articles that contained some basic figures from several market reports.

Meanwhile, Gregg Rymes at BMP suggested that there were "striking similarities" between my research objectives and the issues which BMP were keen to explore. He was interested in feedback on a two-page outline of BMP's research objectives. These were relatively vague and seemed to echo my presentation at the first user research meeting, and our previous conversations. He outlined an approach in which they would wish to speak to a range of different household types, on a number of successive occasions - very similar to the basis of my research proposal. However, one significant difference was that they did not feel as strongly about ring-fencing households in respect of their exposure to on-line questioning and the other techniques that NOP intended using to approach participants. He felt there might be some capital in discovering people's reactions to this form of questioning. My original intention (which was to have been implemented on the phase 1 users the previous October, until I learned they were all closely linked to Om) was to initially present user-consumers with a ramped series of questionnaires. The sequence in which these were to be administered was:
 'Filtering' questionnaires that would predicate possible candidates by household composition, computer literacy and gender (in that order).

 The next questionnaire went into more detail concerning media consumption and attitude to technology in general.

 The final questionnaire concerned issues related more to household sociology.

The basic objective of these questionnaires was an attempt to 'crystallise' previous and existing media consumption, as well as attitudes, opinions and interests relating particularly to media, leisure and the sociology of the household (a schematic sketch of the filtering sequence is given below). The data gathered by these questionnaires would be compared with further data gathered by more qualitative, ethnographic-style interviews (conducted over six months, or following the addition of a significant change to the system) and substantiated by passively gathered data drawn from system registration.
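The sketch below illustrates, in Python, how such a ramped sequence might operate: each stage narrows the pool of candidate households before the next, more detailed instrument is administered. The field names, screening criteria and thresholds are purely hypothetical and are not the actual questionnaire items.

# Hypothetical sketch of the 'ramped' questionnaire sequence; all field
# names and criteria are illustrative.

def stage1_filter(household):
    """'Filtering' questionnaire: screen by household composition,
    computer literacy and gender balance, in that order."""
    return (household["composition"] in {"couple_with_children", "couple_no_children"}
            and household["computer_literacy"] in {"low", "medium"}
            and household["has_adult_male"] and household["has_adult_female"])

def stage2_media_profile(household):
    """Second questionnaire: more detail on media consumption and
    attitudes to technology in general."""
    return {
        "tv_hours_per_week": household.get("tv_hours_per_week"),
        "attitude_to_technology": household.get("attitude_to_technology"),
    }

def stage3_household_sociology(household):
    """Final questionnaire: issues relating more to household sociology,
    e.g. routines and shared use of rooms and media."""
    return {"evening_routine": household.get("evening_routine")}

def run_sequence(households):
    """Apply the three stages in order, keeping only those households
    which pass the initial filter."""
    shortlisted = [h for h in households if stage1_filter(h)]
    return [(h, stage2_media_profile(h), stage3_household_sociology(h))
            for h in shortlisted]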

Second user/marketing research working group meeting

The second user research meeting was held at NOP's central London HQ. Present
were senior managers of NOP's media research department, as well as
representatives from Nat West, Om, the Post Office, and the BBC. I presented my
revised research plan (from the previous presentation to the Om management, some
six months earlier). This was well received, and encouraged the senior NOP manager
to offer the services of staff to help me conduct the project. However, it did seem to
encourage some resistance from the junior manager, who had self-appointed himself
as chairman of the group (originally chaired by Eric Donaldson).

NOP were to be responsible for the production of an on-line questionnaire which would appear as a menu item on the service. They made an attempt to capture from the group what questions would be relevant. This proved a complex procedure, as most members of the group wanted quite specific sets of questions to be asked regarding their business and their particular proposals for content and service material. Compromise was difficult to reach, and ended with the chairman indicating that each organisation should ask only one question each. Clearly, such an approach would severely limit the elicitation of useful knowledge.

I raised the idea that people were not joining the trial simply in order to be asked
questions. They had joined for reasons of curiosity, or some perceived benefit, such
as entertainment promises etc. I suggested that it was important to try and capture
general impressions of the service. This may indicate aspects of the service as a
whole which were strong and weak. Some may be technical (i.e. control and usability
issues) or to do with content (pleasing, informative etc.) or simply to do with the
'feel' of using. From this as background, more specific questions regarding aspects
of the system could be explored. However NOP's investment in the trial was
precisely to explore online methods of data production and collection. 438 They
appeared somewhat unconcerned regarding the content and substance of the
questioning.

Following this meeting, I contacted Gregg Rymes of BMP DBB on July 6th and indicated that I saw two levels of involvement in the research project: one was conducting the research itself, and the other was the setting up of an 'interpretative group' which would analyse and develop themes from the transcribed data. A senior NOP representative present said that NOP had 20 or so qualitative psychologists, and offered a more senior person to work with me or a less senior person to work for me in conducting the ethnographic study. I suggested that, due to the elongated time span involved, we should try to use people who could commit to carrying the project through to the end, certainly in the interviewing role. There was also a strong case for all ten houses being done by the same team, the team being small (no more than two or three) and gender balanced (i.e. male and female).

438
It is perhaps more accurate to say that they felt confident that they would develop an understanding of the trialists' attitudes, as well as experiment with new forms of online market research:

"Our role in the Cambridge trial will be to help research the i-Tv users' attitudes to the services available. We will also be developing ways of downloading market research questions to i-Tv users . . . We believe this new interactive method of research would be a huge leap forward for the industry and we want to be involved from the start." Director of Media Research at NOP.

Later that day, I also contacted Om (Eric Donaldson) to enquire how the recruitment was going for Phase 2. I mentioned that I had conversed with BMP DBB
Needham, and we were now wondering if the ten houses for the ethnographic study
could be chosen. I informed him of our intention of working collaboratively on the
project with NOP. He replied the following week to say that he had missed the last Market Research meeting, and was unaware of any notes taken from it. He was "completely unaware of any 'ethnographic study'". No one from either BMP or NOP had mentioned it to him. He further requested that whoever was most appropriate should call him if any action was needed from him; so far he had not received any requests.

This last comment was indicative of the way in which my first point of contact with
the trial had now quite evidently shifted from Om to the user research group, and
how the dynamics of this organisation (particularly who was in control, who to
contact etc.) were becoming more hazy. Seth Paladopicous, the chairman of the
group, was either other indisposed or was not forthcoming in conversation. In a
telephone conversation with him, he reiterated that "there were many interests
expressed by the group." I interpreted this as suggesting that only the group when in
session was able to establish policies and actions. He further reminded me that those
companies involved with CITVIC had paid to be members and to conduct research.
He seemed little concerned that BMP and NOP qualitative were anxious to go into
the field. From his perspective the group had not decided to go ahead with any
research, bar what they (NOP Media) had proposed. NOP Media seemed to hold the
reigns on not only on user group agendas, but also on interpreting the outcome of
group business (which due to the lack of minutes seemed impressionistic at best). I
further learned that NOP qualitative were a separate division of NOP, and as a
separate department, were effectively getting little in the shape of payment for this
study. It followed that they had little motivation to work or commit to the project.

There seemed to be a further crisis looming. As an act of desperation to get some movement and clarification regarding what was happening, I sent some of the dialogues that had been going on over the last two or so months between myself, BMP, and NOP qualitative to my original contact, Gary Nelson. In the light of having poor intermediaries in the form of Eric Donaldson and Seth Paladopicous, I wished to explore whether there was anything he could do to prompt action.

NOP Qualitative

As previously mentioned, I had already been in touch with Celia Smythe, who was associate director of NOP's qualitative division. I asked Gregg Rymes of BMP DBB if I could pass on some of our discussion material in order that she could get a handle on what we wanted to achieve. I further suggested that we have a meeting soon to discuss this approach. I also wondered if she knew any more with respect to Phase 2 recruiting levels.

From her I gathered that the launch of phase 2 was being put back to the end of
August 1995. She felt that this left plenty of time to get together over methods and
style of approach. I was becoming very conscious of how the organisational details
were beginning to dominate the user research in more ways than one. I was really
hoping that myself, NOP and BMP could open up communications, such as e-mail
discussions cross posted to each other. Here we could discuss issues regarding the
qualitative research programme.

BMP indicated that they were "very willing" to join forces with myself in the 'ten
households' project, and keen to open dialogues. They agreed that it was perfectly
fair to ask for consistent involvement of personnel throughout the trial (with the one proviso that BMP's position as sole advertising agency participant was limited, initially at least, to twelve months). Gregg Rymes was sure that NOP would wish to get involved, particularly in their capacity as co-ordinators of the market research subgroup. However, he was not so certain at this time whether they would actually 'get their hands dirty' with the fieldwork itself - they were mainly a quantitative research agency. He mentioned that he and his colleague might well also wish to supplement the 'ten households' project with further ad hoc qualitative projects to address particular issues, but if so, this obviously need not influence our continuous project.
He also mentioned that we were now at the stage where we should try to arrive at a
definite plan for conducting the qualitative research. He wished to know if I had
progressed things further since the last meeting – i.e. had I a set of ten names and
addresses? Additionally, they asked if I proposed to draw up a discussion guide for
the first session. They were very keen to agree the practicalities of how we would
conduct the research - if for no other reason than to set time aside in our diaries. It
was difficult to set this time, as it was becoming obvious that the circularity of the power relations which were forming was protracting relatively simple tasks.

He informed me that NOP had devised an 'establishment survey' which he felt was very similar to my own "media and leisure questionnaire". This was also designed to be sent to participants before deployment of the system. He presumed that I would prefer my own questionnaire to be used instead of NOP's survey, in order to keep the households' research burden to a minimum.

Group communications

As indicated, a distinct picture was emerging that the group communication flow was being greatly hindered during this time by people constantly 'passing the buck' to others. Gregg Rymes, on phoning Eric Donaldson at Om for an update on how recruitment was progressing, had now been referred to NOP for that information. There he spoke to Celia Smythe, who in turn had to speak with Sidney Green/Greg Paladopicous to get an update on the progress of recruiting trialists, and hence to give us an idea of likely timescales for conducting the fieldwork. One other, and most significant, event from my own perspective was that Gregg Rymes was of the understanding that Celia Smythe was proposing that NOP provide all the interviewers. From his point of view he supposed that this would "certainly save us the hassle of repeatedly travelling to and from Cambridge!" From my point of view this became a further threat to my efforts to access users.

I received word from BMP (11th Aug) to the effect that NOP seemed to be suggesting that they conduct the interviews. I emphasised the point that I sincerely hoped, in the light of having no direct access to users, that my request would be granted for the interviews to be audiotaped and for verbatim transcripts to be made of each of them. There had seemed to be some reticence on the part of Celia Smythe to do this because of resource and expenditure implications. As mentioned, NOP qualitative, as a separate department, were not being paid for this work (from Greg Paladopicous). She seemed to prefer a looser 'real-time' note-taking procedure. This involved interviewing people very casually and 'lifting' out relevant points or statements as they emerged. Such a procedure would severely restrict the possibilities of alternative or secondary analysis. I offered to transcribe the interviews, and requested that at least they be recorded. NOP qualitative's suggested note-taking might provide the level of analysis required for business use, but recording the interviews would leave open the potential for 'deeper' readings.

Gregg Rymes and Wilkins of BMP met with Celia Smythe on the 15th Aug. According to Gregg Rymes, it sounded like the recruitment of trialists was making very slow progress. Om had only 63 trialists signed up so far, far from the initial target of 100. NOP intended to 'ring fence' the first hundred for the quantitative research programme (the online questionnaires and sys-log data). This suggested that the qualitative research sample would have to wait for extra households over and above the hundred. Exactly when this would be was not at all clear at the time - potentially, it could mean that we would have to wait until late September or even October. Celia Smythe also indicated that Om were now aiming for 250 trialists rather than just 100, so recruitment would be continuing constantly.

Gregg Rymes expressed his hope that our having a dedicated qualitative sample
would mean that our respondents could also be ring-fenced from external influences.
Out of the meeting they felt that an ideal number of participating households would
be 16; an acceptable alternative would be 12. These would be divided between a
spread of household composition types (determined by NOP qualitative):

3/4: 'pre-nesters' (i.e. singles or childless couples)
3/4: households with kids aged under 10
3/4: households with kids aged 10-16
3/4: 'empty nesters'/households with grown-up children
They also hoped to include within these a mix of households who had been given cable TV channels for the first time as a result of participating in the trial, and households who already had cable or satellite. The expectation was that there would be a significant difference in attitudes towards and perceptions of i-Tv between the two (a sketch of how such a quota might be spread across the cells is given below).
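The proposed quota could be expressed as a simple allocation across composition type and prior cable/satellite experience. The Python sketch below is a hypothetical rendering of that grid, not NOP qualitative's actual recruitment specification.

from itertools import product

# Hypothetical rendering of the proposed quota grid: four household
# composition types crossed with prior cable/satellite experience.
COMPOSITION_TYPES = [
    "pre-nesters (singles or childless couples)",
    "kids under 10",
    "kids 10-16",
    "empty nesters / grown-up children",
]
CABLE_HISTORY = ["new to cable via the trial", "already had cable or satellite"]

def allocate_quota(total=16):
    """Spread the target sample evenly across the eight cells,
    distributing any remainder one household at a time."""
    cells = list(product(COMPOSITION_TYPES, CABLE_HISTORY))
    base, remainder = divmod(total, len(cells))
    return {cell: base + (1 if i < remainder else 0)
            for i, cell in enumerate(cells)}

# allocate_quota(16) gives 2 households per cell; allocate_quota(12) gives 1 or 2.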

Early analysis suggested that those who had already been recruited were heavily
biased towards the AB socio-economic categories. Apparently this was unavoidable
since the existing cable network passed through mainly upmarket neighbourhoods.
As a result, the sample was likely to be skewed towards more affluent households
rather than representative of a wider television public.

As regards the practicalities of arranging this project, the following logistical suggestions were made. NOP would provide two research executives who would cover at least half of the interviews within each wave of the research. The remainder was to be conducted by Celia Smythe herself, plus myself if necessary. I would be given the option either to accompany people on these interviews, or to conduct some by myself. Each interview would take place with all those members of the household who had used the i-Tv system, i.e. adults and children together. All interviews would be audio-taped, and I was to be sent the cassettes for transcription.

At some stage all the participating researchers would need to meet up for an
analysis/interpretation meeting. It was understood that I would want to produce my
own forms of report on the basis of this work, while NOP and we might decide to
summarise it more simply. NOP was also to provide a 'recruiter' - Phoenix Fieldwork Ltd - who would contact those households that were allocated to us, ask them to participate in our project, and arrange the times for the interviews.
It was also suggested that we send each household a very basic viewing diary for
them to complete during the week prior to each interview.

At a further meeting Celia Smythe made one or two observations, which were circulated via e-mail. These were:

1. We need a clear statement/list of objectives to work to

2. We discussed viewing diaries, are we going to go ahead and use these at all?

3. Resources, a bit of confusion here

a). NOP were able to cover off approx. 6 interviews per wave using
executives and this includes any that she may be able to conduct herself. The
remainder will have to be completed by other project team members. Bearing
in mind this is likely to be a 3 wave project she felt that the '6 interviews per
wave' offer is a considerable commitment of resources bearing in mind cost
implications and budget.

b) Recruitment. NOP can recruit the initial sample, but she felt that subsequent appointment-making and scheduling needed to be done by myself, so that fieldwork was controlled from a central point and a careful check was kept on timing.

Do phone or send e-mail if you have any queries. I've copied this
message also to Derek

Greg Paladopicous had also informed her that the next research sub-group meeting would be on 13th September at 10am at NOP Covent Garden.

My feeling with all this was that we were making some progress, but I wondered if there really was a need to segment the small sample, especially since it was not going to be the same researchers interviewing every participant in each wave.

On the 1st Sept. Gregg Rymes replied to reiterate that we would have to wait until Om recruited 100 households before we would have access to anyone. The perceived logic of this was to avoid overloading the trialists with a huge research burden, "bearing in mind that the first 100 will continually be interrupted with on-screen questionnaires, as well as being roused from their slumber at 2 am to explain why they watched an adult movie!"439

439
He was referring cynically here to a claim by a NOP representative that they would consider phoning people while they were watching programmes to ask them how they were enjoying it etc. This was contested at the meeting by the Nat West representative, who remarked that he might be watching adult movies with his friends on a Friday night, and if NOP phoned him then he would not be happy. This is a significant point to note. It indicates something of the differing attitudes which various firms had towards what was morally and ethically permissible to do with the trialists.
2nd User research meeting

Gary Nelson wrote on the 10th of Sept. to inform me of the next user/marketing research meeting. It appeared that my name had been omitted from the list of participants and he wished to find out if I had heard about it. I mentioned that I had, via the people at BMP (who had in turn heard of it through Celia Smythe at NOP).

The members of this meeting included representatives from NOP; Post Office;
Tesco, BBC; NatWest; Anglia Multimedia; BMP; Education Online (Acorn); ITC.
Leading up to this meeting I was contacted by Seth Paladopicous to say that he felt
that I would not get much from the meeting, as it would be concerned chiefly with
quantitative material. I indicated that I was interested in all aspects of the user
research, not only the qualitative work, and was interested in attending. The meeting
convened with people stating the names and contact addresses of the various trial
partners. Michael outlined what NOP had done so far in terms of 'research
development' and distributed packs that consisted of some 20 or so sheets of
statistical breakdowns pertaining to each of questions asked in their 'background
study'. The 66 households were represented within the study as both numbers and
statistical proportions.

From this data, an immediate problem was recognised by Nat West. There were few
trialists who had accounts with their bank.440 This seemed on the surface something
of a fundamental point that perhaps should have been addressed at the recruitment
phase. Trialists, to use the banking service (and of course to provide feedback on the
quality of that service – the motivation for any PSP participating in the trial in the
first place), would have to change banks or open a new account. People have
particular relationships with banks, and changing these would open an entirely new set
of questions regarding behaviours and actions.

More generally, there was consensus regarding the entire group's commitment to sys-logging
as an effective means of generating the participants' reactions to the system.
This was potentially a way of storing, processing and analysing large amounts of
valuable personal data and user behaviour data. If this system became standard then
the potential audience and user populations would be enormous. However, there still
seemed considerable work in terms of building the software tools which would be
able to analyse the raw data which was being produced by the system. Nat West also
raised some questions regarding the security issues, and stipulated that logging data pertaining
to their service would have to carry guarantees of privacy. They did not want logging
data regarding their service to be publicly shared knowledge, even within the user-research
group. The question of the qualitative research came to the fore, in terms of
it helping to augment information produced numerically by the system. There
seemed a unanimous opinion that the qualitative research should be implemented
ASAP.

440
“Our primary focus is to understand how consumers will take to this type of banking” Stuart
Chandler, Nat West’s deputy chief executive, cited in CSCI March 1995

In comparison with Greg Paladopicous's emphasis that this was a meeting dedicated
to quantitative research, the meeting was dominated by an interest in what trial
participants' reactions generally were to the i-Tv system. It was felt that the
qualitative research offered the most immediate avenue to acquiring this, and the
feeling was that it should progress ASAP. Greg Paladopicous indicated that NOP
were looking into this, and would send information out with respect to how it was
progressing. The Om representative indicated that it was unlikely that they would
raise any more than the 66 trial participants; they had simply run out of funds. In the
light of this, I put to the group that I was ready and willing to go and interview
at any time, and that we should not have to wait for quotas of participants to be
ringfenced for purely quantitative studies. In the light of developments, this appeared
welcome.

By the 2nd Oct I decided to get in contact with Gary Nelson again. I had phoned him
repeatedly and left messages on voice mail. I mentioned the outcomes of both the
previous user/marketing research meetings and the apparent agreement to conducting
the qualitative research without waiting for 100 participants. This communication
had some positive effect. He had spoken with Eric Donaldson, who had subsequent
talks with NOP and BMP DDB. These discussions had clarified the outcomes
somewhat. I was to contact the group through a further person - Louise Edgley at
Online Media, who was handling "recruitment and customer care."

From his understanding it appeared that NOP and BMP DDB were relying on me
to organise the research group and arrange the interviews. Although perplexed by
this reply, I stated that this was perhaps a good idea, since I probably had the most
full-time involvement. In respect of the question of interviewing users before they were
connected, he said that this was "trickier, but probably OK". This was something
that I had to speak with Louise about.

It was suggested that it would be helpful if I produced a plan for the Qualitative
Research activities, with the following sections:

Project Title
<title

Brief description
<brief description of activities and objectives

Motivation
<main reasons/incentive for the project

Technical approach
<overall technical approach of conducting the interviews, when, how often etc.

Deliverables
<detail key deliverables by way of research reports, enhanced questionnaire etc.

Outline Schedule
<key milestones with dates

Resource Requirements
<identify personnel involved (name, organisation), and any other materials or tools required to implement the project

(n.b. The above project plan should only be about 2 A4 pages, but I believe
this will help to focus the activity and will be extremely useful to present a
solid project for this activity to the PSP partners.)

I wrote back the same day confirming that I would produce this document ASAP.
However, it did seem to be structured rather like what had already been sent several
months before. I was more optimistic that we were indeed witnessing movement.
However, phoning Louise Edgley I found that she too was very difficult to get in touch
with. The next communication was from Gregg Rymes (Mon, 23rd Oct)
saying that he understood that Eric Donaldson at Online Media had now given
formal approval for the qualitative research with trialists to begin. The next step was
presumably to contact the trialist households and arrange convenient interview times.

Speaking later that day to Eric Donaldson I was reminded that any decision to move
on this issue was up to the group, who after all were paying members of the service
nursery and who were, by this time, largely funding the trial. It was clear that I would
have to get the endorsement of BMP, who were subscribers to the nursery, who would
then get in touch with Donaldson. Donaldson then sent an e-mail to Seth Paladopicous to say
that he had chatted with someone at BMP that afternoon regarding the initial "first
reactions" qualitative research that was agreed on at the last Research working group.
He suggested the following action plan:
"Louise Edgley at Om has already sent all of the dates that households
finally went "live" to Melanie Cook (at NOP).

If John, Laura and Derek decide (based on whatever criteria they wish, but
including a minimum period of having been live) which families they want to
talk to (10 was suggested), and some possible dates for the interviews, and
tell Louise.

She will check to ensure that the families haven't e.g. just had 3 press
interviews in the last week (it's starting to happen!!) and co-ordinate dates
with the families - most of them now know her.

We have also now had a number of comment forms back from users - these
can also be made available. If any of these refer to particular PSPs services,
then they are automatically passed on, but so far most are pretty generic.

I suggest someone (Derek?) examines these on behalf of the group and
produces a written summary/analysis - rather than circulating them to
everyone."

The next contact was the 29th Oct, when I again spoke to Eric Donaldson by phone.
He informed me that I was no longer welcome to turn up at user/marketing research
board meetings. Apparently, "several members" had expressed their concern about the
way in which I was "holding up the research process". This was contradictory to my
experience in the group and to direct communications with some of its members (BMP
DDB Needham and NOP qualitative). He indicated that it was NOP's Seth Paladopicous
alone who had been responsible for the allegation. His reply was that I would be
involved with the research, but that the best plan just now was to let the group go
its own way.

On the 9th Nov I received a copy of the mail which was sent out to Research
WG members:

QUALITATIVE RESEARCH
====================

It was confirmed at the Research Working Group meeting on 1-11-95 that
qualitative research should go ahead as soon as possible.

I agreed that I would detail the approach we discussed, as follows:-

NB - Part of the process is designed to ensure that the identity of families
c.f. their detailed sys-log data is never revealed to PSPs.

1) NOP to supply PSPs with anonomised demographic data on all families
whose systems are now live.

2) Om to augment that data with dates on which the system went live for
those families.

3) All PSPs to advise Jon Wilkins of BMP of the demographic (or other)
basis on which they would prefer to select (say) 10 families for a face-to-face
interview, as well as the issues they would like to have investigated during
these sessions.

NB - No target date for this was agreed at the meeting, but I suggest that
everyone gets their input to Jon no later than Friday 10th November. As
usual, no objection implies consent.

ACTION - All Research WG members

4) Based on this, Jon, Derek Nichol (Edinburgh University) and Celia Smythe
(NOP) to produce a *very* sort outline of how they propose to carry out the
interviews and what they hope to illucidate, for a quick "OK" from the
Research Working Group.

5) They select target families from the data supplied by NOP and Om (by
code number) and pass to Om, who will (a) check that there is no reason why
a particular interview should not be done (only in very exceptional
circumstances) and (b) coordinate arrangements with the families.

6) Interviews to be coordinated between and conducted by Jon, Laura and
Derek, who will report back to the Research WG ASAP after completion.

Eric Donaldson
9-11-95
[note; the above is a direct copy of the e-mail received and includes original
typos]

Will Colin got back in touch to enquire if I had received the e-mail sent 3 days before
(dated the previous Thursday, 9th Nov). This apparently gave the go-ahead for the qualitative
research once the PSPs (i.e. Tesco, the Post Office, etc.) had given their views on
the types of people they would prefer us to talk to. He said this should happen by
that Wednesday (the 15th Nov). They were also indicating their frustration with the
lagging research process: " . . . perhaps we might be starting to talk to people at long
last."

There was no reply to this request, and Jon Wilkins had a further meeting with Celia
Smythe on Dec. 7th, again to discuss how to get the interviews underway. Reiterating
my points which had been sent out some months before, I sent a letter detailing what I
considered important in the approach to the users. I stressed the requirements for the
sample, and suggested the contextual usability use categories as a blueprint for the
questions asked in the interviews.

The sample

On the 14th of Dec, I received a copy of the mail sent to Jon Wilkins at BMP from
Eric Donaldson. It basically detailed Om's suggestion of trial households. At this
time they had not been contacted. Louise Edgley at Om was to schedule the interviews
for us. The number denoted the identity of the household.

F002 2 adults, both working. No children at the address.
F010 Couple, 2 children (1 boy 1 girl). She - solicitor, he - local Councillor
F011 Couple. 3 older children 1 of which is studying.
F016 Couple, 2 children.
F018 Young couple, not married, both working
F019 Single male, working.
F025 Couple, both elderly & retired
F028 Couple, both elderly & retired.
F033 2 adults, single, studying
F040 Couple, 1 child age 13
F071 Couple, Both working.

On the 19th Jan 1996, Eric Donaldson informed me that they had not started the
interviews as yet, "but they should be close...!". He also had an interesting
proposition regarding the SYS-LOG data. They had been "pouring over the
voluminous stats" that they had been collecting, and he wondered whether I would be
interested in "trawling the data to try to extract behavioural meaning?" He wished to
"talk through the implications - like do we have all the tools we need, data
protection, etc." He attached an e-mail to illustrate what he was thinking of:

Fri, 19 Jan 96

Imam,

How much work to put something together to produce a sort of pseudo-replay
facility, that runs through the sys-log file and puts together a storyboard
for a particular STB, e.g.
Family number 20045

12/11/95 10.34:23 Box booted 1:05
         10.35:28 PIN entered 0:17
         10.35.45 Main menu 0:04
         10.35.49 Leisure 0:09
         10.35.58 Go Fishing 12:23

etc etc

Last column is the time in minutes and seconds spent in that level or video
etc.

I know you don't have all the info implied above, but we should be able to
get fairly close.

What do you think? Would be very useful in examining the bahaviour of a
particular family.

Alan

PS - If you can crack that, how about a real replay facility!!!!

This was a significant development within the user/marketing research story, as it
seemed that now I was also to be presented with the quantitative data of the trial.
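
To make Donaldson's proposal concrete, the kind of processing being described might be sketched along the following lines. This is a minimal illustration only, not Om's actual tool: the log format, field names and family data shown are hypothetical, since the real SYS-LOG schema was never supplied to me.

# A minimal sketch (in Python) of the 'pseudo-replay' idea: read
# (timestamp, event) pairs for one STB and derive the time spent at each
# level as the gap to the next logged event. The log format and events
# below are invented for illustration.
from datetime import datetime

raw_log = [
    ("12/11/95 10:34:23", "Box booted"),
    ("12/11/95 10:35:28", "PIN entered"),
    ("12/11/95 10:35:45", "Main menu"),
    ("12/11/95 10:35:49", "Leisure"),
    ("12/11/95 10:35:58", "Go Fishing"),
    ("12/11/95 10:48:21", "Main menu"),
]

def storyboard(events):
    """Yield (timestamp, event, dwell) tuples, where dwell is the time
    until the next logged event (None for the final entry)."""
    parsed = [(datetime.strptime(ts, "%d/%m/%y %H:%M:%S"), name)
              for ts, name in events]
    for (t, name), nxt in zip(parsed, parsed[1:] + [None]):
        dwell = (nxt[0] - t) if nxt else None
        yield t, name, dwell

def fmt(delta):
    # Format a timedelta as m:ss, or "-" when no following event exists.
    if delta is None:
        return "-"
    mins, secs = divmod(int(delta.total_seconds()), 60)
    return f"{mins}:{secs:02d}"

print("Family number 20045")
for t, name, dwell in storyboard(raw_log):
    print(f"{t:%d/%m/%y %H:%M:%S}  {name:<12} {fmt(dwell)}")

Run over a genuine month of logs, such a storyboard would be the raw material for the kind of behavioural 'trawling' Donaldson had in mind; the analytic work of inferring viewing behaviour from it would, of course, remain.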

Eric Donaldson had checked Om's proposed list of families to interview, as requested
at the 4th user/marketing research meeting in early January 1996. They did seem to
represent a good spread of usage from heavy to light. Minimum non-zero monthly
usage was in the range 50-100 minutes, while the maximum was over 2,000 minutes. He
recommended that we move ahead with interviews using this list (assuming Louise
could persuade them all to agree). He also requested that we decide how and when
we would like to do the face-to-face research. This was now to be co-ordinated
through Jon Wilkins, as agreed at the research meeting. Jon was to get back to him
ASAP with the group's view.

There was to be a fifth meeting of the user/market research group on Tue, 19th Feb
96. This was where Eric Donaldson was to ask the group's permission for me to
conduct the analysis of the quantitative data. However "the buggers cancelled it at the
very last moment!", and this entailed a further hold-up on that front. He was, however,
going to fax them the next day and say "I intend to do this - react by xxx or else."

By the end of March there had still been no movement on this (as there had been no
research group meetings). However, two Om staff, Alice Hodges and Imam Khan, had
finished off the software to analyse SYS-LOG data. Alice had been testing it out
from the point of view of "what can we infer?" And the "answer seems to be - quite a
lot . . . she was running an interactive workshop for some of the companies today,
and that's shaping her ideas on the best way to go. I'll get her to give you a call in the
next few days (to brief you on experiences and directions to think in) and see if we
can get the data and tools shipped up to you when she's despatching the next batch to
the partners." I never received any phone call, nor did BMP receive any of this data.

The next communication was in May, when again Eric Donaldson indicated that
Alice Hodges would be sending me the tools and the data shortly, and that she would call me. It
was also stipulated that before she despatched these, there were "a few conditions
you'll need to "formally" sign up to for commercial and data protection reasons." Eric
Donaldson indicated that he was leaving Om shortly - and that Alice would be my
main point of contact from then on. To date I have had several phone conversations with
Ms. Hodges, who has indicated that she must secure permission from her up-line
manager at Acorn to send me the data.

The user research

The qualitative user research was finally conducted in collaboration with personnel
from BMP and NOP Qualitative. Participants on the Cambridge Trial were
interviewed between the 23rd and 24th of July 1996. The 11 (of an intended sample of
12) households were selected from the 66 participants in the trial. Phoenix Fieldwork
Ltd handled recruitment. The shortened interview transcripts and their analysis are
included in appendix 1.

The objective of the qualitative user research was to understand the trial participants'
understanding of the technology: to uncover something of the way in which
participants came to learn of the trial and the technology, and of their interaction with
the content and services.

The sample was recruited by a London-based fieldwork recruitment agency used by
NOP - Phoenix Fieldwork Ltd. They were responsible for contacting those
households that were allocated to the qualitative study, asking them to participate in
our project, and arranging the times for the interviews. In the case of my interviews
they had unfortunately failed to confirm the interviews on the day, thus on arrival at
one of my households at the appointed time the interviewee was found to be out. The
other interviews, although completed, had to be rearranged from the scheduled times.
This was because people were not available due to their own rearranged plans,
family in hospital, and one household busy making dinner for their lodgers.

The 12 households were divided equally between the three participating
research teams - myself, NOP and BMP DDB. We each had a discussion guide/checklist
which was drafted by myself and BMP DDB. This was of importance in this study
due to the use of multiple interviewers. Interview guides provide some element of
standardisation across interviewers, whose approaches, manner, appearance,
communication abilities etc. may vary (Robson, 1993: p236). The guide had emerged
from the sharing of issues (see previous chapter), and from substantive work I had done in the
way of questionnaire development. The intention was that this would be
implemented in an open-ended way to promote discussion of the use of the system.

In each case, all of the family members present were presented with the questions. Answers
were spontaneous and sometimes negotiated and/or contested. Most households had
been contacted by Om in April 1995, and connected in September 1995.

There are several themes recurrent throughout the case studies. Broadly these can be
broken down thus:

 Lack of content drove inactivity with the system.

 Regardless of problems with content, most people still saw value in such a
service, providing there were programmes which would appeal.

 Most people saw advertising as inevitable; however, interactive
advertisements were difficult for them to grasp or imagine. As a concept they
seemed to appeal, providing they did not interfere with the programme.

 i-Tv would not impact on their choice of viewing; rather, the flexibility it
afforded (i.e. on-demand programming) would enable them to view when
convenient, dependent on other leisure-time activities.

 It was quite obvious that the family homes differed from each other in terms of
their uses for television, and that these homes differed from each other in their
extra-television activities, in ways which would be relevant for the consumption
and use of particular services (i.e. some services simply did not 'exist' for certain
households).

 It was quite clear that interactive radio was of little interest to interviewees.

The qualitative interviews conducted with trialists reaped little in the way of directly
relevant data suitable for innovation either of content or of technology.441 This is why,
rather than being accented within this study, they have been relegated to an appendix
(appendix 1). The main problem was clear and straightforward - there was a lack of a
cohesive service, useful or entertaining enough to motivate use. A symmetrical
notion of evaluating design (intentions) and use (interpretations) was completely
unbalanced by the multitude of intermediate technical, financial and organisational
problems that plagued the trial.

Conclusion

The changing contingencies of the trial - who had stakes in what, their own drives
and motivations, their overall interest in making the trial work, and what their
particular learning objectives were - impacted both the implementation of the research and the
crystallisation of the working 'images' of technology and users.

This chapter showed the importance of understanding the organisational,
administrative and management contexts of trials. What eventually evolved as user
research from the constituency of the Cambridge Trial was definitely hindered by the
institutional structures that emerged. Generally, the order of the day was 'learning by
doing' or 'learning by struggling'. This was practised with respect to technical as well
as social innovation on the trial. Governance of the trial was lacking, consisting of
little more than ground rules produced by Om's Eric Donaldson for the working
groups.

441
This is not to say of course that the study was redundant in itself. It could have provided a
more substantial basis for further research taking into account wider perceptions of media and
domestic technology.

One of Om's main priorities, for instance, was making the trial economically
autonomous, whilst simultaneously developing diversified markets for their
technology (including the RISC PC as the sole means for authoring i-Tv material).

Nevertheless, there seems a rich lesson to be learned in articulating the social
elements of their constituency-building effort, in addition to their technology
development. However, this often has no resources afforded to it, and at times in this
particular case it appeared as if Om were treating the social construction much in the
way in which they would develop software system architectures. However, social
solutions leading to the successful development of new media partnerships seem
more tied to existing structures, and considerably slower in their movement, compared
with the tremendous pace at which technology solutions appear. Markets and
business and social solutions are also more complex than many models suggest. This
seems particularly important in the manufacturing of new media, where the
technology is an arbitrary, or at most a facilitating, part of the overall
usefulness and value of the system.442

Innovation often results from a unique configuration of many parts and functions, which
independently may have been developed for entirely different end-purposes and uses.
Many firms are working towards a 'mass' age of new media. However, the
Cambridge Trial was a particular socio-business experiment. Each trial, even if it
were using the same technology, could be entirely different in nature, reaping entirely
different results based on social and cultural contingencies. While trials may have
reasonably clear and well-defined objectives, they remain technically and socially
experimental. The elucidation of results, and the ways to achieve them, often seem ill-defined.
Even in the early experiments in i-Tv, such as Winky Dink, cited by John
Carey (1996), there were problems with children drawing on the TV screen directly
rather than on the prepared screen cover they were 'supposed' to use.

442
This evokes notions of the debates that continue between those advocating that ‘technology is king’,
those who claim that ‘content is king’, and others still who see that ‘connectivity is king’.
My own view follows Saffo (1993), who claims that ‘context is king’ - i.e. it is all the elements together which
need equal attention in the creation of viable new media. There is little benefit in having a top-quality
hi-fi amplifier if one only possesses a cheap pair of speakers. Likewise, having a powerful STB is of
little benefit if one has no content, or if one cannot connect enough homes to persuade major content
providers to deliver.

Simon (1969/1996) draws attention to the question of how a simulation can generate
new knowledge. Simulation is used to understand and predict the behaviour of systems.
To a large extent, and as was implied through Om's focus on technology in chapters
5 and 6, the users were viewed almost as intelligent parts of the system. They were
viewed as data generators, from which inferences were to be made regarding
tweaking the system, its look, its offerings, its functionality and so on. The trial
content was only ever a demonstration. It had many promised features that never
materialised and this was the single most represented piece of feedback consistent
across all interviewees. Simon relates two assertions about computers and
simulation:
 A simulation is no better than the assertions built into it.
 A computer can do only what it is programmed to do.

Applied to the Cambridge trial, this may be taken to imply the point of view of
Om, and even of the companies involved in the trial, that since it was only a simulation
- a demo - no useful information could be drawn from research. The user
research has offered a glimpse into the often non-rational, very diverse and
individualistic lifestyles which are recognised within consumer research, and which are only
now being recognised by those who develop technology that is to be situated
within domestic locations and real lifestyles.

Erlandson et al (1993) point out that in naturalistic research analysis is continuous,
in that the "analysis of data interacts with the collection of data." (p.130) This suggests
the flexibility inherent in this approach. Subsequent interviews are shaped by what
has been learned from previous encounters; interviews in process may change with
respect to what is being offered by the 'co-researcher' – the 'subject' of interpretivist-style
studies. New opportunities for data collection are seized upon as they occur and are
considered relevant. Such an organic approach maximises the research process
within real-world and often chaotic circumstances. Research and analysis are never
fully complete; there is always something, some angle, which was not fully exploited
or explored. This was the case in the user research of the Cambridge Trial. The
research process was in the end compromised. The reasons for this compromise
may be summarised thus:

1. The employment of a semi-structured interview schedule. While this permitted
some degree of standardisation of the results, it also challenged the notion of
natural trajectories within the interview process. As such, interviewees were guided to
irrelevant questions (such as being asked about their impressions of advertisements
which were not on the system), and were generally swept along with the pace of the
questions as laid down by the schedule. There was some evidence of answers
being 'invented' or 'forced' for questions.

2. There was a definite lack of consistency between the interview styles of
interviewers. This resulted in some interviewee answers being closed down on
points they perhaps wished to emphasise, possibly due to the interviewer's notion
of what was relevant or non-relevant to the discussion. Different interviewers
have different ways in which they communicate with people in intimate places
such as their homes. It is quite easy to imagine that some researchers have
particular talents for making interviewees feel relaxed and open, free to present
their 'genuine' impressions of phenomena, whereas others may unconsciously act
to inhibit the free flow of thoughts and feelings regarding subjects. This is an
unforeseeable problem which nonetheless must impact much of human subject
research, and is itself an artefact of insurmountable individual differences and
experience.

3. Logistical difficulties plagued this project due to the inclusion of a third-party
firm for arranging interviews. As noted, I experienced difficulties (fatal in one
instance) with my interviews, and it was only down to luck, and the flexibility of
myself and the interviewees, that it was possible to reschedule and fit in the interviews on spec. It is
not unreasonable to imagine myself flying down to Cambridge on that day only
to return with no interviews whatsoever. The use of such interviewee recruitment
agencies seemed common practice to NOP, who obviously use this company on a
regular basis.

4. Semi-structured interviews presume something of the communicative abilities of
the interviewees. Those who are more 'vocal' and can articulate in a much
richer way than others may tend to dominate at the level of analysis, particularly
when this is done at a casual level. What was indicated from NOP was that the
interviews would not be transcribed, and that for their purposes it was
only necessary to lift out sentences from listening to the recordings. Such a
method may leave itself open to reporting the feedback of certain
interviewees over the subtler, but nevertheless relevant, feedback of less articulate
or outspoken interviewees. Such a problem is of course framed within the larger,
more pervasive difficulties of the interview process as a social science research
implement, but attention to questions of interviewee articulation should perhaps
be used to frame each of the interviewees' responses. This could be derived from
realising the benefits of a more discourse- rather than simply content-oriented approach at
the analysis stage.

Marcus Penny's view was that within the relations of user to marketer there lay an issue -
that of bringing the user's interests into the process in an appropriate way. In particular, the
issue that Penny faced was financing some consideration of the user's interests at all.
Everything that he did had to be financed and justified in some way. At that moment he
was financing and justifying it on the basis of service providers and Om learning
from the process of the trial, and the lessons that this taught in terms of how to build
businesses in the future. What value was there to service providers in consumers'
interests? Which institution or firm would pay for this?

One could imagine that a public sector institution such as the Independent Television
Commission (ITC), or some of the consumer watchdogs, could become interested.
They might be better placed to conduct independent research. Penny felt that this was a
generic problem with all products and all services, in that the users in the end do not
finance it. At the creation stage you have to deal with the people who are
financing it; these are the people who are actually interested and engaged in its
development at this point. They are the people putting their time, effort and
investment in on the basis that they produce services for users. Out of that there is a
motivation for them to get a real understanding of what the users actually want.
Penny thought this was something that you could sell to them: if they understood what
users really want, they would do better in the provision of services.

This seems to be a classic shortcoming of group decision-making processes -
some processes apparently give rise spontaneously to good products, as was the case
with the original demo STB. A worst-case scenario is also possible, however, where a
bad product emerges which fails to satisfy both the collective and the individual
needs of the group. This may be true of the user/marketing working group case,
where the product was a research approach; the difference is that this product is knowledge,
and not a purely technical system which either works or does not.

The notions of 'users' and 'consumers' were an inextricable part of the transactions
which took place between the original project team and senior managers and funders.
The former were convinced that there were indeed latent mass demands for
interactive services, while the latter felt that only a trial could illustrate fully the
technical potentials and credibility of the technology and concepts. Users featured
strongly again when they became collateral in the transactions which took place
between Om (and those responsible within Om for the trial) and potential PSPs.

Bound up with the notion of developing and learning the core competencies needed for
providing interactive content and services, PSPs invested in order to learn of the
organisational problems involved with trials, and also to learn what 'average'
consumers would make of the system:
"The presence of NOP (National Opinion Polls) on the Trial has facilitated the gathering of
detailed user feedback. The initial data showing usage of services by Trial participants, along
with their reactions to their experiences constitutes a goldmine of information for other
companies wishing either to participate in other i-Tv Trials or to provide content or services .
. . Indeed as such it allows the consortium to evaluate the revenue potential of such services
for roll out in a wider context and even for eventual commercial deployment on a regional or
national level." (Om promotional literature)

Identified as a crucial part of the learning process of the trial was the need for firms to
understand and gauge the impact of their individual presence on the system. As such,
the 'public' stage users (as opposed to the designer-users) were to a degree
'commodified', as user access and research was added as part of the value enticing
companies to join the service nursery in the first place. Access to such information
was unfortunately mediated by a dysfunctional group, which was perhaps indicative
of some of the deeper problems of information flows, management and governance
involved with the trial as a whole. Clearly, there was not enough effort (or probably
not enough resources) invested in building the sociotechnical constituency of the trial.
Chapter 8 – Conclusion

An exploration of the basic processes of product and service development
show that users and manufacturers tend to develop different types of
innovations. This is due in part to information asymmetries: users and
manufacturers tend to know different things. Product developers need two
types of information in order to succeed at their work: need and context-of-
use information (generated by users) and generic solution information
(often initially generated by manufacturers specializing in a particular type
of solution). Bringing these two types of information together is not easy.
Both need information and solution information are often very “sticky”—
that is, costly to move from the site where the information was generated to
other sites. As a result, users generally have a more accurate and more
detailed model of their needs than manufacturers have, while
manufacturers have a better model of the solution approach in which they
specialize than the user has. (von Hippel, 2005: p.8)

"It is not strictly civilisations that rise and fall but rather the ability of
succeeding generations of people to live according to the inspiring ideals
and laws of that civilisation. Surely the material artifacts are born and
decay; the architecture, viaducts, irrigation schemes or even simple
drinking vessels rise and fall in quality, effectiveness and beauty. The
inspiration and dedication behind that skill, the coherence of that society,
these are the determining factors, and these lay in the permanency of a
canonic understanding . . . Canonic law is based on the objective fact that
events and physical changes which are perpetual are nevertheless
completely governed by intrinsic proportions, periodicities and measures."
(Critchlow, 1981: cover note)

"Interactive TV will transform the way we manage our lives, the way we
work, rest and play, the way businesses market and sell their services. It
will change the rules forever."
http://www.ntl.co.uk/interactive-tv/default.asp (27/10/00)

We are transforming our world at an alarming rate and in so doing, we are
alienating ourselves from it. Our technologically mediated existence is
threatening the very democratic process itself. We need to develop a new
language, a new literacy, in order to both understand our brave new world,
and learn how to live a meaningful existence in it (Dakers 2006, 1).
Dakers, J. (2006). Introduction: Defining technological literacy. In J. R.
Dakers (Ed.), Defining technological literacy: Towards an epistemological
framework (pp. 1-2). New York, NY: Palgrave Macmillan.
Introduction
The much heralded 'age of network computing' raises some interesting issues for the Cambridge iTV
Trial. As access to and use of the Internet becomes much more widespread than ever before, so the
demand for bigger, better, brighter, faster and more dynamic services will grow. Content and service
providers will want to respond to this demand by enhancing their services which are currently
constrained to what the Internet can deliver.

This consumer and service provider 'pull' will really make the network operators have to take the
subject of additional bandwidth provision very seriously indeed, particularly when coupled with
demand for yet another potentially very bandwidth hungry interactive multimedia application -
advertising. This is a big money business and some of the leading players are ready to commit to
investing significantly in development.

These arguments are not difficult to justify; they are predicated on the simple fact that human nature
dictates that no matter who we are or what business we work in, we always want better and more.

Where does this lead us? To fully interactive multimedia - including interactive TV. Full motion
video, audio and complex graphics are all highly desirable elements of a truly interactive system. But
why have such services not seen wider commercial deployment? You guessed it. The bandwidth cost
argument. The stimulation of the demand for bandwidth on a variety of fronts will give the network
operators a much more solid - and much needed - business case on which to base their network
infrastructure upgrade strategy.

So what next for Acorn Online Media and its partners on the Cambridge interactive TV Trial? Keep
an eye on developments. It's going to be exciting.

Phase Three of the Cambridge iTV Trial: The Network Computer443

In 1994, for example, there were fewer than 3,000 websites online. A recent book
based upon a conference celebrating 50 years of computing and the Golden Jubilee
of the ACM - the Association for Computing Machinery - features articles from major computing
industry spokespersons. In Beyond Calculation: The Next Fifty Years of Computing
(Denning and Metcalfe, 1997) are articles that focus upon technology,
communication infrastructure, business and human-computer interfaces. What is
remarkable within this volume is the reticence exhibited by authors to cast future
scenarios. Rather, the tendency is to speak on current or near-future developments, or
to refer very tightly to 'sure bet' technological advances with reasonably clear 'path-dependencies'.
Other chapters are revisionist, preferring to comment on the radical
nature of developments in the previous 50 years.

443
http://www.acorn.co.uk/aom/trial/phase3.html (update as of 28th October, 1996). Unlike the
written word, which when published and released to the public domain becomes ‘carved in stone,’
web-based information can be easily changed and updated. Information found at the same web
address can change. This is the case here. The ephemeral and intangible nature of web sites and their
contents poses a problem for studies which intend to include their contents as data. Whereas the so-called
company ‘grey’ literature may count as tangible and veritable evidence for claims made by the
company at some time, this is a much more difficult task with respect to digital and web-based
material. This is not just a problem for academic research, as Internet lawyer Anne Branscomb states:

“The ease with which electronic impulses can be manipulated, modified and erased is hostile
to a deliberate legal system that arose in an era of tangible things and relies on documentary
evidence to validate transaction, incriminate miscreants, and affirm contractual relations”
(Branscomb, 1997: p.113)

The text which opens this chapter was taken from the Acorn Online Media web page
at the time when the Cambridge Interactive Television Trial was drawing to its end
in October 1996. As such it heralds the end of a particular period in the development
of broadband interactive television in the UK, and indeed the world. At this time
focus was shifting generally from broadband solutions to the Internet as a more
immediate provider of digital services to the home. It also hints at a dignified
compromise with respect to an overly ambitious technological project. The
Cambridge system began with a large-scale vision that comprised distinctive
technological, institutional and social elements. This was a vision of radical change,
impacting not only the way in which television content could be reconceptualised in
terms of its format and delivery, but also how people would access and use
media and new kinds of online services within their everyday domestic routine.
Principally, it would harness the new creative potentials of digital technology and
networks to create an 'on-demand broadband network'. The technical potentials of
this, with their 'inherent' commercial potentials, were viewed as both enormous and
self-evident. The trial and its partners would spearhead the spawning of new content
and services; they could lay rightful claim to the notion that they were 'cutting edge'.
The trial was viewed as a series of three phases or steps towards a new kind of
interactive mass market. This is a technology era where 'learning by doing' is a
prerequisite.

But the digital era of business is volatile and dynamic. Changes in the landscape of
regulation, standards and competition create exigencies that demand immediate
appraisal, or change in strategy. The question of how to respond to and prioritise change
within complex webs of partner, client and market demands threatened plans and forced
new directions. This was certainly true in the case of the Cambridge Trial.

Over the duration of the trial, technical and strategic directions shifted to
accommodate more immediate business prospects in providing, from the system
perspective, 'lower-tech' solutions. These would nevertheless utilise their STB
technology and would let them meet their business plan projections for that year. At
the same time they were faced with an overwhelming succession of costly technical
problems in maintaining and addressing the needs of the 'higher-tech' Cambridge system.
Exacerbating these problems were difficulties in recruiting trialists – those who were
to act as surrogate users and consumers on the system.

There then came the promise of a very significant prospect of working with US giant
Oracle to produce the reference designs for the much promoted 'thin client' or
'network computer'. The potentials involved in this offer had the effect of
transforming the Cambridge Trial completely from its original functional aims and
objectives. Originally, these were not to include access to the Internet as a sub-
function of the system. The system that emerged at the end of the trial was one whose
only function was TV-enabled Internet access. In some sense the technological
ambitions of the original vision had regressed towards a much more manageable and
prospectively more consumer-desirable and consumer-possible solution. The main
reason for this is cited in the above quote as cost. In reality a much richer range of
factors was at work, which I have discussed at length in earlier chapters.

It can hardly be stressed enough that this book represents the product of a unique
window of opportunity: an academic researcher gaining access to the rich and
pressurised contexts and processes of a commercially sensitive industrial event. The
Cambridge Interactive Television Trial has come to be widely recognised as a
prototypical example of the meshing of vision with the business, technical, human
and social contingencies of the new media 'age'.

This was a project in which an entire technological system was developed and
implemented for the purposes of learning and commercial interest. A trial is unique
in that it is intended to directly interface technological development with semi-
naturalistic, real world contexts - people, homes, streets, and towns. The system and
its characteristics, features and functionalities were appraised in terms of their
operation, usefulness and usability (and other use dimensions) by those who
designed and produced the technology, content, interfaces, and communications
infrastructure, and eventually, those who were held as models of consumer-users -
the trial participants. The success of this experiment remains open to interpretation. 444

In this final chapter, I wish to recapitulate points from the various chapters, drawing
them together to explain where I view contexts of use as making a pertinent
contribution to theories of technical innovation and innovation of use. I wish also to
make suggestions regarding the pragmatic potential of this in linking understanding
of the use process to issues facing firms involved in producing new innovations,
particularly those such as TV-centric networked technologies and content destined
for domestic locations and use. I place CU in relation to sociotechnical constituencies.
The blending of the two perspectives, I believe, offers a way to bridge the gap between
'cultures of production' and 'cultures of use', contributing both to innovation theory
and to laying the grounds for developing a useful framework through which managers
can map the contexts and environments of use to wider macro-level constraints,
opportunities and influences.

Enlarging usability as a research project

One of the original projects of this work was to enlarge upon usability as a desirable
product quality. To do so would attempt to accommodate the vast array of exigencies
contributing to its realisation and evaluation - including the way in which an
individual's experience of usability may change over time.

As indicated in chapter 4, usability engineering and testing as a research practice has
traditionally been a lab-based activity and, as such, has relied heavily upon controlled
experimental methods and quantitative means of analysis. It has often combined
questionnaire administration with system-logging and participant observation in
order to evaluate time to completion of certain given tasks, where manipulation of
certain interface elements provides an independent variable.

444
While the intention of the Cambridge i-Tv becoming a mass technology remained unfulfilled, innovation
of the Om contribution - the STB - continued. The current generation is STB 4.

The results of the test,
combined with self-reports and analysis of user behaviour during the test, inform
recommendations made to the product developers regarding iterations on the
original design. Dumas and Redish (1993) reiterate this when they claim that the
"specific instances that you see in a usability test are more often symptoms of
broader and deeper global problems with both the product and the process." (p.32) In
essence, starting with standard usability lab-style tests highlighted within the present
study that, while the usability of the stand-alone demo box was 'good', a range of
other issues surfaced in the interactions with subjects. These included issues
regarding their thoughts and feelings about television, its content and the practice of
viewing; their opinions about the content available on the demo box; what they
imagined i-Tv to be; and how this compared with what they were apprehending. This
drove a need to approach the main study in a more open-ended style, one
which would understand the usability of the system in the context of an individual's
perceptions and understandings.
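
By way of illustration only, the kind of quantitative output such lab-based tests turn upon - mean time-to-completion under each manipulated interface variant, with the variant as the independent variable - can be sketched as follows. The variants, task timings and participant numbers are invented for the example and are not data from the demo-box tests.

# An illustrative sketch (in Python) of a basic lab-test summary:
# time-to-completion per hypothetical interface variant.
from statistics import mean, stdev

# seconds taken by each participant to complete the same task
# under two invented interface variants
completion_times = {
    "menu_layout_A": [48.2, 52.7, 61.0, 45.9, 58.3],
    "menu_layout_B": [39.4, 41.8, 44.1, 37.2, 46.5],
}

for variant, times in completion_times.items():
    print(f"{variant}: mean {mean(times):.1f}s, "
          f"sd {stdev(times):.1f}s, n={len(times)}")

Such a summary captures what the lab test measures well; what it cannot capture, as argued above, is how the same system is apprehended once it is embedded in domestic routines.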

Situating usability as company practice

This brings us to a second strand of development in the originally proposed Ph.D.
study: that of situating usability as company practice. This would consider
usability testing and its results as one of a number of competing knowledge flows
within the real-world, real-time, pressurised product development process. Grudin
(1991) noted that there are many influences which compromise usability
programmes within product development in large organisations. He and others (such
as Norman, 1993; Redish, 1989) suggest that management, 'traditional' development
processes and organisational structure can hinder the incorporation of usability
programmes. Indeed, such hindrance can be an ingrained aspect of company culture
and routines. Redish and Selzer (1985) point out that two sets of costs, which they
described as 'test it now' and 'fix it later', may not come under the same budget, with
departments (i.e. R&D and customer service) battling it out for allocations.
causes a rift within the organisation with respect to incorporating usability: "The
manager who must get the manual to the printer on a certain schedule and within a
certain slot is not responsible for whatever havoc the manual might cause later on."
(p.51) Internal politics and power most definitely enter into the equation of how
effective usability testing may be, and confuse and make complex any idea of
usability delivering a linear result (i.e. a better and more usable product).

Understanding of such problems has driven the creation of usability departments
independent of R&D and customer service departments. This ensures the place of
usability within the organisation and its product development process, and helps in
maintaining objectivity in the evaluation of a product's functionality. Usability
becomes a discrete company function. However, Wixon and Comstock (1994)
pointed out that testing often involved the setting of goals and tasks by the product
development team, which were then used in tests by the usability team or department,
who would subsequently return the user feedback to product developers.

While this is admirable in maintaining objectivity and scientific rigour in the
application of the research - bearing in mind that developers and designers
conducting tests themselves would often consciously or unconsciously influence
results and their interpretations according to their own subjectified, sometimes
impassioned view of the product - it does, however, raise further questions regarding
the successful organisational positioning of a usability study.

In the first instance, questions have been raised with respect to usability testing
slowing down the product development process. Regarding the situated use of a
product within naturalised settings, a number of usability experts (most notably
Wixon et al., 1990) began to consider the wider contexts of use, with particular
regard to work environments. As I have argued in the book, usability (and the other
use parameters) of domestic information and entertainment appliances depend on a
much wider set of social and perceptual conditions than the relation between a
product's functional qualities and human response time in performing set tasks.

Simply because a product merits the attribute of good usability by being tested in a
scientifically rigorous way, this does not mean that it will perform that way in the
marketplace (Wixon and Comstock, 1994). Also, while it may indeed add to the
value and accessibility of a product across the greatest range of people's individual
styles and modes of use, it does not guarantee desirability and usefulness. Conversely, perhaps the least
scientific and rigorous approach for companies deriving knowledge of their product
in relation to perceived consumers - concept testing (i.e. Iuso, 1975) - is open to vast
discrepancies between reported and actual product perceptions and consumer
behaviours (Sands, 1980; Tauber, 1975, 1977, 1981).445

I wish to argue that there exists a rationalistically bounded perspective of the
consumer-user by firms, largely created by the employment of various tools and
research methods which filter and accent aspects of the user and their use process.
Such filtration and accenting corresponds to the routine way in which the firm
conducts its consumer-user research. It can also be in answer to changing corporate
and industrial climates and philosophies where issues such as 'customer focus' and
'continuous improvement' (TQM, Kaizen, etc.) evolve in importance, or in response
to tools, methods and procedures. These may rise and fall in popularity for
enhancing production, market success, and consumer satisfaction (usability
engineering, QFD, etc.).

I am not alone in adopting such a focus. Macroergonomics (Hendrick, 1991, 1995) is
a more recent contribution making a concerted effort to place the ergonomics of
machines and interfaces at the centre of a more holistic, integrated and systemic view
of development. It is a top-down approach to system design based on a
sociotechnical system perspective. Macroergonomics focuses on the optimisation of
work system design through consideration of relevant personnel, technological, and
environmental variables and their interactions. Hendrick (1995) further defines
macroergonomics as the "third generation" of ergonomics, where the first generation
was characterised by human/machine interface technology, and the second
generation by user/interface technology. The goal of macroergonomics is a fully
harmonised work system at both the macro-and micro-ergonomic levels.

445
“Concept tests and product tests do not work . . .The limitations of concept and product tests as
predictors of subsequent sales are borne out by empirical investigations and literature surveys . . .
Furthermore, they are generally of value only in the case of continuous products where consumers are
well aware of standard attributes and functions; for discontinuous innovations they are unlikely even
to predict trial.” (Tauber, 1981: p.182)

Again, while macroergonomics may work to place usability as a relevant quality
within the social and technical settings of the workplace, there remain questions
regarding the experiential qualities of usability. This relates much more strongly to
the processes of everyday consumption and integration of technical products into
lives and lifestyles, into the private and personal space of the home. Indeed, one
could argue that one can expect to use [sometimes complicated] machines in the
workplace, whereas the legacy left by domestic machines and technologies – light
switch, vacuum cleaner, electric clock, fridge, television – is one of simplicity in
operation. Silverstone's notion of 'domestication' - which, along with the notion of the
everyday practices and consciousness, has formed a recurrent theme in this book -
suggests the more personalised, possession-style rituals associated with learning to
use and live with ever more complex technologies (PCs, hi-fi separates, home
cinema – all of which require some degree of installation).

Recontextualising

In all cases there exist practices which permit managers, designers and marketers to
augment (and authenticate) their reflexive notions of what is happening outwith the
world of the workplace. A central view of the book is that usability, as a research
project conducted by firms, should be recontextualised into design-producers'
worldviews as a naturalised part of the overall use process.

This would shift its emphasis from being an index of how accessible and usable a
technology is with respect to the intentionality of the designer or producer, to the
experiential qualities of use from the consumer-user's perspective. This entails a
process of understanding the designer-producer's intentionality as much as
understanding the consumer-user's interpretation of the product, and therefore of that
intentionality. Design and use is a dialogic process, as is production and
consumption. Under this rubric of 'naturalised' I wish to include the fact that the
product of such studies - the knowledge produced by the research - must locate
within the company at its most useful points, with those who may best use it to
inform strategy and influence design and marketing. Further, it must communicate
ecologically the data which are relevant in order to be effective in the
implementation of design features, functionality and attributes.

However, what often stands in the way of this, and what was clearly evident in the
case presented in the book, is that the methods of understanding use and users
constitute an identifiable and bounded system of social interaction between the firms
involved (as well as their partners in alliances) and its consumer-users (or trialists).
Arguably this is intermediated by other 'cultures,' such as those comprising du Gay's
et al's 'cultural circuit', but also within 'sub-cultures' at the intra-firm level, say,
between departments and functions. In the case of this study, the PSPs featured
strongly as one of these 'cultures'. The company itself effectively forms a social sub-
culture within the wider context of the parent culture of the society within which it
operates. Companies and their products also form spheres of negotiation
between various individuals and company cultures, a process which can lead to
contested, negotiated, ineffective, and erroneous reifications of use and the user.

This process leads inevitably to the creation of 'cultural distance' between the users
of a system, and those who would analyse or design it. It is this notion of a cultural
'distance' which provides the motivation for employment of techniques and ideas
from outside disciplines, those particularly concerned with appreciating and
understanding the variations in meaning and concepts between different cultures -
interpretative approaches such as ethnography, naturalistic inquiry or hermeneutics, for
example. The product of consumer-user research has often been validated only by the
criterion that it is implemented - and therefore considered to be ultimately
interpretable - by managers (as Holbrook, 1995 puts it 'managerially relevant'), and
useful to the processes of invoking strategies and policy, making decisions and
reducing risk. But there is a place to inform designers and marketers directly, to
engage them to understand tacit elements of use and usage.

Beginning with the actual consumer-user and their complex of behaviours, attitudes
etc., the first stage of reification is appropriating or conducting research. The process
of selecting research reports, commissioning a study, or setting one up is based on

elements such as a client's or manager's brief, reflexive considerations of the
problem, training and knowledge of the field (which includes knowledge of
methodology, tools, previous research and so on). There then follow further stages
of reification, through processes of presentation, distribution, negotiation and re-
interpretation before knowledge of the user and use is embodied in the product
manifesting as particular characteristics, features and functionalities (see below).

[Figure: a diagram linking The User, Research, Design, Product Development & Marketing, and Manufacturing; individual and negotiated interpretations of presented knowledge across company functions feed the final embodiment of the user image manifested in product CAFFs.]

Fig 8.1 Research and its interpretation as provider and filter of knowledge of the user-consumer

Impact of discourse on technology development and perceptions

In chapter 1, I suggested that anticipations of technologies, and their potentials,
extend further than situated and actualised circumstances of use. The discourse that
accompanies technologies can influence perceptions of whether they are a 'good' or a
'bad' thing for society, perceptions regarding personal use and consumption, or even
whether they are worth participating in a trial.446

For instance, the moral panic which suggests that, through using computer-based
technologies to communicate, we diminish our need for person-to-person
interaction can carry weight in the shaping of markets.447 As TV-centric
technologies permit online shopping and the ordering of goods, there is less incentive
to go to the shops or to the video store to rent videos. Television-using time is
displaced by games console playing time. The number of people that it is possible
to 'meet' online rises, while the quality of the interactions is often diminished or
taken out of their human context. Sobchack (1996) suggests the intertextuality of
expectations between domestic technologies. In a world socialised into timeshifting
and speed of process, our expectations rise with respect to performance, transferring
such expectations onto other technologies - she offers an example of the child who
suggests that her mother 'fast-forwards' the 'slow' microwave.

446
Phase two trial participants were to comprise employees of the technology partners. However, the ‘rumour’ that spread regarding the inability of the Cambridge system to deliver useful content was understood to have played a significant role in the subsequent lack of interest shown by consortium members in participating in the trial.
447
Indeed, I was consulted during a visit to Om regarding the ‘video violence’ debate and its potential spill-over to i-Tv.

In my own pilot study there seemed to be consensus in the more 'middle-class'
households that 'use' of printed material provided a superior, more
'reputable' source of entertainment and information than the use of television. This
was echoed in one of the case studies (case study 5) where the parents had strict
household codes with respect to television use. How much are such attitudes
influenced by the recurrent theme of 'video violence' which appears regularly in the
headlines of newspapers? How much are they reinforced by the formidable support
from research that claims that the viewing of violent material propagates violent
feelings and behaviour in individuals? How do such discourses and attitudes impact
diffusion? In which way do they, and in which way can they, restrict and otherwise
shape consumption and use? These questions, while outside the scope of the present
study, remain of interest to the notion of how discourse shapes perception of
television and its programming.

In the first two chapters, I made the suggestion that many of the new and emerging
digital technologies, such as those whose primary feature is making television
interactive, suffer from a kind of 'interpretive flexibility' or 'descriptive confusion'. 448
Coming under the auspices of interactive television, enhanced television, augmented
television and various other rubrics, digital technologies intended to bring online
interactive services to the home have different meanings to different people - they are
polysemic - their interpretation depends on the interests, intentions, and knowledge
of their user-consumer-audiences.

Interaction, while capable of being decomposed and described in both technical
and content terms, nevertheless does not necessarily relate to the user
experience. It is as polysemic (perhaps to an even more pronounced degree)
as content material taken as an individual component of an overall
experience of use. Interaction, with 'interactive' media, like social 'interaction',
is for the large part tacit compared to the experience or outcome of searching
for, finding and viewing content. It is rather like the 'journey' to some
'destination'. Someone may ask you how your trip was, but it is unlikely that
they will ask how the steering on the car handled, unless they had some vested
interest - such as the car being their own. But this is the case in design
evaluation, where one seeks out what may appear sometimes very subtle details
regarding the use and user experience. Attempts are made to unpack what is, in
effect, a 'whole' experience which may include characteristics of the interface,
the content information, graphical design, ease of use, the brand associations
with advertised goods, delivery and fulfilment, and cost. All combine to deliver
value or usefulness.

448
‘Descriptive confusion’ was originally identified as an issue which Bricken (1991: p.4) applied to virtual reality (VR): “VR is seeking definition, it could be anything from email to a fully surrounding, multi-sensory environment. We are struggling with appropriate comparisons.”

User innovation

This public domain comprises not only those who will consume and use
technology but also those who will innovate upon them and innovate upon their use.
Coming from the social constructivist perspective on technology development,
Westrum (1991: p.238) suggests that: "A device may appear to be something
different to different groups interested in exploiting it . . . it may evolve in quite
different directions." Further, as understood by the cultural circuit, actor-networks
and sociotechnical constituencies, the 'public domain' also includes those who will
regulate technologies as well as create legislation regarding their production and use.
To capture how users and others may reinterpret use and meaning in technologies
can have significant implications for designers and producers (Von Hippel, 1990,
1996; Fleck, 1994). Also, most importantly, the public includes creators and
producers of other, often related technologies which create the conditions and
environments (such as competition, cheaper network access, and so on) whereby the
new technologies flourish. All come to experience the product, interact with it, in
different ways within different frames of reference.

As much as usability exists only as a single element within the human
comprehension and experience of technology (along with situational circumstances
of use, the development of usage patterns, and development of value and usefulness),
its importance and meaning to the firm only finds place or makes sense within

broader business, social, institutional, ideological and cultural frameworks. It is
reasonably safe to say that the rise in the awareness of usability and user-centred
design as a contributing factor in the success of products and services, parallels the
movements in management practice towards more customer-orientated approaches
(Wiklund, 1994). Under such management climates any tool or set of techniques that
promise greater customer satisfaction may be incorporated into product/service
development processes (for instance, Walsh et al. 1997).

However, this intersubjective creation of the consumer-user by a firm and its
partners, distilled through research and presentation processes, is only an element
which informs design and marketing of a product as only one part of a complex of influences. Design is
never free as it is always constrained by influences beyond the tools, knowledge and
thinking that enable it. It must always conform and meet, yet it must also extend. It
is a fundamental and constant human activity, entailing far more than has been
suggested by limited definitions of design as the optimal use of available resources or
as some sort of index of aesthetic merit. Cooper and Press (1995) suggest that the
contemporary nature of design as a process and practice is indeed ambiguous:
"Design can be conceived from being an individual activity such as designing
a chair, through to a corporate planning process that regulates innovation to
meet market demands." (p.42)

So one may design technologies, one may design an organisational structure,
services, research, an interface, or even design for experience.449 Each of these design
facets and challenges is to be identified within the Cambridge Trial. This element has to be
finally combined with other influences into the design process (such as costs,
recognition of standards and so on). I shall return to this aspect in greater detail later
where I suggest the relation of CU to wider constituencies of social and technical
influences. Here I wish to suggest that it is the mode of inquiry and the mode of
explanation of the basic research which defines and colours the initial 'virtualisation'
of the consumer-user in the process (above and beyond the initial reflexive notions
held by the product instigators).

449
More than the obvious design of the physical and graphical interfaces, the architectural layout of the offices, desks and meeting rooms of the new Online Media (Om – the firm which serves as the central focus in the case study presented later) HQ was designed by the Acorn Chief Scientist. Between her ‘usual’ work in advanced chip set design for the first and second set-top boxes (which pre-empted all other development work), she occupied herself in designing the layout of space and personnel. Om also laid down the organisational design and structure for the entire trial, even though, as a consortium activity, ‘ownership’ of the trial was often dependent on which partner one asked (for instance Cambridge Cable publicity suggested the trial as their own). Also most noticeable on visits to the building was the succession of makeshift charts adorning the walls.

Traditionally, as in large-scale market surveys, this takes the form of aggregation
statistics of some sort, providing some insight into 'generic' consumer-user
preferences. These data may be combined with psychographics or other
complementary studies which lead to some notion of a 'preferred reading' of the
consumer-user, favourable or perhaps dismissive of some suggested strategic move
regarding new product development (see below).
Table 8.1 Data, information and knowledge regarding the use process

Subjective/objective character –
use: subjective/objective; usability: can be fairly objective; usage: objective; usefulness: objective/subjective.

Subjective –
use: motivation or attraction to use, circumstances and location of use, service/s chosen; usability: experience of use; usage: time to complete task, satisfy curiosity, or achieve entertainment satisfaction; usefulness: value outside of immediate use, gratification.

Objective –
use: who, where, what, and why; usability: how easy to use/understand; usage: what, when, and how much; usefulness: what, how much, and how long.

Research methods –
use: system tracking, participant observation, self-report through questionnaire survey, interview, diary methods, etc.; usability: usability inspection methods, usability testing, beta testing, etc.; usage: system tracking, participant observation, self-report through questionnaires/interviews; usefulness: self-report, possibly triangulated with use and usage data.

Spatio/temporal –
use: in shop, in home, at work, in school; usability: immediate term, short term, long term; usage: patterns of use, time spent on tasks/programme, relation to extra-use activities; usefulness: period of valued use; immediate term, short term, long term.

Attribute, phenomena or task –
use: demo-psychographics of user, individual differences of user, genre of service, style of presentation, awareness of product and its technological contexts, knowledge of point of sale/demonstration location; usability: ease of navigation, short cuts, ergonomics of physical interface, screen and menu layout; usage: which service watched, for how long, level of interaction, 'zapping' between services, which times watched, regularity of programme/service use; usefulness: value of services to individual, the sustenance of value, perceived from actual value, symbolic value.

Characteristics –
use: individual characteristics, the technology itself (interface, delivery platform, system), place and context of use; usability: design and characteristics of the hardware and software interface, experiential characteristics of the system; usage: individual's position in the household; usefulness: how it compares with perceived options.

Domain –
use: psycho-sociological, psycho-economical, geographical, psychological; usability: techno-psychological; usage: psycho-sociological; usefulness: cultural, psycho-economical.

As suggested from the introduction of this book onwards, the incorporation of
interpretative paradigm research, coupled to enhanced means by which to extract
behavioural data from consumer transactions with systems, alters the previous
horizons through which designers, marketers and managers may authenticate their
perceptions of their product, its use and its consumer-users. It opens up previously
constrained spaces of use knowledge (i.e. domestic activities).

The product and marketing reach into the public domain in general and specific
ways, dependent on distribution and advertising, press reports and so on. Diffusion of
the product into people's lives, analysis of data and subsequent research begins the
process again for the innovation of other products (fig. 8.2 below).

[Figure: a circular flow running from research of the user-consumer, through production, presentation or distribution of the report; negotiations of the implicit image of the user-consumer in the report with the projections and intuitions of designers, managers, partners, etc.; integration and implementation of user information in conjunction with other constraints and influences on design and marketing; distribution mechanisms for the product, advertising and other forms of public knowledge of the product; to the actual user-consumer and their appropriation, consumption and use - which in turn feeds new research.]

Fig. 8.2 A circular model of user knowledge propagation, communication and implementation

The procedure of the above model has been broached by a number of industrial
changes, most notably the incorporation of user-consumer-product 'tools', methods or
techniques, developed in answer to the call for concurrent modes of manufacturing,
higher degrees of agility and faster times to market for products, advertising as point
of sale in digital interfaces, and one-to-one manufacturing or mass customisation.450

Within the book one of the main points which I have drawn attention to is that the
above circular model is threatened by a number of changes. These include:
• Changes in the ontology of audiences, users and consumers.
• Changes in consumer tastes and rationales regarding goods and services.
• Changes in the transactional technologies linking providers of goods, media content and services to users-consumers.
• Wider changes to the business environment resulting from the direct or indirect digitisation of business and economics.

As with other cultural products and their creation, dependent on wider societal shifts
in emphasis and interest, research methods in disciplines such as psychology have
evolved to ask new questions, asked and answered in new ways (Smith et al., 1995: p.1).

450
An agile company has been defined by Goldman et al. (1995) as one that is capable of operating profitably in a competitive environment of continually, and unpredictably, changing customer opportunities.

Also within the business world, analysts are increasingly dissatisfied with using
simple aggregated market statistics that reveal little about the underlying
sociocultural dynamics that affect acceptance of advanced technical systems (i.e.
Gonzalez, 1997).

However, as explored earlier, according to the objectivist, neo-positivist or
hypothetico-deductive view, authentic knowledge is acquired when the "subjective" -
emotional, aesthetic, moral, and religious - elements in knowing are strictly and
rigorously eliminated. Such elements, it is held, taint or distort knowing and the
derivative knowledge by introducing elements of ambiguity and commitment. A
large part of the project in what has come to be regarded as the 'dominant' view in
research method has been to eliminate extraneous influences from the object of the
research other than those introduced or known by the researcher. If not isolated, these
'extra-scientific', subjective elements would determine the foundational paradigm
upon which science is erected and leave science in the realm of opinion and
'rationalised superstition' (Gergen, 1973).

In the standard view, it becomes the purpose of a rigorous scientific method to ensure
that knowledge is obtained by 'objective' and 'verifiable' means. The ultimate object
is attaining knowledge of the particular features or functions of the object of study,
and to infer causes and predict effects. In the grading of levels of knowledge
according to their degree of objectivity, the most certain knowledge is that which has the
smallest amount of personal contamination. Mathematics and physics would be pure
sciences in this view - while knowledge is most problematic where the personal
element is greatest - arts, literature, philosophy, as well as in those 'complex' sciences
such as biology or psychology, and moreover in the social sciences. These, however,
are also the domains most often associated with the propagation of culture, and they shape
the values and meanings people have for things in the world.

With its roots in the natural sciences, where properties are deemed to lend
themselves more to measurement and control, positivist approaches have been
exposed to a number of critiques (i.e. in psychology in the early 70s with Gergen,

1973; Harré and Secord, 1972; and Shotter, 1975). Commentators manifested
discontent with positivist method, and the exclusive dominance of quantification.
Harré and Secord (1972) for instance were concerned with the mechanistic model of
human beings which academic psychology seemed to subscribe to. These were
derived, as was suggested in chapter 4, from the rise of behaviourism as the regnant
force in psychology. 451 The popularity of behaviourism and 'big project' sociology,
and other disciplines such as anthropometrics and ergonomics (described in chapter
3), co-evolved with the massification of industry, the mass media, and urbanisation.
It provided answers to a society increasingly reliant on science to solve problems
derived from vast changes in lifestyle and living conditions. But commentators on
consumption (e.g. Firat et al., 1995) view reversal in the roles of production and
consumption as a distinguishing feature of recent society. Production loses its
privileged status in culture to consumption, which has become the means by which
individuals define their self-images for themselves as well as others. This clearly
illustrates the sovereignty of the consumer in processes of selection, use and re-use,
which will impact directly or indirectly upon production.

But the design of technologies is often optimised in various ways according to
intended or 'aimed use' within the specific environments of factory, workshop, office,
kitchen, living room, garden etc. They form an integral part of the multiple flows and
networks of activities and energies which constitute and sustain the space we call
'home' and most importantly, its relevance to everyday life and activity. But also in
the case of technologies, and as was the example in the story related by Akiko Busch
in Geography of Home, it is often only when technologies break down that they, their
use, and their function, come to conscious awareness and focus.452 Then, and often
only then, do we realise their relevance and integral nature in sustaining the smooth
451
The rise of behaviourist thinking has been linked socio-culturally to the urbanisation and
industrialisation of America. Bakan (1966) sees that these social developments created motivations
towards mastery of the ‘incomprehensible and worrisome strangers all around us’ - exactly what the
scientific claims of behaviourism promised to help us do. Indeed studies and literature promoting
‘new paradigm’ research have often appeared critical and defensive when confronting the
‘conservative’ and ‘domineering’ stance of quantitative ‘hegemony’. Quantitative method is often
viewed as the ‘received view’ (Coolican, 1990).
452
I have already mentioned Winograd and Flores (1986) citing Heidegger when he speaks of breakdown to represent this phenomenon. This bringing to conscious awareness of things, objects and thoughts, through their 'lack of fit' or 'lack of flow' in the course of purposeful use, is an important concept for the functioning, as well as the evaluation, of technologies.

running of domestic affairs. Most relevant to the story is the way in which addressing
breakdown led to reconceptualisation - a taking stock of the contexts or the 'bigger
picture'. Busch reflects on her tale of her faulty septic tank pump thus:
"While I reject categorically the idea that my house (or any other) has a
personality, I have come to realize that it does have a language of its own,
one that includes not only the slight sounds, hums, and vibrations of all
electrical appliances that keep it going, but a host of other interior systems, a
network of social and cultural currents, those habits, beliefs, and values that
also make it function. And I realize, too, that it is by being attuned to all these
systems that we might arrive at some genuine understanding of what it is that
gives us power to the places we live." (Busch, 1999: p.163)

She alludes here to the home as a seething, living complex of elements and dynamic
flows, as an exchange of energies and industry. As such she hints at how the home,
the domestic, its contents, activities and technologies represent a whole, a dynamic
open system, and a special case for study. While her example hints at an animistic
view of the home, the home considered as a living entity, she does not promote a
blurring between human and non-human elements (as is promoted in actor-
networks). Instead she shows how they fuse and mesh in a meaningful way, which leads
to an enriched view of the environment in which she lived. But Henri Lefebvre
(1992) warns against taking such kinds of metaphors too seriously lest one abstract
too much from properly comprehending the lived experience of things:
"Consider a house, and a street, for example. The house has six storeys and
an air of stability about it. One might see it as the epitome of immovability,
with its concrete and its stark, cold and rigid outlines . . . Now a critical
analysis would doubtless destroy the appearance of solidity of this house,
stripping it, as it were, of its concrete walls, which are glorified screens, and
uncovering a very different picture . . . permeated from every direction by
streams of energy which run in and out of it by every imaginable route: water,
gas, electricity, telephone lines, radio and television signals, and so on. Its
range of immobility would then be replaced by an image of a complex of
mobilities, a nexus of in and out conduits . . . the occupants of the house
perceive, receive and manipulate the energies which house itself consumes on
a massive scale. Comparable observations, of course, might be made apropos
of the whole street, a network of ducts constituting a structure, having a
global form, fulfilling functions, and so on . . . The error – or illusion –
generated here consists in the fact that, when social space is placed beyond
our range of vision in this way, its practical character vanishes and it is
transformed in philosophical fashion into a kind of absolute. In face of this
fetishized abstraction, 'users' spontaneously turn themselves, their presence,
their 'lived experience' and their bodies into abstractions too." (pp.92-93)

As tools of analysis, sociotechnical constituencies and actor-network theory
attempt to portray why a technology 'is as it is' at a certain period within its
development cycle and how it reached this stage, relying on some
understanding of where its origins lie, in both its technological and social contexts.
But they too speak in metaphors, as do those who speak of the
acceptance and domestication of products. But as Ulrich Neisser indicated in
Cognition and Reality (1976), we do not perceive things in themselves – i.e. pure
stimulus - but the meaning things have for us. Meanings are attributed to things by a
range of individuals and institutions, and so they are conditioned by a complex of
influences including the symbolic attributes of the product group (i.e. the status
attributed to certain models of cars or watches). These symbolic elements are
afforded sense through the propagation of myths - utopian or dystopian discourses,
marketing and PR, and moral panics (via marketing and/or press reports). These
always accompany products in their diffusion through to consumption and use.
Further, there are the tensions and resolutions regarding how these are situated within
the pre-existing networks of social and technical relations and use functions within
the household, as well as lives, lifestyles and everyday practices of potential
consumer-users.

Mechanistic models of people

“Trials can take on something of a life of their own, losing their attributes of
being a set of highly controlled conditions, becoming instead a preoccupation
and severe drain on resources. The paper suggests that problems may begin
early in projects, where there is a lack of clear-cut knowledge objectives and
where the reification of 'users' represents little more than collateral for
winning support for projects. Later, 'users' come to be viewed as intelligent
parts of the system. Here they are 'designed' to provide various kinds of
feedback regarding technology, interface and service characteristics. This
suggests a narrow, unnatural and flawed view of use and users, detracting
from the key relevancy they play in providing learning on trials.” (Nicoll,
2000)

Mechanistic models of people - as individuals or groups - arose with the
Enlightenment and the development of neo-positivist ways of conceiving of the
human and the social. This culminated in the rise of mass systems of media and
production, which fuelled notions of 'the' subject, 'the' user, 'the' consumer, and 'the'
audience. There are dubious epistemological assumptions implicit in these maxims -
there is a lot more to the story than 'coming to know X', where X is a generalised,
stereotyped user, consumer or member of the audience. They are often taken to be
'cybernetic' beings where particular inputs will produce certain outcomes. Such
images of the user or customer base have long been popular in HCI, usability,
television ratings, and the selling and buying of goods and services. It is generally
assumed that knowing the user, audience or consumer consists in developing
familiarity with the prospective consumer-user's task domain. This is achieved
through modelling it in some way, providing the designer, programme scheduler and
marketer with a handy (usually quantitative or measured) characterisation from
which requirements can be derived and needs identified.

New schools of practice and thought are adding new dimensions to this idea: one of
these is participatory design - otherwise known as the 'Scandinavian Approach'.
Although this method is considered to have drawbacks resulting from the
introduction of users into the design process - the added complication of extra human
relations to manage, possibly taxing the social skills of the design team; the risk of
'group-think', and so forth - the benefits which can accrue as a result of more closely
integrating the users and their knowledge of their environment are such that the
approach has increasing legislative force supporting its use. One critic, however, cautions:

Under the name of the widely spread “participatory design” method, I have
seen inadequate objects designed with users, or by following users’ wishes
too far. Participatory design when used by a sensitive and experienced
designer can be a useful help; in the hands of an inexperienced designer it can
provide the mistaken feeling of being right, even when one is not.
Participatory design is viewed by many as the Ferrari of the user focused
design research methods. A Ferrari, however, would not help a blind person
drive. It will not even help an ordinary driver. The priorities of a project
involve choices, choices are judgment calls, and judgment involves assigning
value to information, an operation that is too complex and varied to be
definable by a series of steps. (Frascara and Winkler, 2008: p.11)453
453
Jorge Frascara and Dietmar Winkler, 'On Design Research', Design Research Quarterly, 3:3, July 2008.
As far as techniques in HCI have been concerned, this is far from the end of the
story. In her evolving map of design research methods, Sanders (2002, 2006,
2008)454 represents the range of attitudes towards human-oriented design, from the
expert mindset to the participatory mindset, in both research-led and design-led
inquiries. In the more traditional expert approach, such as in human factors (also
called ergonomics), she says design researchers see themselves as experts and they
see the people they are researching (and designing for) as “subjects,” “users,”
“consumers,” etc. The Scandinavian-inspired participatory approach, by contrast, sees
design researchers collaborating with the people who are being served by design as
co-creators in the process (2008: pp.13-15).

Other human-centered research areas that fit into her model include “design and
emotion” that investigates people’s emotional interactions with products (Design &
Emotion, 2009; Green & Jordan, 2002) 455 and “experience design” that focuses on the
relationship between people and their experiences with products, services, events,
and environments (User Experience, 2009; Woo,2007: 3-4) 456. Brenda Laurel’s book,
Design Research: Methods and Perspectives (2003) is one of many that present
human-oriented methods adapted “from the applied social and behavioral sciences
and/or from engineering: human factors and ergonomics, applied ethnography, and
usability testing” (Holtzblatt & Beyer, 1998; Laurel, 2003; Sanders, 2008: p.14)457.

454
Sanders, E. (2006). Design Research in 2006. Design Research Quarterly, 1(1), 1-8.
Sanders, E., B.-N. (2002). From User-Centered to Participatory Approaches. In J. Frascara (Ed.),
Design and the Social Sciences: making connections (pp. 1-8). London and New York: Taylor
& Francis.
Sanders, L. (2008). An Evolving Map of Design Practice and Design Research. Interactions, 15(6),
13-17.
455
Design & Emotion, S. (2009). Knowledge Base., source:
http://www.designandemotion.org/society/knowledge_base/ Retrieved June 25, 2009
Green, W., & Jordan, P. W. (2002). Pleasure with Products: Beyond Usability. London and New York: Taylor and Francis.
456
User Experience, N. (2009). Retrieved June 25, 2009, from http://uxnet.org/
Woo, H. R. (2007). A Holistic Experiential Approach to Design Innovation. Paper presented at the
International Association of Societies of Design Research Retrieved June 18, 2009, from
http://www.sd.polyu.edu.hk/iasdr/proceeding/html/sch_day2.htm
457
Holtzblatt, K., & Beyer, H. (1998). Contextual Design. San Francisco: Morgan Kaufmann Publishers, Inc.
'Creating the user' considers how the notion and role of 'user', 'consumer', viewer,
subscriber, subject, customer collapse in the light of new media development. The
user, within the course of initial product development has been for the large part little
more than an 'unknown other', someone who will take into their lives and home a
technology which they will find benefit and value in using. Market success used to
be the main illustration of demand for certain product classes. However, in the more
post-modern marketplace much more radical innovations have perplexed quality
processes such as quality function deployment (QFD) in their efforts to align the
'voice of the customer' with the 'ear of the engineer'.

Quite clearly, images of the 'research subject', the 'user' and their use process,
and the 'consumer' may be constructed through the projections of researchers,
designers and managers, where they attribute behaviours and attitudes onto
their users based on their own lifeworld experiences.

They may also be based on more formalised renditions of how people are, derived from
scientific, marketing, usability and consumer research. With respect to research,
Poole and McPhee (1985) see that:
" . . . it is easy for researchers to impose their own constructs and models on
subjects, substituting observers' insights for actors' processes and
understandings. The substitution often occurs unaware, because researchers
take social scientific constructs for granted and do not consider that they only
reflect professional discourse and not the reality of subjects . . . for example is
the way we conceptualise relational control consistent with how subjects see
control issues? Are the statements we call dominant actually seen as such by
subjects? Are subjects even concerned with control in day-to-day
interaction?" (p.130)

This draws attention to another point of focus regarding the circular model of user-
knowledge-implementation referred to earlier. The intersubjective creation of the
consumer-user by a firm and its partners, distilled through research and presentation
processes, is only a single element informing design and marketing of a product. It
has to be finally combined with other influences into the design process (such as
costs, recognition of standards and so on). I shall return to this aspect in greater detail
later. Here I wish to suggest that it is the mode of inquiry and the mode of
explanation of the basic research which defines and colours the initial 'virtualisation'
of the consumer-user.

People's lives, lifestyles, language and system-logging

Language lies at the very heart of interpretivism - interpretation is a quintessentially
linguistic phenomenon. Interpretivism, as contrasted with rationalism, may be
characterised by an awareness that nothing comes to us as an absolute 'given'.458
Understanding and awareness occur against a much wider background. Such
processes invariably comprise a gamut of communication techniques and styles
and symbolic manipulation. The casual chats, the more targeted and practised sales
pitch, the various marketing and consumer research practices are each acts of
communication which develop learning and knowledge. Rogers (1983) defined
diffusion as the process by which: " . . . an innovation is communicated through
certain channels over time among the members of a social system." (p. 5) John Law,
Michel Callon (i.e. Callon, 1980, 1986; Bijker and Law, 1992; Latour, 1987), and
others such as Doheny-Farina (1992) have also drawn distinct attention to the way in
which rhetoric plays an integral part in technology development. There is a particular
power in this proposition when one considers the instance of 'high-tech' products
such as consumer electronics. Here, there is often heavy reliance upon pronounced
claims of improved or superior technical performance and/or specifications. 459 The
high-tech digital age builds upon a manifest desire on behalf of both industry and
consumers for things to be made smaller, more powerful, faster, louder etc., and for
these specifications and features to be made public knowledge (i.e. publicising

458
In as much as rationalism is taken to imply the notion of a priori truths. It could be claimed that the
obvious "contrast" to such a stance is (by definition) empiricism.
459
Consider the placards and stickers upon devices on retail shop shelves. There is often an exhaustive
list of features and specifications labelled on a device. A particular example is the configuration of
PCs. It is widely held that the ever more impressive graphics features of computer games are largely
responsible for driving the development, and a matching consumer need, for the enhanced video
features of graphics cards. Adding enhanced components to a system can often highlight a more
general core weakness in the overall system’s performance. In this example, this can lay the grounds
for the need of other enhanced components, or indeed an entirely new machine.

specifications for the purposes of attracting buyers, or conspicuous exposure of high-
tech possessions).460

For the large part, drawing upon thinking derived from social studies of technology, I
propose 'interaction' to be a point of convergence where different sets of exigencies
converge and exchange: between various social groups, and between the needs,
requirements and goals of individuals and the characteristics, attributes, features,
and functionalities of products.461 For instance, how close is the fit between something
that is created to be useful but also for profit? What is the difference between the industry
of design and the industry of use? Between something that is found to be useful and something
that is also valued?

Some social perspectives take the position that technologies in general are "neutral",
which means basically that they are seen as "value content free" (Feenberg, 1991:
p.6), neither good nor bad in themselves but which may be used well or badly
depending upon who controls them. Thus, according to Arnold Pacey, who supports
this idea, when technology fails or when it has negative consequences, the cause is
not the technology but the improper use of it by: "politicians, the military, big
business, and others." (Pacey, 1992: p.2) The relevance of such positions may have
validity upon policy-making levels, but they tend to act towards a neglect of
technological impacts, and more importantly the co-shaping that constitutes
successful diffusion – something which is not institutional, but ultimately individual
(Miller, 1995).

For instance there are no 'general-purpose' tools. Technologies have specific features

460
However, this is also a time when products are arriving in the high street unfinished. Whereas it has become common practice to release beta versions of software commercially, using consumers to indicate flaws in the software version via help-desks, hardware devices are now appearing for sale which show functional discrepancies with the claims made in the sales literature.
461
I take for the large part the problem proposed by Manzini (1990: p.70; quoted in Cova and
Svanfeldt, 1992: p.308) when he speaks of the problem of existing market research to identify issues
of new products:
“A market study . . . can only photograph reality, bringing forward what is, in a
way, already obvious. It cannot show the detail, the meeting point between what the
general public might want (but has yet to find a way of expressing it) and what the
producers might offer (but have not yet found an expressional support for) and what
constitutes the idea of new product.”

which do not pre-ordain, but certainly predicate, certain uses. Technologies are
intrinsically biased towards characteristics, features, and functions. Technology
interacts with other factors (for instance economics, political views, etc.) in a system,
shaping attributes and being shaped by them. Technologies may have influences on
us at macro-, meso- and microsocial levels.

This feature of interpretivism may well account for the popularity of continental
existential philosophy in Information Systems literature and research, in particular
that of Martin Heidegger (who, in turn, tutored Hans-Georg Gadamer, often
considered the father of modern Hermeneutics). Wittgenstein in turn could perhaps
be characterised as interested in the nature of that background; the nature of the rules
and conventions which allow us to interact with one another, and the world about us.

Commentators on science method such as Polanyi (1958) argued that there is an
inescapable and essential personal element that is a structural component of all
knowledge whether the case be physics, biology, medicine, painting, or poetry. This
is the essence of phenomenological method where the emphasis is upon
understanding the 'building blocks' by which the individual constructs meaning. The
social dimension of knowing is retained in our references to the "scientific
community" or "academic community." Essentially, knowledge is thus not private
but social. Socially conveyed knowledge blends with the experience of reality of the
individual. This is an important issue regarding the development of use skills by
individuals, as well as how technologies integrate into everyday life (domestication).

However, as indicated in chapter 4, interpretative research is not an answer in itself
to understanding the nature and behaviours of the interactive user-consumer. Polanyi
(1966) demonstrated that we can know more than we can say. This is a problem for those
whose research approaches rely on eliciting responses from people regarding tacit
behaviours and routines. For instance, Ehn (1988); Bullen and Bennett, (1990); and
more recently Grudin, (1994) with respect to CSCW systems, show that tacit
knowledge continues to play a disturbingly large role in many of the problems
designers struggle with. In my own user research I noted that some people are more

able to articulate their activities than others, or are perhaps more vocal and
opinionated regarding what media, and in particular, what television means to them.
Some people seem naturally more introspective and/or articulate regarding their
thoughts, beliefs and opinions on things. This too can shape and direct responses.

This places an emphasis on awareness, tools, methods and procedures which
can balance feedback from all types of individuals, so as not to predicate the
design and advertising of systems towards those more vocal, or able to articulate
and express their beliefs and experiences.

In the case studies of trial participants, one household (appendix 1: Case Study 2)
could be tagged as non-articulate regarding their impressions and thoughts
of the i-Tv system. However, on closer analysis of pre-existing use and usage of
media technologies (such as timeshifting using the VCR) they were most definitely
among the more sophisticated users. This may be compared with another household
(appendix 1: Case study 5) which could be labelled more articulate and expressive,
whilst remaining among those households who exercised strict rules and regulations
regarding media use and consumption (i.e. electronic media was rated as inferior to
printed media, and the children's viewing was precisely monitored and restricted).

This relationship between interpretivist practices, language, and more contemporary
phenomenology is also worthy of note. Phenomenology is, ostensibly at least,
concerned with the experiences which individuals have of the world about them, and
language has found an increasingly important role in understanding the connections
between the embodied individual, social interaction, and the wider experiential
world.462

Tacit behaviours are of relevance and of interest to marketers, and in chapter 5 I
drew attention to the potentials that interactive media have in tracking user-

462
Particularly in the work of Maurice Merleau-Ponty. It would also be interesting to explore the
connections between his rejection of the idea of the pure/private consciousness (Blackburn) and his view
of the role of society and language, Wittgenstein's anti-private language argument etc., Maturana's
biological explanation of linguistic phenomena, and the recurring notion in Information Systems work
(see Winograd and Flores, 1987).

consumers' actual (as opposed to realised and perceived) usage patterns. While the
quantitative analyses of audience viewing patterns (the 'ratings') - based on
projections of aggregated audience viewing behaviours - have long reigned as the
essential tools of broadcast companies, media buyers and advertisers in their courting
of clients, they have been criticised for presenting poor representations of 'true'
audience activities (i.e. Ang, 1991). In my own user research it was clear that people
were using the television for different purposes, including babysitting, recording
programmes for viewing later, or playing video games. However, digital systems
have an inherent by-product of their functioning in that they can register every
control signal produced in their use. Such system-logging (sys-log) data can be
associated with a particular machine (STB or computer) or, with the provision of an
access code, with a particular individual.
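
By way of a sketch only - the record layout, field names and sample events below are hypothetical rather than taken from the Cambridge system or any cited platform - the following Python fragment illustrates the kind of aggregation such sys-log data makes possible, pairing 'select' and 'stop' events to yield minutes of use per set-top box and service:

# Hypothetical sys-log aggregation sketch (illustrative only; invented format and data).
# Each line: timestamp | box id | access code | event | service.
from collections import defaultdict
from datetime import datetime

log_lines = [
    "1995-06-01T19:00:12|stb-041|user-2|SELECT|movies-on-demand",
    "1995-06-01T19:42:50|stb-041|user-2|STOP|movies-on-demand",
    "1995-06-01T20:05:03|stb-017|user-1|SELECT|home-shopping",
    "1995-06-01T20:09:40|stb-017|user-1|STOP|home-shopping",
]

def parse(line):
    ts, box, user, event, service = line.split("|")
    return datetime.fromisoformat(ts), box, user, event, service

open_sessions = {}                 # (box, service) -> time the session opened
seconds_used = defaultdict(float)  # (box, service) -> accumulated seconds of use

for ts, box, user, event, service in map(parse, log_lines):
    key = (box, service)
    if event == "SELECT":
        open_sessions[key] = ts
    elif event == "STOP" and key in open_sessions:
        seconds_used[key] += (ts - open_sessions.pop(key)).total_seconds()

for (box, service), secs in sorted(seconds_used.items()):
    print(f"{box}: {secs / 60:.1f} minutes of {service}")

Even such a modest tally shows why the data are attractive to marketers: they are exhaustive, unobtrusive and machine-readable. What they cannot show, as argued below, is what any of that use meant to the household.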

However, as the case study of the Cambridge Trial indicated, the preference for
technical and numerate means of understanding use and the user led to a plethora of
problems - social, organisational and technical in nature - regarding the production of
appropriate and usable knowledge. On a more general level, the essential problem
facing those interested in trawling the large volumes of data produced by electronic
use logging is precisely that which challenged the veracity of the television
audience ratings (Ang, 1990).

What can truly be inferred by number crunching and aggregation statistics
alone, when acts of consumption and participation in entertainment activities
are quintessentially meaning-making and highly qualitatively based
experiences? These are experiences often participated in for intrinsic motivation
rather than for rationally economic outcomes.
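
A toy illustration, with invented figures, makes the point: two households can log identical weekly totals while the television is doing quite different work in each.

# Invented figures: identical aggregate totals, quite different uses of the set.
household_a = {"evening film viewing": 14.0}                      # hours per week
household_b = {"babysitting cartoons": 7.0, "video games": 5.0,
               "timeshifted recordings": 2.0}                     # hours per week

for name, uses in (("A", household_a), ("B", household_b)):
    print(f"Household {name}: {sum(uses.values()):.1f} hours/week")

# Both print 14.0 hours/week; the aggregate alone says nothing about what the
# set is being used for, nor what that use means to either household.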

In between the 'hard' neo-positivist research approach of lab-based usability testing
and the 'soft' qualitative research of concept testing lie trials. Here trial participants
- acting as surrogate consumer-users - develop anticipations and visions of the
product based on the concrete features and functionalities of actual systems.
Combinations of research approaches, such as was attempted by the user/marketing
research group of the service nursery, can produce valuable insights which can fuel
technology and service design and production, as well as inform the best ways in
which these technologies and services can be marketed and packaged. A number of
marketing innovations including the distribution of demos (such as the success of
shareware versions of games like Doom in selling the full versions) and beta testing,
illustrate the interest of firms in the naturalised siting and testing of their products.
Experience of, and living with, a product is essential to either seeding markets for
improved or expanded versions, or other associated spin-offs.

This is in keeping with a number of movements now being understood and adopted
by the large corporations with respect to a more holistic view of how people are with
technologies they already have integrated into their everyday lives, and how they come
to confront and cope with new ones.

In engineering there has been the attempt to bring into focus some of the more
ephemeral, less tangible qualities of a product, such as discerning the 'voice of the
consumer' in QFD. Here, quite diffuse, subjective and qualitative aspects of a given
product are supposedly infused into the design and manufacturing process through a
ranking system. This converts them into quantitative measures suitable for choosing
materials, setting machine parameters and so forth. However, even here one can see
that the qualitative voice of the consumer-user only functions as an initial and
relatively minor influence within the processes of conversion. Emphasis of the
deployment concentrates on the conversion of design parameters to engineering
parameters, and then subsequently to manufacturing and production parameters.
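
To make the arithmetic of such a ranking system concrete - the needs, parameters and weightings below are invented for illustration and are not drawn from any cited deployment - a 'house of quality'-style conversion can be sketched as follows:

# Illustrative QFD-style conversion (invented figures).
# Customer 'voice' items with importance ratings (1-5).
customer_needs = {
    "window handle easy to wind": 5,
    "handle within easy reach": 3,
}

# Relationship weights linking each need to engineering parameters (0, 1, 3 or 9).
relationships = {
    "window handle easy to wind": {"turns to fully open": 9, "handle torque (Nm)": 9, "handle reach (mm)": 1},
    "handle within easy reach":   {"turns to fully open": 0, "handle torque (Nm)": 1, "handle reach (mm)": 9},
}

# Priority of an engineering parameter = sum over needs of importance x relationship weight.
priorities = {}
for need, importance in customer_needs.items():
    for parameter, weight in relationships[need].items():
        priorities[parameter] = priorities.get(parameter, 0) + importance * weight

for parameter, score in sorted(priorities.items(), key=lambda kv: -kv[1]):
    print(f"{parameter}: {score}")   # handle torque scores 48, turns 45, reach 32

The qualitative 'voice' enters only as the importance column; everything downstream is numerical, which is precisely why the quality of that first column matters so much.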
Also, the capture of the subjective data may be flawed:
"[QFD] . . . has to be used in a careful and methodical way. You cannot just
let the customer say the window handle in a car should be easy to wind. You
have to find out just what that means, in the number of turns, the stiffness of
the handle, the reach. You construct carefully a matrix of the characteristics
of a product and mesh that with a analysis of the customers needs and keep
turning the data, like a prism, seeking new flashes of insight in to what the
customer wants." (Main, 1994: p.96)

This is an important point that suggests the well-cited software industry adage –
GIGO – garbage in, garbage out. If one begins a deployment with badly articulated
or sampled user needs and requirements then it will lead to a poor product. An
obvious instance where this may be caused by the product, rather than the research
method, is in the case of new types of product (like i-Tv). The lack of being able to
reference new kinds of characteristics, attributes, features and functionalities to
existing products may provide useless data. Main (1994) points out that QFD can
contribute to incremental improvements in products, but it has not been linked to
breakthroughs in new products.

Summary

Much of the recent work in consumer research, studies of the audience, or in
contextual HCI work complements and fills a gap when considered alongside the
emerging capabilities of digital networks and technologies to provide complete
registration of their use. But it is important to consider two main interconnected
points relevant to an overall – or whole-picture – consumer-user research:
1. The sphere of research methods is constantly evolving.
2. As a pretext and also as a result of 1., the scope, depth and sphere of
imaging uses, consumers and users are also changing.

Academics and others concerned with methodology have drawn attention to the fact
that the processes and procedures of research can and do omit aspects of use and
consumption integral to the 'actual' or 'real' experiences of people. Sys-log in some
respects develops from previous quantitative – and to an extent mechanistic -
approaches in human research such as questionnaires, surveys, and participant
observation.463 The ethnographic and interpretivist-style studies deny mechanistic
positions even in their approach to 'subjects'. Rather than viewing them as
'knowledge or information providers' they are viewed as 'co-researchers' in the
design, research and development process.

'Virtual' renditions of use and consumption activity always impact upon marketing
and design activities. They have also - perhaps more insidiously - influenced policy
decisions shaping the overall culture and modes of governance of a country or state
463
It should be mentioned that these techniques also have their online correlates, and can provide
powerful means by which user feedback can be automatically processed by the system to provide
analysis and graphical or numerical renderings of data.

(such is the basic case made by Miller, 1995, when he critiques the privileged position
economics in influencing government policies). Virtual renditions of use and
consumption created by model making - in the case of economics the 'rational'
decision-making activity of the consumer - could be said to be evolving a different
model which takes into account the sense-making activity of individuals towards
goods and services. The emphasis here is on the developing awareness of the
experiential aspects of the product. The Japanese preference for experiential
approaches to market testing, for instance, has often been cited as reflecting a cultural disdain
for market research. Experiential approaches to understanding the consumer and the
use process also feature as the underlying epistemology of beta testing software and
technology and marketing trials.464

It is widely accepted that the creation of working images of use and the user has
played an important role in product development, whether as products of formal or
semi-formal research projects, or as reifications of reflexive projections on behalf of
managers, designers and marketers. These working images of use and users imbue,
inscribe or otherwise shape the product in concert with a complex of other influences
- purely technical potentials and constraints; standardisation and regulatory issues;
and further manifestations of the firm's desire to promote its identity and public
image (such as design specifications supporting branding and the look and feel of the
product etc., see below).

464
It should be mentioned that, as in the case of the Cambridge Trial, trialing systems promote opportunities much wider than only the provision of semi-naturalistic environments for discovering participant perceptions. They offer the opportunity to test out how the technical and social components ‘fit’, and draw attention to problems of organisation, technology, and implementation.

[Figure: influences at the industrial/market level and the intra-organisational level converging on "T", a central focus upon technology: industrial trends and the state of development of component technologies; user-consumer research methods; reflexive images of use and user, negotiated and shaped by available research knowledge; brand image and current trends in aesthetic product design; the characteristics, features and functionalities of the product (including costs); the look and feel of competitors' products; standards and relations regarding the product category; integration of brand identity into the product; and the product's adherence to standards and regulations.]

Fig. 8.3 Summary of influences on product design and development ("T" represents a central focus upon technology) (After Molina, 1987)

Technical standards represent codified knowledge that may easily be applied to
product technical and functional specifications. They can also provide 'hard' evidence
of impacts to the firm in terms of complexity and costs. However, the impact of user
knowledge and the disparity of use and user images in the processes of feature and
functionality design represent much less tangible elements. Nevertheless, a price has
been calculated for bad usability (Nielsen, 1993). Added to this are the
potentials for total failure of the product in the marketplace, hold-ups in the product
development process, and outstanding costs incurred through help desk inquiries and
product replacements.

The ingredients included in the circular model of user knowledge propagation,
communication and implementation, and the model presented above, provide a static
picture of what is essentially a dynamic process, both socially and over the
course of product development. This suggests the natural polarisation existing
between the maturity of a product or system, and the need for user input to the
design. Also, the state of development of a new product or system can serve as an
index to relative knowledge integration. In a review of the product development
literature, de Bont (1992) differentiates five consecutive phases in product
development process: strategic, idea generation, idea/concept formalisation, product
development and market introduction. In each phase, specific consumer information
can be used to optimise the process and reduce the risk of wrong decision. A radical
innovation may, depending on the nature of the technology or service, require more
or less consumer-user involvement in any co-design process to guide development or

359
provide a sense of security regarding market potentials (see table 8.2 below). This
may have been, but was in fact not, implemented on the Cambridge Trial.

Table 8.2 Consumer information useful across the different phases of the product development
process, and the research methods that may help to acquire it.

Strategic phase
  Consumer information required: Market description in terms of perceived competitive products and their consumer evaluation
  Consumer research methods: Gap-analysis
  Information and knowledge to be developed: Awareness of market opportunity - information such as size, level of competition, profits and market-company fit (Urban et al., 1987); unfulfilled consumer-user 'needs'

Idea generation
  Consumer information required: Ideas that combine the internal strengths of the company with market opportunities
  Consumer research methods: Consumer-based idea generation; need assessment
  Information and knowledge to be developed: 'Listening to the voice' of the consumer-user; brainstorming in the company; assessing the intensity of needs

Idea/concept screening and evaluation
  Consumer information required: Acceptance of ideas or concepts (functions); evaluations of several combinations of attributes
  Consumer research methods: Concept testing
  Information and knowledge to be developed: Information which links the new product idea and the internal strengths of the firm with a market opportunity; concept testing with consumers

Product development/product evaluation
  Consumer information required: Acceptance of product
  Consumer research methods: Product testing
  Information and knowledge to be developed: Development of a prototype; evaluation tests with consumers

Market introduction
  Consumer information required: Market-entry strategy
  Consumer research methods: Market testing
  Information and knowledge to be developed: Marketing tests prior to introduction; assessing the new product in real-life market environments

The state of development of a product can also provide some metric of what
knowledge is required by whom (in the organisation) at various periods across the
development process. The concurrent developments of product design, production
and manufacture, with marketing can cut time to market, and fulfil the desire for
agile approaches so favoured in post-Fordist manufacturing. It is clear that a process
which integrates all aspects of design, production and marketing and which
integrates the user in all aspects of this activity will produce products which
minimise risks regarding place and profile within the market.

This draws attention to the wider contexts of the dynamic forces of social and
cultural change, design, innovation and evolution of business practice.
• The nature of business is changing.
• The nature of consumption, and of accessing goods and services, is changing.
• Standards are created and evolve, sometimes independently of market diffusion.
• Technologies and their complementarities evolve within systems of production
and cultures of use.

It is within these various contexts that CU may be envisaged as making a
contribution in an applied sense. It does so in its view of consumer-user research
benefiting from approaches which promote the recontextualisation of the use process
of the product as a situated act both within the cultures of production and within
cultures of use. It addresses the parsing of the organisational needs for consumer-user
knowledge with the state of development of the technology, and the evolving needs
and perceptions of the user-consumer. The table below outlines some of the
research questions which can form a CU study.

Table 8.3 General research questions arising from the interaction of use, usability,
usage and usefulness

use × use: What is a user's overall impression of the technology in each situation and circumstance of use?
use × usability: Does the technology adapt to cater for the increasingly sophisticated user?
use × usage: Does the use of the technology mainly fall into functional, exploratory, or recreational usage patterns?
use × usefulness: By which channel has the consumer been informed of the technology and drawn to use it?
usability × use: Is the system easy to access and use on the initial attempt?
usability × usability: What are users' overall impressions of the usability of the technology, initially and over time as they develop familiarity or expertise in using it?
usability × usage: How are usage patterns affected by the usability of the technology?
usability × usefulness: Are usability problems made transparent by the perceived and actualised usefulness and value of the technology?
usage × use: Which aspects of the technology is most time spent upon in initial use? In subsequent uses?
usage × usability: Does usability deter the formation of usage patterns?
usage × usage: Can users perceive periodic use of the technology integrated into their current activities? How does this vary in reality?
usage × usefulness: Does usefulness become 'transparent' through continued use?
usefulness × use: Which aspects of the service appear most useful on initial confrontation with the system?
usefulness × usability: Is the usefulness or value of the technology negated/attenuated through good/bad usability?
usefulness × usage: How does the formation of usage patterns relate to the perceived usefulness of the system?
usefulness × usefulness: What are users' overall impressions of the usefulness and value of the system over time?

Drawing the salient points together

For the remainder of this chapter, I wish to summarise the main points raised within
the book. I shall put forward a tentative model of how studies of the user and the use
process may be mapped onto, and integrated with, the various components
influencing product development, in a way which accounts for a wider scheme of
organisational and socio-cultural influences.

There are two particularly important social factors to be taken into consideration in
the design and use of new media such as TV-centric networked technologies. First,
as previously explored, the workplace constitutes an identifiable and bounded system
of social interaction, something which places cultural distance between the users of a
system, and those who would analyse or design it. This is a gap which interpretivist
research methods hope to promote awareness of, and eventually, perhaps indirectly,
bridge. This suggests the two-way research focus demanded in such a bridging study
- one part is understanding the nature of the organisation and the individuals who
comprise it, and matching this with appropriate levels and types of knowledge from
the site, field and sub-cultures of use.

To begin to understand how users of a system and the system itself will interact, one
must understand the social elements within the aforementioned bounded social
system. Such a bounded system existed in the Cambridge Trial, which included
elements which were technical, such as the technology and the content material of the
trial, as well as participants' televisions, hi-fis and so on; and elements which were
human and social in nature - the company personnel, technology and content
partners, the service nursery and its various sub-groupings, the participants, their
families, friends etc.

The key to such understanding lies in understanding language, and more specifically,
the nature of language use. It is in language, the primary basis for interaction
between discrete individuals, that cultural and social patterns are manifest, and upon
which they depend. Language is in turn entirely dependent on social interaction
within the wider setting of the world.

The presence of commonalities in terms of conceptual and cultural background
between users and designers, or at least an awareness thereof, raises the possibility of
design which takes genuine and unbiased account of users' social context - the
particular sub-culture which they inhabit. Such a task becomes immediately
more plausible once the notion of interaction between effectively disparate and
incommensurate conceptual schemes, as tends to be implicit in ethnological
approaches, is tempered by an appreciation of common ground. With the support of a
coherent interpretivist semantic theory, one can then hope to make explicit the sort of
action one needs to take to achieve such an unbiased understanding of users, or what
it is that one must know about users - that which is silently embedded in participatory
design and other 'soft-systems' style approaches. Any cultural entity is changed
when viewed from a different cultural context.

Cultures of use

The focus of the book has concentrated on media technologies which, as mediums,
maintain particular features and functionalities that expand upon, and even
challenge, the notion of use as it is applied to tools and other artefacts. Use
is a divided notion when considering media technologies. It can refer to the artefact
itself - television, hi-fi, telephone and so on - or it can refer to the sense-making
activities involved with the consumption of its content: viewing, reading, speaking,
listening. Use here refers to the value involved in transacting with systems, software
or other people via - or mediated by - technology. In many cases electronically
mediated experiences incorporate seamlessly with all other forms of experience -
whether communicative, informative or relaxing, stimulating, enjoyable or even
frightening, disgusting, and/or fun.

The development and subsequent use of a new technology quite obviously
involves innovating upon practices as well as technology. Initially, when the
technology - such as an internet STB - enters the home, the first experience of use can
be awkward. However, if the manual is well written, and/or if the mode of use is self-
explanatory, the experiential aspects of its content become the object of focus. This
stage represents the initial site where formulations are made regarding potential
usage patterns and where the usefulness of the technology and its contents is
registered. This may be marked by the finding and subsequent bookmarking of
several particular web sites of interest - sites which indicate some form of updating,
suggesting benefit in accessing them on a regular basis. Usage also takes the form of
accessing sites on particular days at particular hours.

The use process as described in this example correlates strongly with Silverstone's
notion of domestication, further elaborated upon by commentators on technology
such as Sørensen (1993). Domestication is the process of sublimation by which
artefacts and technologies become incorporated within everyday life, experiences and
activities. I wish to suggest that the anticipation and actualisation - by consumer-
users and designer-producers - of the use process is the mechanism of domestication.
It is a process that begins with the genesis of the original design (from a shared
universe of possibilities), through ideation, to actualisation, iteration, prototyping,
testing and diffusion.

New media innovations and their use, both enable and displace, expand and contract,
amplify and reduce experience. While not being produced in a social and technical
vacuum (as they are reliant on various interest groups and other technologies for their
components and their manufacture), they likewise do not enter a vacuum when they
reach the home. They must accommodate within the existing regimes of technologies
and their functions, social practices and individual consumer-users' perceptions of the
world. Technologically, in the case of television-centric technologies this means most
notably the television. Where this is situated within the household, what the
household constitution is, the various rules of household governance etc. each play
an interactive part in fashioning attitudes towards the new technology as well as
instigating and shaping patterns of use (see fig 8.4 below).

[Fig. 8.4 is a diagram of the use process with the home ("H") at its centre, linking USE, USABILITY,
USAGE and USEFULNESS with the individual's situated contexts: psychological dispositions
(attitudes, emotional responses, beliefs, projections, etc.); socio-cultural exigencies; everyday
activities and lifestyles; and technologies already used and 'domesticated'.]

Fig. 8.4 The influences on the domestication process ("H" places the central focus on the home,
or rather a given individual's perception of the home)

Technologies that work together represent a particular product class. PCs must
connect to printers, television sets to hi-fis, and in the 'smart home' the continued
diffusion of 'jelly beans' - the industry definition of non-computer resident
microprocessors - suggests the range of connected or networked artefacts will rise
exponentially.465

However, as in Baudrillard's (1988: p.31) suggestion of the consumer caught up in "a
calculus of objects" in the act of viewing goods within a shopping mall, the same is
true for the technologies and objects with which we live. Much has been written
regarding the symbolic attributes of television and other media technologies (for
instance, Silverstone and Hirsch, 1992; Silverstone, 1994); however, there is little
which addresses the way in which existing technologies, or existing domestic
technological constituencies, impact upon the adoption (and interpretation) of the new.
This is quite clearly an open area for investigation, which can be extended to
investigation of how adaptation leading towards new use practice feeds back upon,
impacts and otherwise redefines existing practices and instances of use, usage, and
perceptions of usefulness.

465
It is estimated that there are something in the region of 6 billion chips embedded in various
artefacts and technologies. The suggestion is that we are moving from 'crunching' to 'connecting'
(Kelly, 1997).

Cultures of production

As in large technological systems (such as those suggested in the analysis of commentators
such as Hughes, 1986), the success of a technology such as Edison's electric mains
relied on sympathetic consideration of the systemic qualities of each individual
component or technology. Brands are the symbolic correlates of technologies,
indicating to the consumer the potential that technologies will be compatible in look
and functionality. Standards perform a similar role: in networked multimedia, DAVIC
and DVB attempt to ensure cross-compatibility between platforms; in quality, ISO
9000; and CE marking indicates that a product complies with harmonised
EU requirements for safety and health.

Raising the profile of the individual in sociotechnical constituencies

Much of the work on sociotechnical models has tended to concentrate on the analysis
and evaluation of large industrial processes. Molina's sociotechnical constituencies
framework, for instance, has been applied to transputers and transputer-based parallel
computers (Molina, 1990), and to European IT programmes and initiatives (Molina,
1992, 1994). However, in a later paper (Molina, 1997), learning from the user was
incorporated into the constituency-building process of a mobile computing device by
applying the contextual usability framework.

The underlying thinking here is that some commentators have drawn attention to the
fact that many technologies manifest as configurations of previous artefacts,
developments, knowledge or expertise. Clipson (1995) views this process as indicative
of an essentially techno-centric situation of "technology building on technology,
where progress is not a random affair but a synthesis of what went on before." (p.103)
A prime example of such a technology may be the Om STB - most definitely a
configuration of previous developments, knowledge or expertise - and it was these
configurations which largely dictated the immediate strategies for design, and visions
of the potential markets. However, if one considers technologies from their use
perspectives, one would be mistaken to say that the STB was simply a re-invention
of the RISC PC. In a technological sense it was much more than the sum of its parts,
and it was at this point that social and cultural elements increasingly fuelled
visions and business orientations.

The sociotechnical model posited by the Tavistock Institute has been based on the
manufacturing process, and on the lowest level of organisation (Heller, 1989).
Even tools such as QFD can be criticised as offering only a token involvement of
consumer-users at the very earliest stage of development; most of the process is
based on the translation of design, production and manufacturing measures. On this
basis such models may come under the criticism that they are inherently
technologically orientated, or techno-centric.

Molina has explicitly privileged technology in the constituencies model by often
placing it as the central focus of the constituency. Such a position invites the
interpretation that the technology itself has created and manifested all the conditions
which surround it - that it is the 'navel' of all the other elements. This contrasts with
other models in studies of technology, such as actor-network theory (for instance
Callon et al., 1986; Law, 1992), where the emphasis is on balancing consideration of
the social relations of individual human actors with non-human, non-individual
entities - a more distributed view with multiple foci of analysis and importance in a
study. While this can contribute to a consideration of a development process as a
whole system, it puts pressure on the researcher or analyst using this method to
attach proper weightings to the component elements (cognitive, social or technical).

In sociotechnical constituencies the dynamism of the social relations is anchored
very strongly to the state of development of the technology, as if this development
were all that motivated and created inertias. Molina sees that technologies are indeed
social creations but argues that "many of these social creations evolve characteristics
which tend to remain stable for long periods of time" (Molina, 1997). He sees that
these stabilised characteristics have "critical implications for specific strategies of
innovation and development of technological capabilities." (p.1) This denies
instances of the dynamic relations between people and technology, where the
technology may remain stabilised in characteristics, features and functions, but where
it is attributed with a varying cultural importance which can impact usage and
usefulness.

While it is true that technologies do appear to maintain certain characteristics for
some time - such as in the case of the television, where basic functionality and make-up
have remained relatively stable since the development and wide diffusion of the
EMI system - it is also interesting to note that human and cultural conceptions of the
television have gone through considerable change and evolution. Also, the content
and technological means of production have radically changed, as have the cultural and
social perspectives that inform programme-making and presentation.

One of the artefacts of this co-evolution or mutual-shaping process is the notion of
domestication, the way in which artefacts integrate within the practices and everyday
life of consumer-users. This process has also been suggested as a kind of 'co-
consumption' - technologies are consumed by users, and users are consumed by
technologies - in so far as technologies "get our attention, have us react to them and
to become occupied by their abilities, functions and forms." (Sørensen, 1993: p.157)

[Fig. 8.5 is a diagram with the new technology ("T") at its centre, surrounded by labelled influences:
the larger sociotechnical environment; the look, feel and operation of existing technologies in the
household; the look, feel and operation of the new technology; new technology and use; wider
concerns regarding technology; science fiction etc.; the public image of the new technology and its
function propagated through rhetoric; individual perceptions of the new technology; household or
social group collective perceptions of the technology and its use; use benefits of the new technology;
and use benefits of existing technologies.]

Fig. 8.5 The sociotechnical constituency of domesticating technology

Within this process certain uses emerge for these technologies, certain features and
functionalities are of use. Conversely, certain uses are not available, convenient,
permitted, easily accomplished, or affordable, and it is these instances which
characterise and accent resistance towards domestication, the patterning of use
(usage). These draw awareness to the technology's capabilities and limitations.
Providing the technology's designers and producers can capture and realise these
situated and actualised instances of use, they can develop iterations on the design
which may open areas of the technology's potential which might otherwise lie
dormant or remain a mystery to the users.

As I have endeavoured to detail in the preceding chapters, there has been a very
significant development in the way in which the act of using television has evolved.
By placing the human being - the user, consumer, reader, viewer, audience member,
research subject etc. - at the centre of a constituency, we can relocate research
perspectives on technical, economic, institutional and legislative changes relative to
the individual or individual group of consumer-users.

[Fig. 8.6 is a diagram with the individual ("I") at its centre, linking: who wants to know (company,
organisation, institution, designer, marketer, advertiser and so on); why they want to know (policy,
strategy, design iteration, product development, market development and so on); ways of finding out
(research methods, approaches and theory - interview, sys-log, questionnaire, focus-groups and so on);
and the stimulus (technology and/or content - set top box, programmes, remote control, user interface
and so on).]

Fig. 8.6 A human centred sociotechnical constituency ("I" is for the individual, their
perceptions, apprehensions etc.)

As with sociotechnical constituencies which place technology at the centre of the
constituency, the above model places the individual human being at the centre - be
they user, consumer, viewer, reader, research subject, or known, constructed or
identified by any other title, category or label. This draws attention to the way in
which their image, virtual self or 'alter existence' is socially and semiotically
constructed by different companies, organisations and institutions, or by certain
individuals. Further, it is a route for examining the way in which such
'institutionalised images' may constrict, impact and otherwise affect research
processes, and indeed policy and strategy regarding product development and design,
as was the case in the Cambridge Trial outlined in chapter 6.

The 'individual' and the notion of 'society' are constructs which have formed the
substance of study by social science. Various schools of thought have implicitly or
explicitly advocated different epistemological and ontological positions which have
characterised and polarised the notions of the social and individual over the last 100
or so years. These positions have also been reflected within disciplines manifesting
as rifts between the use of certain methods over others. These rifts have obfuscated
the value of social science method when applied to 'real world' commercial situations
of evaluation, where matters of 'correct' methods and their implementation are
relegated under pressures for 'easy fixes', rapid development cycles and/or 'lust of
result'.

The reasons why organisations wish to develop knowledge of people are multifarious,
depending on their business and development strategies. What they can learn is
dependent on the state of the stimulus - technology or content - presented to people in
the form of a prototype or mock-up. Prototyping is similar, except that the
prototype system is built and evaluated in order to construct a detailed specification
for the full system. What can be learned is further dependent on the method, approach
and theories involved in the research process - themselves contingent on time,
resources and access.

Instances here include the perennial debate between purist psychological perspectives
of the world and purist sociological viewpoints, and the difference between
quantitative and qualitative research approaches. Heller notes that those developing
the Tavistock sociotechnical model were aware of the bias that might appear in it:
"Although the [sociotechnical] model was developed by social scientists, it
does not attempt to preference the social component. Human requirements
cannot be maximised without damaging the potential technological
contribution in most cases. Usually, both have to be suboptimised in various
degrees, depending on the contingencies of the situation, in order to achieve
an overall optimum." (ibid)

Westrum (1991: p.13) provides several alternatives through which technologies and
societies may parse:
" . . . a thing must fit its purpose . . . We can change the society to make it
more adequate to cope with the technology; we can refuse to deploy
technologies which our current social institutions can't handle; we can try to
develop new sociotechnical systems that jointly optimise technology and the
human factor; or we can simply let the system go on as it is now and suffer at
some future time the consequences of a serious mismatch between complex
technologies and the adequacy of our social institutions to handle them."
(ibid.; p.13)

Such a focus is at the centre of the recent interpretative approaches to researching
audiences as well as the impact of media messages. Such approaches have been
profitably employed in usability studies of computers and consumer durables, where
the sociological contexts of use are increasingly understood to be as determinate in
framing the use process as making the product easy to use.
" . . . just as it is important that technologies are designed for their users, it is
just as important to realize that users may have very different reactions to
them . . . Linking the skill of the user to the technology is the task of social
institutions." (ibid.; p.231)

More recent work at Edinburgh (Project Newspad) has witnessed a preliminary
attempt to locate CU within the frame of the sociotechnical (Nicoll, 1995; Molina
and Nicoll, 1996; Molina, 1999). The practical benefit of such a fusion is the
potential to link appreciation of consumer-user perceptions of a technology, as a
distinct element placed against other influences, to the design process. As products
become more 'smart', as customised manufacturing approaches develop, and as 'on-
demand' entertainment, information and delivery services become the norm in
home-based consumption, consumer-user intelligence, user-producer co-design and
organisational development must become integral, if not absolutely imperative. Araya
(1995: p.237) suggests that "in the name of "enhancing the world" the proposals for
Ubiquitous Computing constitute an attempt at a violent technological penetration of
the everyday life." (Author's italics)

This must include facilitating, understanding and evaluating the whole range of
emerging interactive relationships that are enabled and constrained by new forms of
feedback via technological systems. This is in stark contrast to previous design,
manufacturing and marketing research practices, as it includes the interactive
relations not only between people and goods, and people and organisations, but also
between people and the systems themselves, and the emerging technological and
organisational environment. For instance, selectivity and choice on the part of the user
further shape the technological environment:

" . . . when someone buys an already built house, its design reflects the
actions of the builder and the house's previous owners. The longer a person
has the house, the more its contours, colors, furnishings, and general physical
condition reflect the owners characteristics. And then there is the basic fact
that the owner chose the house in the first place. We could go a step further
by suggesting that the nature of the eventual home now shaped the
technology in the first place; put another way, the technology was designed
for its intended users." (Westrum, 1991: pp.172-173)

The ultimate goal of such a project balances the natural leanings of technology
companies towards a techno-centric view of use, usability and usefulness, with one
that is focused on the situations of consumption and use of their products.
Conversely, it creates possibilities for indexing ecological approaches to developing
consumer-user intelligence to the stages of product development.

Such an approach keeps in mind the notion of the sociotechnical as posited by the
Tavistock Institute. It represents a means of analysing the particular socio-technical
contingencies of use within the wider universe of discrete and obvious
influences arising from the constituency in which a technology is created and
developed.

Heller argued that in order to balance effectively the social and technical elements of
a constituency, and produce appropriate recommendations, policy or advice, there
had to be some element of research: "In nearly all cases of some complexity, it needs
research to discover the appropriate contribution of the two components." (p.24) The
suggestion for a pragmatic application of CU and sociotechnical constituencies is, as
in evaluation studies, a two-way reflexive research approach, where a firm, its
organisation, practices, inertia and background are taken into account before
embarking on a process of consumer-user research. This would give a clearer
perspective on which information would be most beneficial, for whom and why.

It would appear then that CU, as an applied consumer-user research approach, would
benefit from being placed against the wider constituency of what motivates, shapes
and otherwise influences product development and design. Essentially, this translates
as a mapping of user research, as a singular element, into the constitution of the
features and functionalities of a product. Looked at as a symmetrical relation, we can
consider the producer-consumer, designer-user, product-lifestyle relationship thus:

Table 8.4 The elements linking production, design, and product with consumers, users, and
their lifestyles

PRODUCER - CONSUMER
  Producer side: branding, advertising; other products in the product range; similar and related products in the market place
  Consumer side: well known, reputability, symbolism; where, when, what; type and quality

DESIGNER - USER
  Designer side: product features and characteristics; functionality; product aesthetics
  User side: usefulness to current activities, potential novel use niches; usability, desirability

PRODUCT - USER-LIFESTYLE
  Product side: functional relation to pre-existing use of similar products; cost relationship to the consumption of similar products; symbolic attribution of possessing the product
  Lifestyle side: ease of access to, say, news across platforms; price of news between, say, internet services and watching TV; status through having, being able to use, or being able to show etc.

The tensions inherent in the above pivot around the creation (or 'encoding', to use
Hall's model) of characteristic features and functionalities, and their interpretation (or
decoding) by the user-consumer. The manifestation of this interpretative or decoding
process is the use process, which I will suggest is a central mechanism of
Silverstone's notion of the domestication of technologies.

I would further suggest that the use process lies between the two distinct
constituencies of the contingencies of design and production (as well as distribution)
and the contingencies of use - the technology-centred constituency and the human-
centred constituency. Consumer-user research as traditionally practised falls between
these constituencies and is intended to provide the company and its personnel with
useful perspectives of the user-consumer. User-consumers, on the other hand, exist in
a world which consists of many activities, pursuits and interests outside of the use and
consumption of particular technologies. Their worlds also comprise many other
technologies, which may be competing for attention, and which have exerted their
own impact on the use process of any other technologies entering into the arena of
lifestyle and the living space (see overleaf).

Between these two constituencies, which are constantly evolving (although quite
clearly with different dynamics), there exists a tension best envisaged by the needs
and requirements (or the capacity for needs and requirements) for improved features
and functionalities upon what people already have, do, want and use. While some of
these needs and requirements may be tangible (i.e. faster, smaller, more power etc.),
some may be difficult to identify, lacking a common currency in description (such as
people's interest in ever more convincing cinematic special effects). The
dynamic qualities of the model dictate that no phenomenon or product is totally
'discontinuous' or 'radical' - it possesses some analogue, if not in mode of production,
at least in mode of use, which marks its continuation from already existing artefacts,
objects or services. The importance of the model lies in its recontextualisation of the
product (its development and subsequent abilities to be reconfigured, customised
and adapted) with the use process of consumer-users.

[Fig 8.8 is a diagram in which the use process ("U": use, usability, usage, usefulness) mediates between
the designer/producer/technology constituency ("T": socio-cultural exigencies and motivations to
develop the product; expertise developed through market, use, user and consumer research; projections
of the STB and experiences of using it; visions for the technology; design iteration; projections for
functional and marketing development) and individual situated use contexts ("H": socio-cultural
exigencies and motivations to consume and use the product; everyday activities, practices and
lifestyles; technologies already used and 'domesticated'; individual dispositions such as attitudes,
emotional responses, beliefs and projections). The traditional realm of designer-user research (i.e.
usability) and producer-consumer research (i.e. market research) sits between the two, and meaning-
making of the technology occurs via advertising, via use and via domestication.]

Fig 8.8 The use process as arbitrator of communication between designer and user, producer and consumer,
product and lifestyle. Usability is recontextualised as a naturalised part of the use process, rather than as an index of
how accessible and usable a technology is with respect to the intentionality of the designer and producer.

Thus, if we are interested in technical objects and not in chimerae, we cannot
be satisfied methodologically with the designer's or user's point of view
alone. Instead we have to go back and forth continually between the designer
and the user, between the designer's projected user and the real user, between
the world inscribed in the object and the world described by its displacement.
(Akrich, 1992, pp.208-209)

When a radical idea (compared with existing practice or modes of use) for a new
media technology emerges at the concept level, it is open to misinterpretation or
misrepresentation as it flows through the conduit of PR, journalistic reporting and
word of mouth. As suggested in Chapter 1, it maintains its 'interpretive flexibility'
both as a technology and a technological potential. It is on this level that certain
views can be developed, views which skew opinion one way or another, towards
utopian or dystopian perspectives or the innovation, predictions of determinism and
voluntarism regarding its use and impact, its benefits and its social effects. These,
without doubt shape peoples' attributions of a technology, product or service.

Product concept testing has a poor record in predicting eventual market successes.
Perhaps this is simply because it is on the conceptual level - most privy to
interpretive flexibility - that it is easiest to promote a scenario which visualises
the accommodation of a product into everyday life. 'Doing so' would appear
considerably harder to enact, or even if it were to be contrived - as it was in the
Cambridge Trial - it would negate the kinds of pressures and anomalies that plague
or advance the domestication of 'successful' products. Alternatively, it could be due to
the poor representation of the contrived relationships, as well as the alien environment,
between market researchers, the product and the 'representative' sample of
anticipated consumer-users.

A trial such as the Cambridge Trial offers more than simply an unsupported concept.
It offers a concrete range of phenomena - technological hardware (STB and remote
control), a range of content and service options, and accompanying discourse
including recruitment paraphernalia, questionnaires, invitations to Om's headquarters
for 'user evenings' and so forth. All these phenomena, as well as the fact that people
were actually living with the technology, provide a real opportunity to track how
they make sense of the technology, as well as how they relate it to their lifestyles and
the other technologies they use day-to-day. It is also an opportunity to explore how
the technology prompts them to consider its actual performance and use, and what it
could be through additions or alterations to its current state of development.

The trialists

The user research was conducted in collaboration with participants on the Cambridge
Trial. As was indicated in the previous chapter, a series of interviews was conducted
with user-participants on the CT between the 23rd and 24th of July 1996. The 11 (of an
intended sample of 12) households were selected from the 66 participants in the trial.

Marcus Penny - the content and services manager - viewed that within the relations
of user to marketer there lay an issue: that of bringing the user's interests into this in
an appropriate way. In particular, the issue that Penny faced was financing any
consideration of the user's interests at all. Everything that he did had to be financed and
justified in some way. At that moment he was financing and justifying it on the basis
of the service providers and Om learning from the process of the trial and the lessons
that this taught in terms of how to build businesses in the future. What value was
there to service providers in consumers' interests? Which institution or firm
would pay for this?

One could imagine that a public sector institution such as the ICT, or some of the
consumer watchdogs, could become interested. On the other hand, perhaps a
professional organisation would be better placed to conduct independent research.
Penny felt that this is a generic problem with all products and all services, in that the
users in the end don't finance it; hence at the creation stage you have to deal with
the people who are financing it, who are the ones actually interested and
engaged in it at this point. They are the people putting their time, effort and
investment in on the basis that they produce services for users. Out of that there is a
motivation for them to get a real understanding of what the users actually want.
Penny thought this was something that you could sell to them: if they understood what
users really want, they would do better in the provision of services.

This seems to be a classic shortcoming of group decision-making processes -
some processes apparently give rise spontaneously to good products, as was the case
with the original demo STB. A worst-case scenario is also possible, however, where you
have a bad product which fails to satisfy both the collective and individual
needs of the group. This may be true, as in the case of the user/marketing working group,
where the product is a research approach; the difference is that this product is knowledge,
and not a purely technical system which either works or does not.

The notions of 'users' and 'consumers' were an inextricable part of the transactions
which took place between the original project team and senior managers and funders.
The former were convinced that there were indeed latent mass demands for
interactive services, while the latter felt that only a trial could fully illustrate the
technical potentials and credibility of the technology and concepts. Users featured
strongly again when they became collateral in the transactions which took place
between Om (and those responsible within Om for the trial) and potential PSPs.
Bound up with the notion of developing and learning the core competencies needed for
providing interactive content and services, PSPs invested in order to learn of the
organisational problems involved with trials and also to learn what 'average'
consumers would make of the system:
"The presence of NOP (National Opinion Polls) on the Trial has facilitated
the gathering of detailed user feedback. The initial data showing usage of
services by Trial participants, along with their reactions to their experiences
constitutes a goldmine of information for other companies wishing either to
participate in other i-Tv Trials or to provide content or services . . . Indeed as
such it allows the consortium to evaluate the revenue potential of such
services for roll out in a wider context and even for eventual commercial
deployment on a regional or national level."

Om promotional literature

Identified as a crucial part of the learning process of the trial was the need for firms to
understand and gauge the impact of their individual presence on the system. As such,
the 'public' stage users (as opposed to the designer-users) were to a degree
'commodified', as user access and research were added as part of the value enticing
companies to join the service nursery in the first place. Access to such information
was unfortunately mediated by a dysfunctional group, which was perhaps indicative
of some of the deeper problems of information flows, management and governance
involved with the trial as a whole. Clearly, there was not enough effort (or, probably,
resources) put into building the sociotechnical constituency of the trial.

On the subject of user research, and in particular the issue of on-line questionnaires,
Om's Marcus Penny was adamant that there was a tension between explicit and
covert ways of finding out what people are doing. His preference was for inference
from what they do, or actually use, rather than asking them questions directly. He
was of the opinion that this is more liable to lead to real answers, and that it is something
which could be done for the first time due to the nature of the system. The sys-log data
produced by the system would show when the STB was activated, by which
household, which service/programme was watched, and what the interaction style
was - detailed data on how people respond to choices presented on the screen. This
was an integral part of how Om elicited NOP's interest in the system.

Penny claimed that Om would be working with NOP to make the correct inferences
from the sys-log data and then, increasingly, to tune the choices that they would
present, working in an iterative fashion until the right inference was made. Om would
also provide questionnaires on screen for people to complete, but Penny felt that there
were stages beyond that which they wanted to reach. This stage is characterised by not
providing questions, but rather providing experiences or experiential choices -
vignettes - and then monitoring people's reactions. He viewed this as a 'whole
new approach', an entirely new way of doing market research, which surpasses the
problems of interpretation inherent in questionnaire use.
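
To make concrete the kind of inference Penny describes, a minimal sketch is given below of how
session-level sys-log records might be aggregated into per-household usage summaries. The record
fields (time of activation, household, service/programme watched, on-screen interactions) follow the
description above, but the SysLogRecord structure, the field names and the classification thresholds
are hypothetical illustrations rather than Om's or NOP's actual logging format or analysis. A sketch in
Python:

from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SysLogRecord:
    # One hypothetical sys-log entry: when a household's STB session began,
    # which service/programme was selected, how many on-screen choices were
    # made, and how long the session lasted.
    timestamp: datetime
    household_id: str
    service: str            # e.g. "news", "shopping", "games"
    interactions: int       # count of on-screen choices during the session
    duration_minutes: float

def summarise_usage(records):
    # Aggregate raw records into per-household summaries from which tentative
    # usage patterns might be inferred.
    summary = defaultdict(lambda: {"sessions": 0, "minutes": 0.0,
                                   "interactions": 0,
                                   "services": defaultdict(float)})
    for r in records:
        s = summary[r.household_id]
        s["sessions"] += 1
        s["minutes"] += r.duration_minutes
        s["interactions"] += r.interactions
        s["services"][r.service] += r.duration_minutes

    for s in summary.values():
        # Crude, illustrative heuristic: many short, varied sessions suggest
        # exploratory use; time concentrated on one service suggests a more
        # settled (functional or recreational) usage pattern.
        avg_session = s["minutes"] / s["sessions"]
        top_share = (max(s["services"].values()) / s["minutes"]
                     if s["minutes"] else 0.0)
        s["pattern"] = ("exploratory" if avg_session < 10 and top_share < 0.5
                        else "settled" if top_share >= 0.5 else "mixed")
    return dict(summary)

Even such a toy aggregation makes the interpretive point plain: the 'patterns' recovered depend
entirely on the labels and thresholds the analyst builds in, which bears directly on the limits of
inference discussed below.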

However, it is clear, as will be illustrated in the following chapter which outlines trial
participants' experiences of living with the system, that user-centred research/design,
and particularly 'inferring' from sys-log data, is made more problematic when you do not
have a fully operating system with all its branches and avenues open. Users'
functional and exploratory aspirations are confined, and such research can only reap
understanding of how they coped (or did not cope) with this confinement. However, even
with the limitations of the system, Penny viewed it as an opportunity to get users involved
with the design of content and services at an early stage.

Approach to users

The objective of the qualitative user research was to understand the trial participants'
understanding of the technology. To begin with, this meant uncovering something of the way
in which participants came to learn of the trial and the technology. This can impact
on 'first impressions' of the technology, in terms of its usefulness and motivations to
use, as well as on their individual approaches to solving usability problems. It is not
difficult to imagine a scenario where a highly enthusiastic installation engineer may
help to carry impressions of the system as a panacea to a number of common
complaints concerning the existing broadcast media.

Likewise, an engineer may be evasive in answering specific questions regarding
issues such as when content material will change, when certain services will become
available and so on. Any communication between the companies involved with the
trial and the user-consumers will influence perceptions, above and beyond that of
people interacting with the technology, content and services directly. Penny
remained 'very aware' of this issue: 'you can't do objective research because every
contact which you have with the users has an effect on them'. However, in the spirit
of evaluation research, the main issue that could be drawn from a useful research
project would be to 'make some assessment of what direction and what magnitude
that effect is likely to be'.

Penny felt that 'certainly NOP should be aware of these sort of issues'. It was
considered that they were there to 'hold the ring and mediate all these sort of
questions from the individual organisations, and as far as the individual organisations
are concerned in their motivations the users are just a means to an end to answer their
questions'. Above and beyond NOP's involvement in managing and mediating the
orientation of the user research, they were involved in the CT to learn about the new
possibilities existing for market research using i-Tv.

Simon (1969/1996) draws attention to the question of how a simulation can generate
new knowledge. Simulation is used for understanding and predicting the
behaviour of systems. To a large extent - as was suggested in the previous chapter,
which detailed the social process that led to the construction of the user
research, and in chapters 6 and 7, which illustrated something of the social construction
of the trial - the users were viewed almost as intelligent parts of the system. They
were viewed as data generators, from which inferences were to be made regarding
tweaking the system, its look, its offerings, its functionality and so on. The trial
content was only ever a demonstration. It had many promised features which never
materialised, and this was the single most represented piece of feedback, consistent
across all interviewees. Simon relates two assertions about computers and
simulation:
1. A simulation is no better than the assertions built into it.
2. A computer can do only what it is programmed to do.

Applied to the Cambridge Trial, this may be taken to imply that Om, and even the
companies involved in the trial, took the view that since the trial was only a simulation
- a demo - no useful information could be drawn from research. The user research
nevertheless offered a glimpse into the non-rational, very individualistic lifestyles which
are recognised within consumer research, and which are only now being recognised by
those who have for some time been developing technology to be situated within
domestic locations and real lifestyles.

Marcus Penny viewed that i-Tv opened the potential for instant feedback from users.
This would provide a marketing or product/service development department with
the opportunity to test ideas out on user-consumers, and the feedback would dictate
the adoption of the new product, service or process:
". . . most businesses are producer businesses somebody sits there in a room
cerebrating creating something and there is a very, very long chain down to
pushing it out, and the feedback back from users back to here is very, very
imperfect . . . an individual programme producer can create something test it
out and get some instant feedback . . . what will that do for the nature of
television?"

In a system which is highly dynamic, constantly reactive, and ever changing it could
be said that there would be little opportunity for things to remain stable enough to
make inferences or formulate and ask relevant questions. Penny viewed that this was
symptomatic of much wider cultural change - "we're coming to be in a reflexive
world . . . what happens is that you run the reflexivity and I think it gets to the point
of stability emerges its a question of managing through to that."

A picture emerges of the innovation of i-Tv not being driven simply by engineering
vision alone. Penny stresses the definite need for feedback, a symbiosis of
developing services and content with inputs derived from the user-consumer's tastes,
interaction styles and choices. He viewed that one of the most important elements of
reorientation to this new way of doing business and producing media is that you will
simply not survive unless you take intimate account of the feedback and run
your business ultimately on the basis of interaction and feedback.

Such a view bears relation to social shaping theories of innovation: opposed to
simplistic models of linear innovation, it recognises and brings to the fore
feedback loops happening at all stages of the innovation-diffusion continuum.
Returning to the theme of order arising from chaos, he viewed that standards
formation arises from such crises: 'if you believe in the approach natural standards
emerge out of a dynamic process and are stable because the system keeps them in
place'. He sees that the crucial role of the 'new manager' (one which is in keeping
with the new style of organisation) will be to manage processes of crisis and chaos:
"its something which you cannot plan and direct in the way that your used to
it in a mechanical view of the world nevertheless there are structures and if
you understand the behaviour particularly in the moving from stability to
another . . . you can encourage that process . . . if you understand what lies
behind that stability you can encourage or interact with it . . . but you've
actually got to observe quite closely what's happening."

However, much like the vision for the self-governing service nursery, it seems that
this view remained somewhat utopian in its faith in the system as it stood, and in the
users acting as 'intelligent' parts or functions of the system. It is clear from the
above sample of users that there was a plenitude of reactions and
attitudes towards the system. Many of these would stand to confound any inferences,
above and beyond the fact that there was little use of the system.

There were several themes recurrent throughout the case studies. Broadly these can
be broken down thus:
• Lack of content drove inactivity with the system.
• Regardless of problems with content, most people still saw value in such a
service, providing there were programmes which would appeal.
• Most people saw advertising as inevitable; however, interactive advertisements
were difficult for them to grasp or imagine. As a concept they seemed to appeal,
providing they did not interfere with the programme.
• i-Tv would not impact on what they viewed; rather it would allow them more
flexibility in their lifestyle.
• It was quite obvious that the family homes differed from each other in terms of
uses for television, and that these homes differed in their extra-television
activities, in ways which would be relevant for the consumption and use of
particular services (i.e. some services simply did not 'exist' for certain
households).
• It was quite clear that interactive radio was of little interest to interviewees.

Erlandson et al (1993) point out that in naturalistic research analysis is continuous,
that the "analysis of data interacts with the collection of data" (p.130). This suggests
the flexibility inherent in this approach. Subsequent interviews are shaped by what
has been learned from previous encounters; interviews in process may change with
respect to what is being offered by the co-researcher. New opportunities for data
collection are seized upon as they occur and are considered relevant. Such an organic
approach maximises the research process within real-world and often chaotic
circumstances. Research and analysis are never fully complete; there is always
something, some angle, which was not fully exploited or explored. This was the case
in the user research of the Cambridge Trial. The research process was in the end
compromised from its full potential. The reasons for this compromise may be
summarised thus:

1. The employment of a semi-structured interview schedule. While this permitted
some degree of standardisation of the results, it also challenged the notion of
natural trajectories within the interview process. As such, interviewees were guided to
irrelevant questions (such as being asked about their impressions of advertisements
which were not on the system), and were generally swept along with the pace of the
questions as laid down by the schedule. There was some evidence of answers
being 'invented' or 'forced' for questions.

2. There was a definite lack of consistency between the interview styles of
interviewers. This resulted in some interviewee answers being closed down on
points they perhaps wished to emphasise, possibly due to the interviewer's notion
of what was relevant or non-relevant to the discussion. Different interviewers
have different ways in which they communicate with people in intimate places
such as their homes. It is quite easy to imagine that some researchers have
particular talents for making interviewees feel relaxed and open, free to present
their 'genuine' impressions of phenomena, whereas others may unconsciously act
to inhibit the free flow of thoughts and feelings regarding subjects. This is a
difficult problem which must impact, to a greater or lesser extent, much human
subject research, and is itself an artefact of insurmountable individual differences
and experience.

3. Logistical difficulties plagued this project due to the inclusion of a third-party
firm for arranging interviews. As noted, I experienced difficulties (fatal in one
instance) with my interviews, and it was only down to luck, and the flexibility of
myself and the interviewees in rescheduling and fitting in the interviews on spec,
that they took place at all. It is not unreasonable to imagine flying down to
Cambridge on that day only to return with no interviews whatsoever. The use of
such interviewee recruitment agencies seemed common practice for NOP, who
obviously use this company on a regular basis.

4. Semi-structured interviews presume something of the communicative abilities of
the interviewees. Those who are more 'vocal' and can articulate in a much
richer way than others may tend to dominate at the level of analysis, particularly
when this is done at a casual level. What was indicated by NOP was that the
interviews would not be subjected to transcription, and that for their purposes it was
only necessary to lift out sentences from listening to the recordings. Such a
method may leave itself open to reporting the feedback of certain
interviewees over the subtler, but nevertheless relevant, feedback of less articulate
or outspoken interviewees. Such a problem is of course framed within the larger,
more pervasive difficulties of the interview process as a social science research
implement, but attention to questions of interviewee articulation should perhaps
be made to frame each of the interviewees' responses. This could be derived from
realising the benefits of a more discourse- rather than simply content-oriented
approach at the analysis stage.

Chapter discussion

Yin (1989: p.23) suggests that case study research is an empirical inquiry that
investigates a contemporary phenomenon within its real-life context; when
boundaries between phenomenon and context are not clearly defined; and in which
multiple sources of evidence are used: " . . . case studies are the preferred strategies
when 'how' or 'why' questions are being posed, when the investigator has little
control over events . . ." There is little in the literature which addresses conducting
research within contemporary consortium environments. While, as an organisational
structure, consortia are hailed as an example of how the 'new economy' is affecting
business, they offer a particularly rich example of how organisational and political
manoeuvring can frustrate particular objectives - in the case outlined in this study,
the design and implementation of consumer-user research.466 The case also
showed how a group consisting of large companies from quite different industry
sectors encountered, and attempted to cope with, a radically new area of operations -
the production, distribution and understanding of digital networks as an alternative
channel for their products and services.

There is also an impoverished literature to date which is directly devoted to
understanding the potentials of digital networks in domestic locations, compared to
the vast literature on computing and communication in the workplace.

The Cambridge trial and its technology most definitely represented a vanguard
opportunity to come to grips with the kinds of questions that the new era in domestic
media could suggest. This brought to bear an interest in the contexts of interaction,
the main distinguishing functional component of interactive over existing forms of
television. And in particular the production and design of interactivity, its facilitation
and its reception.

In a purely technical sense there was little difference between Acorn's STB

466
This was the explicit aim of the PSPs, who after all were solely interested in the business potential of
the systems.

technology and their network computer (NC), produced under the tutelage of Oracle
Corporation's CEO Larry Ellison.467 In fact in many respects there was little essential
difference between either of these products and their immediate antecedent - the
Acorn RISC PC. Essentially, what distinguished the RISC PC from the STB and NC
was similar to that which distinguishes a games enthusiast's PC from a standard office
PC - sound cards, graphics cards, RAM etc. Also, the shape, the design and the
colour of the box were different. These were each simply different boxes containing
different configurations of Acorn hardware, ARM chips and input/output (I/O) cards.
The question arises – were Om selling 'interactive television' or were they selling
ARM chips? The case has suggested that it was in fact both, but with an emphasis
upon the latter. For the most part notions of 'user-research', 'interactivity', even
'interactive television' and 'lifestyle technology', mattered little to them. These
concepts were merely the means to an end of developing and selling chips (and
related technology of sister companies such as ATML and SJ Research).

However, there were major differences between each of these boxes in the minds of
those who were most intimately familiar with the technology, and who were required to
characterise it and get it working. The designers and developers had a job which was
to create a vehicle which would match particular visions of the interactive user and
audience. Each 'box' - PC, STB or NC - represented a different system concept of
delivering on-line information to the home. Perhaps at the opposite end, each box
was very different in marketing terms, as each system represented a new promise of
the elusive advanced media mass market, the perceived market in which the firm
imagined people desiring, acquiring and using their product.

Technological change or innovation today often occurs as projects such as the
Cambridge Trial - events happening within the other flows of normal business that
companies conduct. Many of these projects are complex in organisational character,
involving various actors from many different organisations. Some projects may 'spin
out', becoming separate firms or operating divisions. Some, as was the case here,
derived from opportunity. But even so, there are still observable trends in the general

467
Ellison had visited Om and laid out plans for them to do the reference design for the NC.

market, as well as indications of the state-of-the-art of what technologies and
services can presently offer. The wiser firms keep abreast of these, and properly link
the opportunity to parse evolving consumer needs with emerging technological
potentials, in ever-reflexive loops of innovation activity:
"The different technological trajectories and their technological opportunities
do not coexist unrelatedly, but are connected by several influences, devices
and feedbacks. Therefore, a single technology cannot be explained in
isolation but should be understood in a broader framework. Improvement in
one technology can create totally different applications in other technologies
or even new technological opportunities. Accordingly, nearly exhausted
trajectories can be influenced by other innovations and technology fields
which open up new opportunities." (Pyka, 1997: p.208)

Perhaps it is here that the symmetry between 'cultures of use' and 'cultures of
production' is lost or confused. What appears in the marketplace to create the gestalt
of available products is hardly an infinite range. Even where there is express demand
for functionality, as there was on the Cambridge Trial, forces that lie completely on
the supply side of the equation hindered these customers from receiving not only
what they anticipated, but indeed what they were promised.

Within the Cambridge Trial, creating the 'mortar' which would join the components
together and get them working as an effective whole was only one part of a more
complex whole of learning, developing and understanding. For instance, much of the
content (comprising largely the games and educational software) and the video
footage were drawn from Acorn's education division and Anglia Television
respectively. Development work was needed on both of these elements, such as
'porting' the software to the system – making it work on the new platform and with
the remote control rather than a PC keyboard. The video footage also required
editing and digitisation. The largest piece of 'mortar' work was the interface
development.

The interface is the site where not only must all the functional aspects of the system's
purpose converge in relevant, purposeful and useful ways, but where the system must
also meet the user in a representational and meaningful way. A comparison can be
made with the learning of a new language: where the telegraph demanded the
interpretation of Morse code, the operation and use of the telephone was comparatively
easy. It presented an ease of use and quality of communication in such an acceptable
ratio that it became of utility - 'useful' - and, as a result, habits, situations and
conditions of use developed through its integration into the everyday life and affairs
of people (de Sola Pool, 1977). But this is often a task not only of engineering but
also of aesthetics, and of social learning and cognitive sensitivity and understanding.
In addition to the interface, there were further content elements which had to be
developed as further partners joined the content and services group – the principal
service providers (PSPs). These included catalogue-style screens depicting goods,
interactive advertisements and the online surveys and questionnaires.

This was an arena that presented a real challenge to academic research, most
predominantly in terms of attaining access and trust. Companies, like individuals, are
not pleased to open their souls at a time when they feel threatened or not entirely in
control. From my own perspective the most difficult obstacle was the constantly
shifting 'first point of contact'. Was this with Om, the trial staff or the working group
on user-research? This led to a kind of 'navigation' within the social structure of the
trial as it unfolded as a social process. Not only that, the trial also unfolded as a
technical process: the system, interfaces and technology changed over time. It did not
remain constant. This was a complex sociotechnical phenomenon, with multiple
constituencies that waxed and waned over the course of the project. On a personal
level, there were times when I was empowered and other times when I was powerless
to influence my involvement in the trial.

On many occasions, I found myself having to reiterate my purpose and relationship
to new members of the service nursery (often by proposal, or extended introduction).
Each time this was interpreted differently, depending upon the new member's
commercial orientation with respect to their core business, or their interest in the
trial. During this time I was returning to Edinburgh, where I was developing the
theoretical aspects of the work. This was subject to constant revision, responding to
issues of a social or technical nature as they arose. For instance, the Acorn
consultants' desire to implement QFD, or the shift in the user-research group's interest

towards interpretation of their content material. This drove me to explore the
considerable literature that addresses these areas, and shaped the evolving ideas of
CU (and particularly the promise of user-research in constituency-building).

Beyond the visits to Om documented earlier, I engaged frequently in casual
conversations with those working on different aspects of the project, which also
directed me towards specific areas of study. For instance, a member of the Om
marketing team was interested in how the 'video violence' debate influenced
perceptions of i-Tv.468
468
Date: 27 May 1981 0209-EDT (Wednesday)
From: Gary Feldman at CMU-10A
Subject: sociology and psychology of computer use

Permit me to carry the doom-crying one step further. I am curious
whether the increasingly easy access to computers by adolescents will
have any effect, however small, on their social development. Keep in
mind that the social skills necessary for interpersonal relationships
are not taught; they are learned by experience. Adolescence is
probably the most important time period for learning these skills.

There are two directions for a cause-effect relationship. Either
people lacking social skills (shy people, etc.) turn to other
pasttimes, or people who do not devote enough time to human
interactions have difficulty learning social skills. I do not know whether
either or both of these alternatives actually occur.

I believe I am justified in asking whether computers will compete with
human interactions as a way of spending time? Will they compete more
effectively than other pasttimes? If so, and if we permit computers
to become as ubiquitous as televisions, will computers have some
effect (either positive or negative) on personal development of future
generations?

I am not trying to be anti-technology. In fact, my hunch is that the
answer to the above questions is either no or only slightly. However,
as an ethical computer scientist, I believe in asking these questions
in advance.

One aid in answering these questions is to get psychological profiles
of people involved with computers (not necessarily demographic data).
A direct psychological survey would be most precise. However, getting
indirect data such as gender, marital status, membership in
fraternities/sororities, etc. would also be useful, if properly
interpreted. The only reason for picking on sexual preference (apart
from the unfounded claims that have been made in these digests) is the
slight correlation between sexual preference and other psychological
factors.

Anyone have other ideas for evaluating the psychology of using
computers? I would certainly like to see some sound research efforts
in this direction, although I don't for a minute believe that the

The members of the working group often had multiple roles, tasks and projects they
were working on within their firms. They came to the group not only with general
directives agreed upon with their respective firms and upline managers, but also with
individual perceptions of the technology and the trial based upon their own existing
knowledge, expertise and viewpoints. They also brought to the meetings something
of their company culture, and again, their own individual interpretation of it. This
shaped impressions of what was, and was not, valuable in the user-research project.
It also dictated what could be expected from the trialists. For instance, the BBC came
from a culture where their public was termed viewers. Nat West and Tesco on the
other hand had customers, NOP had subjects and samples, Om had users - each
viewed members of the 'public' in often quite different ways. This manifested itself in
different values and feelings about how to approach and deal with trialists.

This knowledge also had to fit in with other preoccupations which members had at
the time. Unlike Om and myself, they were not dedicated to a full-time focus on the
trial. This impacted levels of commitment and motivation, which varied within the
group. This also influenced its functioning. What was taken from the meetings fed
back into quite different company structures, and so most likely came to 'mean' quite
different things with respect to developing perspectives of the trial in general, or how
to iterate and innovate a particular service offering. Proposals I made often
manifested later, sometimes 're-engineered' by others (such as the NOP background
questionnaire being close to my own 'media and leisure activities questionnaire' and
BMP's qualitative 'check list' being similar to earlier proposals I had passed to
them).

There were clearly issues regarding research methodology that were raised during the
various dialogues regarding implementation. NOP qualitative were quite keen to tape
interviews, but not to transcribe them. They favoured 'lifting' out comments that seemed
to reflect the aims of the project. Whereas in academia there may be some pre-
occupation with method and rigour, as research is often tested on its methods as

economics of research would permit such efforts. [source:


http://www.cs.rutgers.edu/~cwm/NetStuff/Human-Nets/Volume3.html]

much as its results, this may be swayed in private sector social research for the
purposes of result and effect. An academic researcher proposing exacting methods
can appear in such dynamic innovation environments as too slow, pedantic, resource-
and time-consuming. In industrial settings the lust for results demands quick fixes,
often at the expense of rigour.

Also, there was a privileging of quantitative over qualitative information. I discussed
this bias with respect to scientific investigation. The reasons why quantitative
investigation was favoured here include the research routines of NOP, and the
seduction of the automatic production of use data by system-logging. The promise of
'automatic' understanding of use and usage is extremely attractive.469 Marcus Penny's
view was that i-Tv opened up the potential for instant feedback from users. It would
provide a marketing or product/service development department with the opportunity
to test ideas out on user-consumers, and the feedback would dictate the adoption of
the new product, service or process:

". . . most businesses are producer businesses somebody sits there in a room
celebrating creating something and there is a very, very long chain down to
pushing it out, and the feedback back from users back to here is very, very
imperfect . . . an individual programme producer can create something test it
out and get some instant feedback . . . what will that do for the nature of
television?"

But in a system, constituency, or network which is highly dynamic, constantly
reactive, and ever changing, I contend that there would be little opportunity for things
to remain stable enough to make proper inferences, or the time and space to
formulate and ask relevant questions. But the services manager viewed this as
symptomatic of much wider cultural change - "we're coming to be in a reflexive
world . . . what happens is that you run the reflexivity and I think it gets to the point
of stability emerges its a question of managing through to that."

Marcus Penny stressed the definite need for feedback, a symbiosis of developing

469
A primary business rationale behind the QUBE trial was to use interactive cable for audience
research. It allowed cable operators to monitor what channel people were watching at the time
(Davidge, 1987). Such 'automatic' registering of behaviours lies at the core of 'post-fordist' methods of
understanding consumer behaviour.

services and content with inputs derived from the user-consumer's tastes, interaction
styles and choices. He considered one of the most important elements of reorientation
to this new way of doing business and producing media to be that you will simply not
survive unless you take intimate account of the feedback. Business must ultimately
run on the basis of interaction and feedback.

However, Penny saw the notion of interaction and feedback in much more global
terms with respect to the Cambridge Trial and i-Tv. He viewed such an approach as
developing an entire generation beyond this. Digital media permit entire business
processes to become interactive – from manufacturing to retailing and customer-care:
"given that the whole process is interactive . . . we're actually building in quality . . .
its an inherent process . . . and you don't need to bring it in from outside as a separate
process." It is the application of re-engineering to a system that is inherently un-
reengineerable. It is inherently non-linear. Such a view bears relation to the non-
linear theories of innovation and R&D (such as when Fleck, 1988, speaks of
innofusion – innovation through diffusion).

Opposed to simplistic models of linear innovation, this non-linear view recognises
and draws attention to the way in which feedback loops can occur at all stages of the
innovation-diffusion continuum. Returning to the theme of order rising from chaos,
the Services Manager viewed that standards formation arises from such crises: 'if
you believe in the approach natural standards emerge out of a dynamic process and
are stable because the system keeps them in place'. He sees that crucial to the role of
the 'new manager' (one which is in keeping with the new style of organisation) will
be managing processes of crisis and chaos:
" . . . its something which you cannot plan and direct in the way that your
used to it in a mechanical view of the world nevertheless there are structures
and if you understand the behaviour particularly in the moving from stability
to another . . . you can encourage that process . . . if you understand what lies
behind that stability you can encourage or interact with it . . . but you've
actually got to observe quite closely what's happening."

However, much like the vision for the self-governing service nursery, it seems that
this view remained somewhat utopian in its faith in the system as it stood, and in the
users to act as 'intelligent' parts or functions of the system. It is clear from the
above sample of users that there was considerable variation in trialists' reactions and
attitudes towards the system. Many of these would stand to confound inferences
beyond the fact that there was little use of the system due to lack of content.

Such research should also be contingent on identifying exactly what the knowledge
objectives of each company are, rather than trying to muddle through with a jointly
arrived-at compromise which suits neither the collective nor individual interests and
needs of the group. This appeared to be the case here, whether it was down purely
to poor management and co-ordination, or lack of communication flow, or simply
to some companies not having an exact idea of what they wished to derive from the
research. I would argue that it was also due to the social aspects of the trial being
much less tangible as a discursive practice than discussion of technology. A working
technical system did exist in the Cambridge Trial; however, there was no real product
emerging from the user research, and little product emerging in terms of satisfactory
content.

It is obvious that there was some conflict between myself and the self-appointed
chairman and co-ordinator of the group - Seth Paladopicous. This may have
stemmed from an over-ambitious initial presentation at a time when the group was
only finding its feet within this new venture. It may also have stemmed from
NOP's obvious wish to be considered 'the' experts in social and market research,
whilst perhaps realising their own lack of expertise in coping with the particular
characteristics of interactive media. Their expertise was in mass media (and most
significantly large-scale political polling). In addition, Paladopicous had explicitly
raised the point that I was a 'freeloader' in that I did not represent an organisation
which had contributed some £50,000 in order to 'learn' from the trial. I was
merely offering my services free of charge, and was interested and willing to invest
my time and the resources of the ESRC (in funding my travel and accommodation
while doing field research) in order to interview trial participants for my own, and
the group's, benefit. This certainly compromised my position. In the end, however, all
data - qualitative and quantitative - were offered to me for analysis, regardless of my
exclusion from the group.

The Om case outlined in the previous two chapters illustrates something of the
dilemma which is encountered when trying to elucidate whether an innovation is
technology-push or market-pull. While few commentators would argue that they
exist in a pure form, the Om case suggests the realities of innovation - that chance and
opportunity play a large part in such processes. Who the end-consumer-users are
remains indistinct.
"The market demand may come from private firms, from government, or
from domestic consumers, but in its absence, however good the flow of
inventions, they cannot be converted into innovations . . . Some scientists
have stressed very strongly the element of original research and invention and
have tended to neglect or belittle the market" (Freeman, 1982: p.109)

Returning to Woolgar's (1991) 'technology as text' thesis, this concept applies
equally to those who become involved with the processes of design and production
as to those who finally use and consume such products in domestic spaces. It
comprises all those who are involved in its propagation, production, and use.
Arguably, however, the working group on user research seemed very distant
from the practice of technology development.

Predominant in the Cambridge Trial were the technology partners and the PSPs.
They were users of the system, and they applied their own particular interpretation of
what it would and could do for their businesses. This, as already maintained,
entailed different levels of motivation and commitment to make it work. This is
perhaps where Marcus Penny's "common interest that it [the service nursery] should
exist" indicates a certain presumption that everyone would magically bond
through shared visions of the system and service future. However, one aspect of the
trial which was shared mutually was an interest in how the public, the ultimate target
market for the product, would react. The PSPs for instance had no real interest in the
technology; they wished to evaluate the content and service potentials of the system.

The purposes of these trials have included technology testing, market positioning and
application and content development. However, it is also worth mentioning that trials
have been more often announced than run, and more often run than rigorously
evaluated. No trials have been run to perfect consumer-user analysis. Nor have
they been run to develop research approaches towards interactive media. They
have been run to test new business potentials. However, the promise of user feedback
is crucial in the negotiations enrolling support both within the firm and from
potential partners (Nicoll, 1999). Working 'images' of the technology and of the
users are the rhetorical tools that guide development. Both were 'texts', and purely
elements within the discourse of these meetings.

Lessons for firms

Several key points may be summarised from this study:

 Trials have distinctive social and technical elements. And these are
distinctive within their own categories – i.e. use of one technology may vary
from that of another in subtle and obvious ways. There is a real danger in
melding human and non-human actors in analysis. It blurs their unique
properties, which may be useful for academic analysis, but may compound
an already endemic industry view which has it that users are only an
intelligent part of the technical system - whether they are planned,
anticipated, scenic, or actual and real. They are basically viewed as elements
which respond, or will respond, and that provide, or will provide, data useful
to consolidate business plans and goals.

 Retrospection, while an essential starting point, should include a direct
appraisal of the usefulness of the technology. If designers can use the
technology and services with their families and friends, without responding
emotionally to any criticism they may level, then they are half-way to
creating a good product. Warning signs are when staff do not want to live
with their own product.

 Informal meetings with users, consumers and trialists can provide valuable
data, as can more formal or technical means of research, such as usability
testing, online questionnaires etc.

 Working with consumer-users or trialists, as well as partners in product and
service development, can operate to everyone's benefit. Getting people
involved, even if one has to play the 'education card' – i.e. wiring up the local
schools, the favourite strategy of computer firms - can breed new uses.

 Appoint a member of staff to articulate and co-ordinate knowledge flows,
both internally and externally, with partners and with consumer-users and
trialists. Be ecological with communication, reduce noise, and make sure
that the right people get the right information. Such a person should act as
an exchange of knowledge and work towards developing the kind of trust
that is integral to such a role.

 Understand and consolidate organisational structures (and knowledge flows)
through feeding back structures to those involved, and inviting them to
comment (again, ecological communication rules should apply). There should
be an active effort to minimise presumption but to accent creativity and
different views.

 Staff and partners are valuable assets, but so are consumer-users and
trialists. It is not bad to consider blurring the distinction between social
actors, and to bring 'common sense' to bear in highly innovative projects.
Everyone in such a network or constituency can help ground ideas into
worthwhile, useful services, which provide good experiences in use and
encourage frequent and long-lasting usage.

Further work, issues, and ideas

The extent of Acorn and Online Media's efforts and travails illustrates the problems
of gaining recognition and authority in a fluid business environment characterised
by confusion, imperfect knowledge and intelligence, and friction among competing
parties. The manufacturing and marketing of not just a new product, but a new
technology and service system, within a new management movement and within a
changing, contested industrial terrain, posed special difficulties and necessitated bold
tactics, especially as Acorn Online Media were, essentially, a small computer
company striving to retain financial independence in a milieu increasingly dedicated
to economies of scale. The firm's shifting tactics, their continual realignment of
their STB technology and techniques in relation to their sense of the state of the
internet and e-commerce, and their striving to build an identity unique among
competitors such as British Telecom (BT), manifest the ways in which they shaped
their product and themselves along political, rather than strictly scientific-
technological, lines. Accordingly, their experiences argue well for the integration of
micro-political analysis into scientific-technological history.

Hunt (1994) argued that although we know a lot about how companies compete in
the marketplace, we know little about how they collaborate. In this study I have only
scratched the surface of what is perceived to be an area of outstanding importance for

the further development of new media organisation in the future. The Cambridge
Trial, its managers, designers and participants offered a rich environment for
exploring the multiple dimensions and reality building in the process of design and
management of a new media system. Further, it provided an opportunity to consider
experiential approaches in the technological and marketing evaluation of what may
be considered a radical or discontinuous innovation - from an organisational
perspective as well as the perspective of use. The full nature of the domestication
process with respect to the case of the trial nevertheless remained elusive. Any
'symmetrical model' of design/use, producer/consumer was in effect unrealised in
this case. Certainly, concepts of domestication were anticipated
in the design of the system. However, there was little to evidence Silverstone and
Haddon's (1996: p.46) advocacy that "design completed in domestication." This can
only happen when technologies are successful, when they fulfil designer-producer
and consumer-user expectations. The system, and in particular the content aspects of
the system, never matured, nor was it ever developed to the extent necessary to
constitute the naturalised use process of domestication. Instead, it remained an
artefact, an anomaly within the house, transparent not in use, but through lack of use,
usage and usefulness. Lack of content options led to lack of participation, and
therefore the technology can be considered only evocatively - capable of providing a
base from which trial participants were able to comment and project upon 'if it did
work' or did fulfil all it was built up to provide.

CU as an applied research approach may benefit by serving as an index of what can
be ecologically studied at various stages of technological development. The scope
and scale of consumer-user involvement in the technological and marketing
development of programmes may be viewed as a co-evolutionary and co-
developmental process. The cultural distance of designers-producers from consumer-
users may be reduced, leading towards the grail of ever-more useful and usable
products. What will be interesting will be investigations into other, different product
groups and categories - perhaps where context plays a larger part in the use process
and the domestication of products.470
470
Such an opportunity currently exists at the time of writing with the author’s involvement with a
Design Council project: Information-intensive products. This is concerned with the application of CU

This brings to bear a number of issues regarding people's ability to anticipate and
imagine functionalities and modes and conditions of use. Will such projections suffer
from the problems experienced by marketers in the application of 'soft' research
approaches such as product concept testing? What is also clear from the empirical
work in the organisation of the trial is that it is important to form a more explicit
understanding of what partner organisations require from consumer-user research.
This suggests that the individual way the consumer-user is expressed within each
partner's institutional perspective should be accounted for, as should their motivations
for eliciting consumer-user information and how it will be applied or will inform
strategies.

There seems to be an implicit and perhaps somewhat restrictive boundary in place
between the current design, consumption and media literatures. Most articles within
the design literature which look at interpretivist methods tend to focus very much on
the application of such ideas to the design process itself - contextual inquiry is an
example of this. On the other hand, when understanding users is the focus of
interpretivist interests, HCI tends to be the forum for discussion, but a forum which
seems to have a less sophisticated understanding of interpretivist approaches, perhaps
as a result of a more highly focused area of interest (cf. discussion of the implicitly
extreme stance of ethnographic approaches, above). Suchman's (1995) article is very
much concerned with modelling users' activities.

The essence of this book has placed a strong emphasis on co-evolving systems of
design, production, meaning-making, myth-making and use. It has stressed the
importance of dynamic conceptions of context upon a reality that pivots around the
process of use. What further can be expected from a semantic model of design,
consumption, use and domestication? Further work will consider how this may be
mapped in such a way as to constitute guidelines of 'best practice' for designers and
those who are stakeholders in the design and implementation of technology and
marketing trials.

as a framework for investigating a range of so-called 'smart products' - including the range of products
which contain chips supposed to raise the 'intelligence' of everyday products and objects.

References

Abrahamson, E. (1996). Management fashion. Academy of Management Review,
21/1: pp.254-285.
Agar, M.H. (1980) The Professional Stranger. New York: Academic Press.
Alavi, M, and E. Joachimsthaler (1992), "Revisiting DSS Implementation Research:
A Meta-Analysis of the Literature and Suggestions for Researchers," MIS Quarterly
(16)1, pp. 95-116.
Andersen, E.S. & Lundvall, B-Å. (1988) 'Small national systems of innovation
facing technological revolutions: an analytical framework', in C. Freeman & B-Å.
Lundvall, eds: Small Countries Facing the Technological Revolution, London:
Pinter, pp.9-36.
Ang, I. (1991) Desperately Seeking the Audience London: Routledge
Araya, A.A. (1995) 'Questioning Ubiquitous Computing' Communications of the
ACM pp.230-237.

Arendt, H. (1958) The Human Condition London: University of Chicago Press

Argyris, C., & Schon, D. (1978). Organizational learning: A theory of action


perspective. Reading, MA: Addison-Wesley

Argyle, M. (1992) The Social Psychology of Everyday Life London: Routledge

Argyle, M. (1996) The Social Psychology of Leisure Harmondsworth: Penguin

Arnold, S. J. and Fischer, E. (1994), "Hermeneutics and Consumer Research,"


Journal of Consumer Research, 21 pp.55-70.

Arrow, K.J (1962) 'The economic implications of learning by doing' Review of
Economic Studies 29 ; pp.155-73.

Atkinson, P. (1983) 'Writing Ethnography' in J. J. Helle (editor) Kultur und


Institution. Berlin: Duncker und Humblot.

Ashcraft, D. and Slattery, L.S. (1996) Experiential Design Strategy and Market
Share Design Management Journal, Vol. 7, No. 4, Fall 1996

Bailey J and Pearson S, (1983) Development of a tool for measuring and analysing
computer user satisfaction. Management Science, 29, pp.530-545.

Bannon, L.J. (1991) 'From human factors to human actors: The role of psychology
and human-computer interaction studies in systems design' in Greenbaum, J. & Kyng,
M. (Eds.) Design at Work: Cooperative Design of Computer Systems. Hillsdale:
Lawrence Erlbaum Associates, pp. 25-44.
Barker, R.G. (1968) Ecological Psychology. Concepts and Methods for Studying the
Environment of Human Behavior Stanford: Stanford University Press.
Barker, M. & Petley, J. (eds) (1997), Ill Effects: The Media/Violence Debate,
London: Routledge.

Barker, P. (1994). 'Designing Interactive Learning' in T. de Jong & L. Sarti (Eds),


Design and Production of Multimedia and Simulation-based Learning Material.
Dordrecht: Kluwer Academic Publishers.
Bartlett, F.C. (1932) Remembering. Cambridge: Cambridge University Press.
Baroudi, J.J., Olson, M., and Ives, B. (1986) "An Empirical Study of the Impact of
User Involvement on System Usage and Information Satisfaction," Communications
of the ACM (29:3), March , pp. 232-238.
Barthes, R. (1972) Mythologies London: Vintage
Barthes, R. (1974) S/Z. (Richard Miller, Trans). New York: Hill and Wang
Bateson, G. (1980) Mind and Nature; A Necessary Unity Glasgow: Fontana
Baudrillard, J. (1968) The System of Objects, reprinted in Design After Modernism,
ed. John Thackara, London, 1988, pp.178-179.
Baudrillard, J. (1983) Simulations. New York: Semiotexte(e).
Baudrillard, J. (1988) 'Consumer society' in Mark Poster, ed., Jean Baudrillard:
Selected Writings Cambridge: Cambridge University Press.

Belk, R. W. 1975. "Structural Variables and Consumer Behavior." Journal of


Consumer Research 2 (December) pp.157-164.

Benedetti, P. and DeHart, N. (1996) Forward Through the Rearview Mirror:
Reflections On and By Marshall McLuhan Scarborough, Ontario: Prentice-Hall
Canada Inc.

Bennett, A. and George, A.L. (1997) 'Developing and using typological theories in
Case study research' Paper presented at the 38th Annual Convention of the
International Studies Association in Toronto, March 18-22.

Bentley, R., Rodden, T., Sawyer, P. and Sommerville, I. (1992) 'An Architecture for
Tailoring Cooperative Multi-User Displays' Proc. ACM Conference on Computer-
Supported Cooperative Work, Toronto pp.187-194.

Berger, P. L., & Luckman, T. (1966). The social construction of reality. Garden City,
NY: Doubleday.

Berkowitz, S. D. (1982). An introduction to structural analysis: The network


approach to social research. Toronto: Butterworth.

Betti. E. (1980). Hermeneutics as the general methodology of the human sciences. in


Bleicher, J. (Ed.), Contemporary hermeneutics: Hermeneutics as method, philosophy
and critique London: Routledge & Kegan Paul. pp.51-94.

Bijker, W. & Law, J. (1992) Shaping technology/building society: Studies in


sociotechnical change. Cambridge, MA: MIT Press.

Bjørn-Andersen, N. (1988) 'Are "Human Factors" Human?' The Computer Journal


31(5): pp.386-390

Black, J. and Bryant, J. (1995) Introduction to Communication London: Brown and


Benchmark

Bloor, D. (1976) Knowledge and Social Imagery London: Routledge and Kegan
Paul

Bleicher, J. (1980) (Ed.), Contemporary hermeneutics: Hermeneutics as method,


philosophy and critique London: Routledge & Kegan Paul

Boring, E. G. (1950). A History of Experimental Psychology (2nd ed.). New York:


Appleton-Century-Crofts

Bott, E. (1957) Family and Social Network London: Tavistock

Bounds, G., Yorks, L, Adams, M. and Ranney, G. (1994) Beyond Total Quality
Management: Toward the Emerging Paradigm London: McGraw-Hill

Brach, M.A. (1991) 'Contextual knowledge and the diffusion of the technology in
construction' unpublished Ph.D. book MIT

Brand, S. (1994) How Buildings Learn: What Happens After They're Built London:
Viking

Branscomb, A.W. (1997) Wired 5.9 Sept. 1997 p.113.


Bricken, W. (1990) 'Cyberspace, 1999: The Shell, The Image and Now The Meat.'
Mondo 2000 (2). p 56–59
Brookes, B.C. (1981) 'The foundations of information science.' Journal of
Information Science 2 pp. 3-12.

Brown, J.S., Collins, A. & Duguid, P. (1989) 'Situated cognition and the culture of
learning.' Educational Researcher. 18 (1): pp.32-42.

Brown, J.S. and Duguid, P. (2000) The Social Life of Information Boston: Harvard
Business School Press
Bruner, J. S. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard
University Press.
Bucciarelli, L., (1994) Designing Engineers. Cambridge, MA: MIT Press
Buchanan, R., Doordan, D., and Margolin, V. (Eds) (1996) 'Introduction' Design
Issues, Volume 12, Number 3 (Autumn 1996).
Buckelew, D.P. and Penniman, W.D. (1974) 'The outlook for interactive television'
Datamation August pp.54-58.
Bullen, C. & Bennett, J. (1990) Learning from user experience with groupware. In
Proceedings, CSCW '90, October, Los Angeles, CA, pp. 291-302.
Buller, D.J. (1999) Function, Selection, and Design. Albany, NY: SUNY Press,
Series in Philosophy and Biology
Burke, D. (2000) SpyTV: Just Who is the TV Revolution Overthrowing? Hove:
Slab-o-Concrete Press
Burch, W.R., Jr. and DeLuca, D. (1984) Measuring the social impact of natural
resource policies. Albuquerque: University of New Mexico Press
Busch, A. (1999) Geography of Home : Writings About Where We Live Princeton:
Princeton Architectural Press
Callon, M. (1980). 'Struggles and negotiations to define what is problematic and
what is not. The socio-logic of translation.' In Knorr-Cetina, K.D., Krohn, R. and
Whitley, R.D. (eds), The Social Process of Scientific Investigation: Sociology of the
Sciences Yearbook Dordrecht: Reidel
Callon, M. (Ed.). (1998). The Laws of the Markets. Oxford, Blackwell and the
Sociological Review
Callon, M. (1986). 'Some elements of a sociology of translation; domestication of the
scallops and the fishermen of St Brieuc Bay.' In Law J. (ed.), Power, Action and
Belief. A New Sociology of Knowledge? Routledge and Kegan Paul, London.
Callon, M. and Latour, B. (1981). 'Unscrewing the big Leviathan: how actors
macrostructure reality and how sociologists help them to do so.' In Knorr-Cetina,
K.D. and Cicourel, A.V. (eds), Advances in Social Theory and Methodology: Toward
an Integration of Micro- and Macro-Sociologies, Boston, Mass: Routledge and
Kegan Paul. pp. 277-303.

Callon, M., Law, J. and Rip, A. (1988) Mapping the dynamics of science and
technology: Sociology of science in the real world. London: Macmillan
Capra, F. (1996) The Web of Life. Harper Collins, London.
Carey, J. (1996) 'Critical Analysis of Interactive Television Developments in the Last
Three Decades' Presentation at i-Tv'96: The Superhighway Through the Home? The
University of Edinburgh, 3rd, 4th, 5th September 1996.
Carey, J. and O'Hara, P. (1995) 'Interactive Television' in (d'Agostino, P. and Taffler,
D. eds.) Transmission: Towards a Post-Television Culture (2nd. Ed) London: Sage
Carroll, J. M., (1994) 'Making use a design representation' Communications of the
ACM 37, Number 12, pp. 29-35.
Carroll, J. M., (1997) 'Human-computer interaction: psychology as a science of
design' International Journal of Human-Computer Studies 46, pp.501-522.
Carroll, J. M. & Rosson, M. (1992). 'Getting around the task-artifact cycle: how to
make claims and design by scenario.' ACM Transactions of Information Systems, 10,
pp.181-212.
Castells, M. (1996) The rise of the network society Oxford: Blackwell.
Chabaud-Rychter, D. (1994) 'The construction of women users in the design process
of a household appliance' Domestic Technology and Everyday Life Proceedings from
COST A4 workshop in Trondheim, Norway Oct. 28-30 1993 pp.83-95.
Charmaz, K. (1995). 'Between positivism and postmodernism: Implications for
methods.' Studies in Symbolic Interaction, 17, pp.43-72.
Cherry, C. (1977) 'The telephone system: Creator of mobility and social change' in
Pool, I. de Sola (Ed.) The Social Impact of the Telephone London: MIT pp.112-126.
Cicourel, A.V. (1964) Method and Measurement in Sociology. New York: Free Press
Clark, G. (1992) Space, Time and Man: A Prehistorian's View Cambridge:
Cambridge University Press
Clifford, J. (1986). "Partial Truths", in: Clifford, J. and Marcus, G.E. (eds.): Writing
Culture. The Poetics and Politics of Ethnography. Berkeley: University of California
Press, pp.1-29.
Clipson, C. (1991) 'Innovation by design' in Henry, J and Walkers, D. (1991)
Managing Innovation London: Sage
Cole, Michael (1991). 'Conclusion' In Resnick, L.B., Levine, J.M and Teasley, S.D.
(eds.) Perspectives on Socially Shared Cognition. Washington, DC: American
Psychological Association. pp. 398-417.

Cohen, L. (1995). Quality Function Deployment: How to Make QFD Work For You,
Reading, Mass: Addison-Wesley
Cohen, W. and Levinthal, D. (1990) Absorptive capacity: A new perspective on
learning and innovation, Administrative Science Quarterly. vol. 35, pp.128-152.
Cooper, G and Woolgar, S (1994) 'A sociological study of changes in research
culture: a project report' CRICT discussion paper: Brunel University
Cooper, R. and Press, M. (1995) The Design Agenda Chichester: John Wiley and
Sons
Coolican, H. (1990) Research Methods and Statistics in Psychology London: Hodder
and Stoughton
Cova, B. and Svanfeldt, C (1993), 'Societal innovations and the postmodern
aesthetization of everyday life' International Journal of Research in Marketing,
10, pp.297-310.

Cowan, R. S. (1983) More work for mother: The ironies of household technology
from the open hearth to the microwave. New York: Basic Books.
Craven, P. and Wellman, B. (1973) 'The Networked City' Centre for Urban and
Community Studies and Dept. of Sociology, Univ. or Toronto, Research Paper
No.59
Critchlow, K. (1981) Rear cover note to William Stirling's (1897/1981) The Canon:
An Exposition of the Pagan Mystery Perpetuated in the Cabala as the Rule of all the
Arts London: Research into Lost Knowledge Organisation
Curran, J. (1990) 'The New Revisionism in Mass Communication Research: A
Reappraisal', European Journal of Communication 5: pp.130-64.
Csikszentmihalyi, M. and Rochberg-Halton, E. (1981) The Meaning of Things:
Domestic Symbols and the Self Cambridge: Cambridge University Press
David, P., (1975) Technical choice, innovation, and economic growth. London:
Cambridge University Press.
Davidge, Carol (1987), 'America's talk- back television experiment: QUBE', in
William Dutton, Jay Blumer and Kenneth Kraemer (eds.) Wired Cities: Shaping the
Future of Communications. Boston: G.K. Hall, pp. 75- 101.
Davies, P. (1987) "The Creative Cosmos" New Scientist, 17th. Dec, pp 41-44.
Davis, F.D., (1986)"A Technology Acceptance Model for Empirically Testing New
End-User Information Systems: Theory and Results," Doctoral Dissertation, Sloan
School of Management, Massachusetts Institute of Technology
Davis, F.D., (1989) "Perceived Usefulness, Perceived Ease of Use, and End User
Acceptance of Information Technology," MIS Quarterly, 13, pp.318-339.
Davis, F.D. (1993) "User Acceptance of Information Technology: System
Characteristics, User Perceptions and Behavioral Impacts," International Journal of
Man-Machine Studies (38), pp. 475-487.
Davis, F.D., Bagozzi, R.P., and Warshaw, P.R., (1989) "User Acceptance of

406
Computer Technology: A Comparison of Two Theoretical Models," Management
Science, 35:8 pp. 982-1003.
Davis, G. B. and Olson, M. H., (1984) Management Information Systems:
Conceptual Foundations, Structure and Development, 2nd Ed, London: McGraw-
Hill

Dawkins, R. (1986) The Blind Watchmaker, Harlow: Longman


De Bont, C.J.P.M. (1992) Consumer Evaluations of Early Product-Concepts The
Netherlands: Delft University Press
Deetz, S.A. & Kersten, A. (1983). Critical models of interpretative research. In
Putnam, L.L. & Pacanowsky, M.E. (Eds.), Communication and organizations: An
interpretative approach Beverly Hills: Sage. pp.147-171.
DeFleur, M. L. and Ball-Rokeach, S. (1989) Theories of Mass Communication (5th
ed.) New York: Longman
De Landa, M. (1997) A Thousand Years of Nonlinear History New York: Zone
Books
Dennett, D. C., (1987) 'The Logical Geography of Computational Approaches: A
View from the East Pole' in M. Harnish and M. Brand, eds., Problems in the
Representation of Knowledge Tucson, AZ: University of Arizona Press. pp. 59-79.

Denning, P.J. and Metcalfe, R.M. (1997) (eds.) Beyond Calculation: The Next Fifty
Years of Computing New York: Copernicus
Denzin, N. (1997) Interpretive Ethnography: Ethnographic Practices for the 21st
Century. London: Sage.
De Sola Pool, I. (ed.) (1977) The Social Impact of the Telephone Cambridge, MA:
MIT Press.

Dickerson, M.D., and Gentry, J.W. (1983) "Characteristics of Adopters and Non-
Adopters of Home Computers." Journal of Consumer Research 10: pp.225–235.

Diesing, Paul. (1971) Patterns of Discovery in the Social Sciences. Chicago: Aldine-
Atherton.

Dilts, M. M. (1941) The Telephone in a Changing World. New York: Longmans,


Green & Co.

Doheny-Farina, A. (1992) Rhetoric, Innovation, Technology : Case Studies of


Technical Communication in Technology Transfers Cambridge, MA : MIT Press

Donald, M. (1991). Origins of the Modern Mind: Three Stages in the Evolution of
Culture and Cognition. Cambridge, MA: Harvard University Press.
Dosi, G. (1982) 'Technological paradigms and technological trajectories – a
suggested interpretation of the determinants and directions of technological change'
Research Policy 11, pp.147-162.

Dosi, Giovanni, (1988), 'Sources, Procedures, and Microeconomic Effects of
Innovation', Journal of Economic Literature, Vol. XXVI, September.
Dovey, K. (1978) 'Home: An ordering principle in space' Landscape 22(2) pp.22-36
Dreyfus, H. L. (1967) 'Why Computers Must Have Bodies in order to be
Intelligent.' Review of Metaphysics, 21, pp.13-32.
Drucker, P. (1957) Landmarks of Tomorrow New York & Evanston: Harper and
Row

Drucker, P. (1999) 'Beyond the Information Revolution' - The Atlantic Monthly;


October Vol. 284, No. 4; pp. 47-57.

Du Gay, P. (1996) 'Introduction' in du Gay, P. Hall, S. Janes, L. Mackay, H. and


Negus, K. Doing Cultural Studies: The Story of the Sony Walkman London:
Sage/Open University, pp.1-5.
Dumas, J. and Redish, J. (1994) A Practical Guide to Usability Testing Norwood,
NJ: Academic Press
Eco, Umberto (1976): A Theory of Semiotics. Bloomington: Indiana University
Press/London: Macmillan
Ehn, P. and Löwgren, J. (1997). 'Design for quality-in-use: Human-computer interaction
meets information systems development.' In Helander, M., Landauer, T., Prabhu, P. (eds)
Handbook of human-computer interaction. 2nd edition. Amsterdam: Elsevier, pp.
299-313.
Eisner. R., (1991) 'Researchers see a wealth of applications for virtual reality,'
Scientist. 5. pp.14-16.
Elliot, P. (1974). 'Uses and gratifications research: A critique and sociological
alternative.' In Blumler J. G. & Katz E. (Eds.) The uses of mass communications:
Current perspectives on gratifications research Beverly Hills, CA: Sage. pp. 249-
268.
Elster, J., (1994) 'Functional Explanation: In Social Science' In Martin and McIntyre
(eds.), Readings in the Philosophy of the Social Sciences. Cambridge, MA: MIT
Press pp. 403-414.
Engel, J., Blackwell, R. and Miniard, P. (1990) Consumer Behaviour New York:
The Dryden Press
Erdem, T. and Swait, J. D. (1998) 'Brand equity as a signaling phenomenon'. Journal
of Consumer Psychology 7, pp.131-157.
Erlandson, D.A., Harris, E.L., Skipper, B.L., & Allen, S.E. (1993). Doing
naturalistic inquiry: A guide to methods. London: Sage Publications
Feenberg, Andrew (1991). Critical Theory of Technology. New York: Oxford Univ.
Press.
Firat, A. F., & Venkatesh, A. (1995). Liberatory postmodernism and the
reenchantment of consumption. Journal of Consumer Research, 22(3), 239-267.
Fish, S. (1979) 'Normal circumstances, literal language, direct speech acts, the
ordinary, the everyday, the obvious, what goes without saying, and other special
cases' in Rabinow, P. and Sullivan, W. (eds) Interpretative Social Sciences Berkeley:
Univ. of California Press pp.243-265
Fish, S. (1980) Is There a Text in This Class? The Authority Of Interpretive
Communities. Cambridge, MA: Harvard University Press
Fiske, J. (1989) Reading the Popular London: Unwin Hyman

Fiske, J. & Hartley, J. (1978): Reading Television. London: Methuen


Fleck, J. (1988) 'Innofusion or Diffusation? The Nature of technological
Development in Robots' Edinburgh PICT Working Paper No.4 The University of
Edinburgh
Fleck, J. (1993) 'Configurations: Crystallizing Contingency' The International
Journal of Human Factors in Manufacturing, Vol. 3. London: John Wiley & Sons
pp.15-36.
Fleck, J. (1994) 'Learning by trying: the implementation of configurational
technology' Research Policy 23, Issue 6 pp.637-653.
Fleck, J. (1984) 'Artificial Intelligence and Industrial Robots: An Automatic end for
Utopian Thought?' in (Mendelsohn, E. and Nowotny, H. eds.) Nineteen Eighty-Four:
Science between Utopia and Dystopia. Sociology of the Sciences, Vol. VIII. pp.189-
231.
Forsythe, D. (1992) 'Using ethnography to build a working system: Rethinking basic
design assumptions.' Proceedings of SCAMC-92: 16th Annual Symposium on
Computer Applications in Medical Care pp.505-509.
Forty, A. (1986) Objects of Desire: Design and Society 1750-1980 London: Thames
and Hudson
Foxall, G.R. and Goldsmith, R.E., (1994) Consumer Psychology for Marketing
London: Routledge
Freeman, C (1982) The Economics of Industrial Innovation (2nd Ed) London: Pinter

Gadamer, H.G. (1976) Philosophical Hermeneutics Berkeley: University of California


Press
Gadamer, H.G. (1979). Truth and Method. London: Sheed & Ward
Garner, W.R. (1978) 'Aspects of a stimulus: features, dimensions, and configurations'
In Rosch,E. and Lloyd, B.B. (Eds.) Cognition and Categorization Hillsdale, NJ:
Lawrence Erlbaum Associates

Gasser, L. (1986) 'The integration of computing and routine work' ACM Trans. On
Office Information Systems, 4(3): pp.205–225.
Geertz, C. (1973) The Interpretation of Cultures New York: Basic Books
Geertz, C. (1988). Works and Lives: Anthropologist as Author. Palo Alto, CA:
Stanford University Press.
Georgiou, L., Metcalfe, J.S., Gibbons, M., Ray, T. and Evans, J. (1986) Post-
Innovation Performance: Technological Development and Competition London:
Macmillan
Macmillan
Gergen, K. (1973) 'Social psychology, science and history' Personality and Social
Psychology Bulletin 2, pp.373-383.

Gergen, K. J. (1985). The social constructionist movement in modern psychology.


American Psychologist, 40, pp.266-275

Gibson, J.J. (1977) 'The theory of affordances' In R. Shaw & J. Bransford (eds.)
Perceiving, Acting and Knowing. Hillsdale, NJ: Erlbaum.
Gibson, W., (1984) Neuromancer. New York: Ace Books.
Giddens, A. (1979). Central Problems in Social Theory: Action, Structure and
Contradiction in Social Analysis. London: Macmillan

Giddens, A. (1984). The Constitution of Society Berkeley, CA: University of


California Press.
Giorgi, A. (1976). 'Phenomenology and the Foundations of Psychology.' In:
Proceedings of the Nebraska Symposium on Motivation. University of Nebraska,
Lincoln.
Glaser, B. and Strauss, A. (1967) The Discovery of Grounded Theory Chicago:
Aldine
Glaser, B. (1978) Theoretical Sensitivity: Advances in the Methodology of Grounded
Theory San Francisco: Sociology Press
Godwin, J. (1979) Robert Fludd London: Thames and Hudson

Goffman, E. (1959) The Presentation of Self in Everyday Life. Garden City, New
York :Doubleday

Gold, R.L. (1958) 'Roles in sociological field observation' Social Forces. 36,
pp.217-223.

Goldman, S., Nagel, R., & Preiss, K. (1995). Agile competitors and virtual
organizations: Strategies for enriching the customer. New York: Van Nostrand
Reinhold.

Gonzalez, P. (1997) Anthropology and Design of Technical Systems SRI Consulting


Business Intelligence Program
Good, M. (1989). 'Experience with Contextual Field Research' Proceedings of CHI
'89 Human Factors in Computing Systems, New York: ACM. pp.21-24.
Gould, J.D., & Lewis, C. (1985). 'Designing for usability: Key principles and what
designers think.' Communications of the ACM, 29 (3), 300-311.
Gould, J.D., (1988) 'How to design usable systems.' In M. Helander (Ed.) Handbook
of Human-Computer Interaction. Amsterdam: North-Holland.

Grantham, C. E., & Vaske, J. J. (1985). 'Predicting the usage of an advanced
communication technology.' Behaviour and Information Technology, 4 (4), pp.327-
335.
Gray, A. (1987) 'Behind closed doors: women and video' in Baehr, H. and Dyer, G.
(eds) Boxed-in: Women On and In TV London: Routledge, pp.38-54.
Gregory, S.A., (ed.)(1965) The Design Method Butterworth: London
Grint, K. & Woolgar, S. (1997) The Machine at Work: technology, work and
organization. London: Polity Press.
Grudin, J. (1986) 'Designing in the Dark: Logics that Compete with the User Design
Methods II' Proceedings of ACM CHI'86 Conference on Human Factors in
Computing Systems pp.281-284.
Grudin, J. (1991) 'Utility and Usability: Research Issues and Development Contexts
I. General Principles, Metacomments' First Moscow International HCI'91 Workshop
Proceedings pp.18-22.
Grudin, J. (1993) 'Interface: An evolving concept' Communications of the ACM, vol.
36 no. 4, pp.110-119.
Guba, E. (1990). 'The alternative paradigm dialog.' In Guba, E. (Ed), The Paradigm
Dialog, Newbury Park, CA: Sage. pp. 17-27.
Halasz, F.G. (1988) 'Reflections on Notecards: Seven Issues for the Next Generation
of Hypermedia Systems' Communications of the ACM (CACM), 31(7), pp. 836-852.
Hall, E.T. (1976) Beyond Culture. New York, NY: Anchor/Doubleday
Hall, P. (1996) Cities Of Tomorrow: An Intellectual History Of Urban Planning And
Design In The... London: Blackwell
Hall, Stuart (1980): 'Encoding/Decoding.' in: Hall, S.; Hobson, D.; Lowe, A,; Willis,
P. (Eds.): Culture, Media, Language. London: Hutchinson pp. 128–138.
Halloran, J. D. (Ed.). (1970). The Effects of Television. St. Albans: Panther.
Harré, R. & Secord, P. F. (1972). The Explanation of Social Behavior. Totowa, NJ:
Rowman and Littlefield.
Hartley, J. (1987) 'Invisible fictions: television audiences, paedocracy, pleasure' in
Textual Practice, Vol.1, No.2, pp.121-138.
Hauser, J.R. and Clausing, D (1988) 'The House of Quality' Harvard Business
Review Vol.66, No. 3 (May-June) pp.63-74.

Hauser, J.R. (1993) 'How Puritan-Bennett used the House of Quality' Sloan
Management Review. Vol. 34 No. 3. Spring p. 61-70.

Hauser, J.R. and Griffin. A, (1993) 'The Voice of the Customer', Marketing Science
12 1-25.
Hawkes, T. (1977) Structuralism and Semiotics. London: Routledge
Heath, S. (1990) 'Representing television' in Mellancamp, P. (ed.) Logics of
Television Bloomington: Indiana Univ. Press pp.267-302.

Heilbroner, R. (1972) 'Do machines make history?' in Kranzberg, M. et al (eds)
Technology and culture: an anthology. Reprinted in Smith, M.R. and Marx, L. (eds.),
(1994) Does Technology Drive History? The Dilemma of Technological Determinism
Cambridge, MA: MIT Press pp. 53-65.

Hejl, P. (1984) 'Towards a Theory of Social Systems: Self-Organization and Self-
Maintenance, Self-Reference and Syn-Reference' in Ulrich, H. and Probst, G.J.B.
(eds.) Self-Organization and Management of Social Systems: Insights,
Promises, Doubts, and Questions Berlin: Springer-Verlag pp. 60-78.
Heller, F. (1989) 'On Humanising Technology' Applied Psychology 38 (1) pp.15-28.
Henderson, R. M. and Clark, K.B. (1990). 'Architectural innovation: The
reconfiguration of existing product technologies and the failure of established firms.'
Administrative Science Quarterly. Vol. 35, pp. 9-30.
Hendrick, H. W. (1991). 'Ergonomics in Organizational Design and Management'
Ergonomics, 34(6), 743-756.
Hendrick, H. W. (1995). 'Humanizing Re-engineering for True Organization
Effectiveness: A Macroergonomic Approach.' Proceedings of the Human Factors
and Ergonomics Society, 2, 761-765.
Hewer, S (1995) The DAN teaching pack: Incorporating age-related issues into
design courses, London: RSA
Hiltz, S. R. and Johnson, K. (1989). Experiments in group decision making, 3:
Disinhibition, deindividuation, and group process in pen name and real name
computer conferences, Decision Support Systems, 5: pp.217-32.
Hirschman, E.C. (1981) 'Technology and symbolism as sources for the generation of
innovations' Advances in Consumer Research, 9, pp.142-154.
Holbrook, M.B. (1995) Consumer Research: Introspective Essays on the Study of
Consumption London: Sage
Hollenbeck, A. R., & Slaby, R. G. (1979). 'Infant visual and vocal responses to
television.' Child Development, 50, pp.41-45.
Holtzblatt, K. and Jones, S., (1992) 'Contextual inquiry: A participatory design
technique for system design.' In Schuler, D. and Namioka, A. (Eds.), Participatory
Design: Principles and Practices, Hillsdale, NJ: Lawrence Erlbaum Associates, 177-
210
Holtzblatt, K. and Beyer, H. (1993). 'Customer-centred design work for teams.'
Communications of the ACM, 36(10) pp.92-103.
Howard, J.A. and Sheth, J.N. (1969) The Theory of Buyer Behaviour New York:
Wiley
Hubona, G.S. and S. Geitz,(1997) "External Variables, Beliefs, Attitudes and
Information Technology Usage Behavior," Proceedings of the Thirtieth Annual
Hawaii International Conference on System Sciences (HICSS-30)
Hughes, T. (1983) Networks of Power London: Johns Hopkins University Press
Hughes, T. P. (1987) 'The seamless web: Technology, science, etcetera, etcetera.'
Social Studies of Science 16, pp.281-292.
Hunt, S. (1994), ' Rethinking Marketing : Our Discipline, Our Practice, Our Methods'
European Journal of Marketing, 28(3),13-25.
Hutchins, E. (1995). Cognition in the Wild. MIT Press.
Heylighen F. (1992): "Principles of Systems and Cybernetics: an evolutionary
perspective", in: Cybernetics and Systems '92, R. Trappl (ed.), (World Science,
Singapore), pp. 3-10.

Iberall, A. (1972) Toward a General Science of Viable Systems New York: McGraw-
Hill

Iberall, A. (1987) 'A Physics for Studies of Civilization.' In Yates, F.E., A. Garfinkel,
D.O. Walter, and G.B. Yates (eds.) Self-Organizing Systems: The Emergence of
Order. London: Plenum Press. pp. 521-542
Imai, K. Nonaka, I., and Takeuchi, H.. (1985) 'Managing the New Product
Development Process: How Japanese Companies Learn and Unlearn.' In K. Clark, R.
Hayes, & C. Lorenz (Eds.) The Uneasy Alliance: Managing the Productivity-
Technology Dilemma Boston: Harvard Business School Press pp. 337-375.
Iuso, W. (1975) 'Concept testing: An appropriate approach' Journal of Marketing
research, 12 pp.228-231.
Ingwersen, P. (1992), Information Retrieval Interaction. London: Taylor Graham.
Ives W, Hamnett P and Lubin D (1993) Making multimedia accessible:
implementing a new paradigm - the automated factory. Multimedia Monitor XI (11),
pp.24-29
Jacob, H.(1997) 'letter' Journal of Design and Design Theory Vol.2; No.1
Jencks, C. (1987) The Language of Post-Modern Architecture, 5th ed. New York:
Rizzoli
Jacobson, D. (1991). Reading Ethnography. Albany: State University of New York
Press.
Johnson, R. (1986) 'The story so far; And further transformation?' in Punter, D. (ed.)
Introduction to Contemporary Cultural Studies London: Longman
Johnson, B. (1990). 'On Writing'. In Frank Lentricchia and Thomas McLaughlin
(eds.), Critical Terms for Literary Study. Chicago: Univ. of Chicago Pr. 39-49.
Jones, J.C. (1991) Designing Designing London: Architecture, Design and
Technology Press
Kanter, R.M. (1992) 'Think like the customer; the global business logic' Harvard
Business Review, 70 (9) p.29.
Katz, D. & Kahn, R.L., (1966) The Social Psychology of Organizations New York:
John Wiley & Sons
Keates, S., & Clarkson P.J. (1999). 'Towards a generic approach for designing for all
users.' Proceedings of RESNA '99, Long Beach, USA, pp.97-99.

Kelly, K. (1997) 'New rules for the new economy' Wired, 5.09 , September p.141.
Kelly, K. (1994, May). 'The electronic hive-embrace it.' Harper's, 228(1728):20-25.
Adapted from Out of Control: The Rise of Neo-biological Civilization. Reading, MA:
Addison-Wesley.
Kincaid, H., (1994) 'Assessing Functional Explanations in the Social Sciences.' In
Martin and McIntyre (eds.), Readings in the Philosophy of the Social Sciences.
Cambridge, MA: MIT Press, pp. 415-428.
Kling, R. (1987) 'Defining the Boundaries of Computing Across Complex
Organizations.' in Critical Issues in Information Systems, R. Boland and R.
Hirschheim (eds.). New York: John Wiley
Knights, D. and Murray, F. (1994) Managers Divided: Organization, Politics and
Information Technology Chichester: John Wiley

Knowles, C. (1988). 'Can Cognitive Complexity Theory (CCT) Produce an
Adequate Measure of System Usability?' In Jones, D. M. & Winder, R. (eds), People
and Computers IV. Cambridge University Press, Cambridge. pp. 291-307.
Kopytoff, I. and Appadurai, A. (1986) The Social Life of Things: Commodities in
Cultural Perspective Cambridge: Cambridge University Press
Korzybski, A. (1933). Science and Sanity. Lakeville, CT: Institute of General
Semantics
Krippendorff, K. (1995) 'On the essential contexts of artifacts or on the proposition
that "design is making sense (of things)"' Margolin, V. and Buchanan, R. (eds.) The
Idea of Design Cambridge, MA: MIT pp.156-184.
Kvale, S., (1996) Interviews: An Introduction to Qualitative Research Interviewing
London: Sage
Kwasnik, B. (1992) "The Role of Classification Structures in Reflecting and
Building Theory," Advances in Classification Research, Vol. 3 Medford, NJ: Learned
Information/ASIS, pp. 63-81.
Landauer, T.K. (1991) 'Let's get real: A position paper on the role of cognitive
psychology in the design of humanly useful and usable systems,' in Carroll, J.M. (ed.)
Designing Interaction: Psychology at the Human-Computer Interface Cambridge:
Cambridge University Press pp.60-73.
Lang, J. (1971) 'Architecture for human behavior; the nature of the problem' in
Burnette, C. Lang, J. and Vachon, D. Architecture for Human Behavior Philadelphia:
Philadelphia Chapter/AIA pp.15-22.
Latour, B. (1987) Science in Action: How to Follow Scientists and Engineers
through Society. Cambridge: Harvard Univ. Press
Latour, B. (1993) We Have Never Been Modern. Transl. Catherine Porter.
Cambridge, MA: Harvard University Press
Latour, B. (1996) Aramis - the Love of Technology Cambridge, MA: Harvard
University Press
Lave, J. (1988). Cognition in Practice: Mind, Mathematics, and Culture in Everyday
Life. Cambridge, UK: Cambridge University Press.
Law, J. and Hassard, J. (eds) (1998) Actor Network Theory and After, London:
Blackwell
Leonard-Barton, D., and DeSchamps, I. (1988) 'Managerial Influence in the
Implementation of New Technology.' Management Science 34:10, pp.1252-1265.
Leonard-Barton, D. (1995) Well Springs of Knowledge Boston: Harvard Business
School Press
Leavitt, H.J., Pinfield, L., & Webb, E., (1974) Organizations of the Future:
Interaction with the External Environment New York: Praeger
Lefebvre, H. (1958/1991) The Critique of Everyday Life, (trans. Moore, J.) London:
Verso

Lefebvre, H. (1992) The Production of Space London: Blackwell

Levine, J., & Mullins, N. (1978). Structuralist analysis of data in sociology.


Connections, 7, pp.16-23.
Levi-Strauss, C. (1968). The savage mind. Chicago: University of Chicago Press.
Lindlof, Thomas R. (1995). Qualitative research methods. Thousand Oaks: Sage
Logan, R (1994) 'Behavioral and Emotional Usability: Thomson Consumer
Electronics' in Wiklund, M.E.(ed. 1994) Usability in Practice Boston: AP
Professional pp.59-83.
Litva, P.F. (1995) 'Integrating Customer Requirements into Product Designs' The
Journal of Product Innovation Management 12 No. 1 (January), pp. 3-15.
Luhmann, N. (1984) Social Systems Frankfurt: Suhrkamp

Lunt, P.K., and Livingstone, S.M. (1992). Mass Consumption and Personal Identity:
everyday economic experience. Buckingham, U.K.: Open University

Lunt, P.K. (1995). 'Psychological approaches to consumption.' In D. Miller (Ed.).


Acknowledging Consumption. London: Routledge. pp.238-263.
Mackay, H. & Gillespie, G. (1992) `Extending the Social Shaping of Technology
Approach: Ideology and Appropriation', Social Studies of Science, Vol. 22, No. 4
(November), pp. 685-716.
Mackay, H. (1995) 'Theorising the IT/Society relationship' in (Heap, N., Thomas, R.,
Einon, G., Mason, R. and Mackay, H. eds.) Information Technology and Society
London: Sage
MacKenzie, D and Wajcman, J. (1985). The Social Shaping of Technology London:
Sage
Maddix, F. (1990) Human-Computer Interaction: Theory and Practice, Ellis
Horwood Limited.

Main, J. (1994). Quality Wars. New York: Free Press.
Malinowski, B., (1944) A Scientific Theory of Culture, and Other Essays. Chapel
Hill: University of North Carolina Press
Malinowski, B. (1922/1978), Argonauts of the western Pacific: an account of native
enterprise and adventure in the archipelagos of Melanesian New Guinea, London:
Routledge and Kegan Paul.
Manross, G. G., & Rice, R. E. (1986). 'Don't hang up: Organizational diffusion of the
intelligent telephone.' Systems, Objectives, Solutions, 10, pp.161-175.
Review Vol.72 No.5 pp.144-149.
Martin, M. (1991): 'Hello Central': Gender Technology and Culture in the formation
of telephone systems. London: McGill-Queen's University Press.
Marzano, S. (1995) 'Television at the crossroads: The quest for new qualities' in
(Mendini, A., Branzi, A., and Marzano, S. eds) Television at the Crossroads London:
Academy Editions pp.9-11.
Maskery, H. and Meads, J. (1992). 'Context: In the eyes of users and in computer
systems.' SIGCHI Bulletin, 24 (2), pp.12-21.
Maslow, A. H. (1954). Motivation and personality (3rd ed.). New York: Harper &
Row
Maturana, H. and Varela, F. (1987) The Tree of Knowledge London: Shambhala
McClelland, I.L. and Brigham, F.R. (1990) 'Marketing ergonomics' Ergonomics
Vol.33 No.5. pp. 519-526.
McCracken, G. (1988) The Long Interview Newbury Park: Sage.
McCracken, G. (1990) Culture and Consumption Bloomington: Indiana Univ. Press
McDonald, T. J (ed.) (1996) The Historical Turn in the Human Sciences: Essays on
Transformations in the Disciplines Ann Arbor: University of Michigan Press
McKenna, R. (1995) RealTime: Preparing for the Age of the Never Satisfied
Customer Harvard: HBS Press
McLeod, J., Kosicki, G.M. and Pan, Z. (1991) On understanding and
misunderstanding media effects in Curran, J. and Gurevitch, M. (eds.) Mass Media
and Society London: Edward Arnold pp.235-267.
McLuhan, M. (1960) 'Effects of the improvements of communication media.'
Journal of Economic History. Vol.20 pp.566-575.
McQuail, D. & Windahl, S. (1993) Communication Models for the Study of Mass
Communication. London: Longman
McQuail, D (1994) Mass communication theory: an introduction. Third Edition.
London: Sage.
Mead, G. H. (1932) The Philosophy of the Present La Salle (Illinois): Open Court
Mead, G. H. (1934) Mind, Self, and Society, Chicago: University of Chicago
Mead, G. H. (1936) Movements of Thought in the Nineteenth Century Chicago:
University of Chicago
Mead, G. H. (1938) The Philosophy of the Act, Chicago: University of Chicago
Meister, D. (1999) The History of Ergonomics and Human Factors New York:
Lawrence Erlbaum
Melucci, A. (1989) Nomads of the Present London: Radius
Menzel, H. (1978) 'Meaning - who needs it?' In Brenner, M., Marsh, P., and Brenner,
M. (eds.) The Social Contexts of Method New York: St. Martin's Press pp.140-171.
Merleau-Ponty, M. (1962) Phenomenology of Perception London: Routledge &
Kegan Paul
Merton, R., 1957. Social Theory and Social Structure. Glencoe, IL: Free Press
Miller, D. (1995) Acknowledging Consumption: A Review of New Studies London:
Routledge
Miller, J.G. (1956) 'Toward a formal theory for the behavioural sciences' in White,
L.D. (ed.) The State of the Social Sciences Chicago: The University of Chicago Press
pp.29-65.

Miller, J. (1978) Living Systems New York: McGraw Hill


Mohan, M. (1993). Organizational Communication and Cultural Vision. Albany:
State University of New York Press.
Mok, C. (1996). Designing Business. Adobe Press: San Jose, CA.
Molina, A. (1989) 'Transputers and parallel computers: building technical
capabilities through socio-technical constituencies' PICT Policy Research Paper No.
7, Edinburgh University.
Molina, A. (1990) 'Transputers and transputer-based parallel computer: Socio-
technical constituencies and build up of British-European capabilities in information
technology', PICT Policy Research Paper No. 19, Edinburgh University.
Molina, A. (1993) 'In search of insights into the generation of techno-economic
trends: Micro- and macro-constituencies in the microprocessor industry' Research
Policy, Vol.22, Nos.5/6, 1993, pp.479-506.
Molina, A. (1994) Technology diffusion and RTD programme development: What
can be learnt from the analysis of socio-technical constituencies? DGXII Occasional
Paper XII-378-94. Brussels: C.E.C./DGXII_A/5.
Molina, A. (1995) 'Sociotechnical Constituencies as Processes of Alignment: The
Rise of a Large-Scale European Information Technology Initiative' Technology in
Society, Vol.17, No.4, 1995, pp.385-412.
Molina, A. (1999) 'Transforming visionary products into realities: constituency-
building and observacting in NewsPad' Futures 31 pp.291-332

Molina, A and Nicoll, D. (1996) Contextual usability and the alignment of users and
technology in the development of NewsPad: Towards a pilot methodology
TechMaPP Working Paper , The University of Edinburgh
Moores, S. (1993) Interpreting Audiences: The Ethnography of Media Consumption
London: Sage
Morace, F. (1995) 'The magical mirror of everyday life' in Mendini, A., Branzi, A.
and Marzano, S. (eds) Television at the Crossroads London: Academy Editions
pp.12-18.
Morley, D. (1980) The 'Nationwide' Audience: Structure and Decoding. London: BFI
Morley, D. (1986) Family Television: Cultural Power and Domestic Leisure
London: Comedia
Morley, D. (1992) Television, Audiences and Cultural Studies, London: Routledge
Mumford, L. (1964) 'Authoritarian and Democratic Technics' Technology and
Culture 5 pp.1-8.
Nardi, B. (1992) 'Studying Context: A Comparison of Activity Theory, Situated
Action Models and Distributed Cognition' Proceedings East-West Intl. Conference
on Human-Computer Interaction, St Petersburg, Russia.
Neisser, U. (1976) Cognition and reality. San Francisco: Freeman.
Nelson, R. and Winter, S. (1977) 'In search of useful theory of innovation' Research
Policy 6, pp.36-76.
Neuman, W.R. (1991) The Future of the Mass Audience Cambridge: Cambridge
University Press
Newell, S. Swan, J. and Clark, P. (1993) 'The importance of user design in the
adoption of new information technologies: The example of production and inventory
control systems (PICS)', International Journal of Operations & Production
Management. Vol.13 No.2. pp. 4-22.
Nicoll, D.W. (1995) Contextual usability: A methodological outline of contextual
usability and quality function deployment in the development of advance media
products TechMaPP Working paper, Dept. Of Psychology , University of Edinburgh
Nicoll, D.W. (1999a) 'Users as currency; technology and marketing trials as
naturalistic environments' The Social Shaping of Multimedia A4 Workshop
Proceedings Brussels: Office for Official Publications of the European Communities
Nicoll, D.W. (1999b) Contextual Usability, Domestication & QFD Proceedings of
the 11th North American Symposium on QFD, Novi, Michigan, 3-9 June
Nicoll, D. (1998) 'Taxonomy of Information Intensive Products' Working paper 5.
Design Council Project 'Increasing information intensity: Towards 'intelligent
products'' The University of Edinburgh Management School.
Nielsen, J., R. L. Mack, K. H. Bergendorff, & N. L. Grischkowsky, (1986).
'Integrated Software Usage in the Professional Work Environment: Evidence from
Questionnaires and Interviews.' In CHI'86 proceedings, SIGCHI Bulletin, Special
issue, pp.162-167.

Nielsen, J. (1993) Usability Engineering. New York, NY: Academic Press.

Nonaka, I. (1991) 'The knowledge creating company' Harvard Business Review,


69(6): pp.96-104.

Nonaka, I. and Takeuchi., H. (1995) The Knowledge-Creating Company New York:


Oxford University Press
Norman, D. A. (1988) The Psychology of Everyday Things New York: Basic Books
Norman, D. A. (1991) 'Cognitive Artifacts' In Carroll, J.M. (ed.) Designing
Interaction: Psychology at the Human-Computer Interface. Cambridge, UK:
Cambridge University Press. pp. 17-38.
Norman, D. A. (1992). Turn Signals are the Facial Expressions of Automobiles.
Reading, MA: Addison-Wesley Publishing Company, Inc.
Norman, D. A. (1993). Things That Make Us Smart. Reading, MA: Addison-Wesley
North, S. M. (1987). The Making of Knowledge in Composition: Portrait of an
Emerging Field. Upper Montclair, NJ: Boynton/Cook.
Nyström, H., (1990) Technological and market innovation: Strategies for product
and company development. Chichester: Wiley and Sons.
Olson, D., R. (1977). 'From utterance to text: The bias of language in speech and
writing.' Harvard Educational Review, 47(3), pp.257-281.
Orna, E. and Stevens, G. (1995) Managing Information for Research. Buckingham:
Open University Press
Orlikowski, W. J. (1992) The Duality of Technology: Rethinking the Concept of
Technology in Organizations. Organization Science 3: 3 (Aug), pp. 398-427.
Ortt, R.J. and Schoormans, J.P.L. (1993) 'Consumer research in the development
process of a major innovation' Journal of the Market Research Society 35, no 4.
Pacey, A. (1992). The Culture of Technology. Cambridge, MA: MIT Press
Parsons, T. (1971) 'Action Systems and Social Systems' in Parsons, T. (ed.) (1971)
The System of Modern Societies, London: Prentice-Hall pp. 4-8
Pavlík, J.V. (1996) New Media Technology: Cultural and Commercial Perspectives
London: Allyn and Bacon
Pelly, C. (1996) Creative Consciousness: Designing the Driving Experience Design
Management Journal, Vol. 7, No. 4
Pelto, P. J. and Pelto, G.H. (1978) Anthropological Research: The Structure of
Inquiry, 2nd ed. Cambridge: Cambridge University Press.
Peppers, D. and Rogers, M. (1997) Enterprise One-to-one London: Piatkus
Petroski, H. (1994) Design Paradigms: Case Histories of Error and Judgement in
Engineering. Cambridge: Cambridge University Press

Pine, B.J. & J. H. Gilmore. (1999). The Experience Economy: Work is Theater &
Every Business a Stage. Boston, MA: Harvard Business School Press

Polanyi, M. (1958) Personal Knowledge: Towards a Post-Critical Philosophy
Chicago: University of Chicago Press
Polanyi, M. (1966) The Tacit Dimension. London: Routledge and Kegan Paul
Poole, S.M. and McPhee, R.D., (1985) 'Methodology in Interpersonal
Communication research.' in Knapp, M.L. and Miller, G.R. (eds.) Handbook of
Interpersonal Communication London: Sage pp.100 –171.
Postman, N. (1993) Technopoly: The Surrender of Culture to Technology. New
York: Vintage Books
Preece, J., (1993) A Guide to Usability London: Addison Wesley
Proctor, R. and Williams, R. (1994) 'Beyond design: Social learning and computer-
supported cooperative work: Some lessons from innovation studies' (draft paper
eventually published in Shapiro, D., Tauber, M. and Traunmueller (eds.) The Design
of Computer-Supported Work and Groupware Systems Netherlands: Elsevier Science)
Punie, Y. (1997) 'Imagining "non-uses": Rejections of ICTs in Flemish households'
Paper presented at Imagining Uses, First International Conference, France Telecom,
Bordeaux, 27-29 May.
Pyka, A. (1997) 'Informal networking' Technovation (17) 4 pp.207-220
Pylyshyn, Z. (1991) 'Some comments on the theory-practice gap.' In J. Carroll (ed.)
Designing Interaction. Cambridge: Cambridge University Press pp. 39-49.
Quade, E.S. (1975) Analysis for Public Decisions. New York, NY: Elsevier.
Rafaeli, S. (1988). 'Interactivity: From new media to communication.' In Hawkins, R.
et al. (Eds.) Advancing communication science: Merging mass and interpersonal
process (16). Newbury Park, CA: Sage. pp. 110-134.
Ramstrom, D.O., (1974) 'Toward the Information-Saturated Society' in H.Leavitt, L.
Pinfield & E. Webb (Eds.), Organizations of the Future: Interaction with the
External Environment, New York: Praeger pp.159- 75.
Rapoport, A. (1969) House, Form and Culture Englewood Cliffs, NJ: Prentice-Hall

Raven, M.E. and Wixon, D. (1994) 'Contextual Inquiry: Grounding Your Design in
User's Work' Proceedings of ACM CHI'94 Conference on Human Factors in
Computing Systems v.2 pp.409-410.
Redish, J. and Selzer, J. (1989) 'The place of readability formulas in technical
communication.' Technical Communication 32(4), pp.46-52.
Resnick, M. (1994) Turtles, Termites, and Traffic Jams. Cambridge, MA: The MIT
Press.
Rice, R. E. and Case, D. (1983) 'Computer-based messaging in the university: A
description of use and utility.' Journal of Communication, 33: pp.131-152.
Rice, R. E., & Shook, D. E. (1990). 'Relationships of job categories and
organizational levels to use of communication channels, including electronic model:
A meta-analysis and extension.' Journal of Management Studies, 27, pp.196-229.

Ricoeur, P. (1971) 'The model of the text.' Social Research, 38, p.331
Robertson, T.S. (1967) 'The process of innovation and the diffusion of innovation'
The Journal of Marketing, 31 pp.14-19.
Robertson, T.S. (1971) Innovative Behaviour and Communication New York: Holt,
Rinehart and Winston
Robson, C. (1993). Real World Research. Oxford: Blackwell.
Rogers, E.M. (1995). The Diffusion of Innovations, 4th ed. Free Press, New York.
Rosenberg, N (1982) Inside the Black Box: Technology and Economics,
Cambridge: Cambridge University Press
Rosenberg, N. (1994) Exploring the Black Box Cambridge: Cambridge University
Press
Rosenblueth, A.; Wiener, N. and Bigelow, J. (1943) "Behavior, Purpose and
Teleology," Philosophy of Science 10:pp. 18-24.

Roth, R.G.(1987). 'The evolving audience: Alternatives to audience accommodation.'


College Composition and Communication, 38, pp.47-55.

Rubin, J. (1994). Handbook of usability testing: How to plan, design, and conduct
effective tests. New York, NY: John Wiley & Sons.

Rycroft, R.W.and Kash, D.E. (1999) The Complexity Challenge: Technological


Innovation for the 21st Century London: Pinter
Ryle, G. (1949) The Concept of Mind. New York: Barnes & Noble, Inc.
Saffo, P. (1993) 'It's the context, stupid' Wired 2.03
Sahal, D. (1981) Productivity and Technical Change, 2nd. Ed. Cambridge: Cambridge
Univ. Press
Santosus, M. (1998) Information Micromanagement, CIO Enterprise Magazine,
April 15.
Sands, S., (1980) 'Test marketing: Can it be defended?' Quarterly review of
Marketing, 5, pp.1-10.
Sarup, M. (1992) Jacques Lacan. Harvester Wheatsheaf.

Schiffman, S. J., Meile, L. C., & Igbaria, M. (1992). An examination of end-user


types. Information Management, 22 (4), pp.207-215.

Schein, E.H. (1985). Organizational culture and leadership. San Francisco: Jossey-
Bass.

Scott, W.G., (1961) 'Organization Theory: An Overview and an Appraisal' Academy


of Management Journal, 4 pp. 7-26.

Segal, Quince, Wickstead (1986) The Cambridge Phenomenon, The Growth of High
Technology Industry in a University Town Cambridge: Covent Garden Press

Seiter, E., Borchers, H., Kreutzner, G. & Warth, E. (1989) 'Don't treat us like we're
so stupid and naïve: Toward an ethnography of soap opera viewers' in Seiter, E.,
Borchers, H., Kreutzner, G. & Warth, E. (eds.) Remote Control: Television,
Audiences and Cultural Power, London: Routledge pp. 44-55.

Shackle, G.L.S. (1963). 'General Thought Schemes and the Economist', reprinted in
Ford, J.L. (ed.) Time Expectations and Uncertainty in Economics: Selected Essays of
G.L.S. Shackle Edward Elgar, New York.

Shanklin, W. L. and Ryans, J. K., Jr. (1984) Marketing High Technology. Lexington,
Mass.: Lexington Books

Sharrock, W. and Anderson, R (1994) 'The user as a scenic feature of the design
space' Design Studies 15(1), pp. 5-18.

Shotter, J. (1975) Images of Man in Psychological Research. London: Methuen.

Silverman, D. (1985) Qualitative Methodology and Sociology: Describing the Social


World. Brookfield, Vermont: Gower.
Silverstone, R. and Morley, D., Dahlberg, A., & Livingstone, S. (1989) 'Families,
technologies and consumption' Working paper, Centre for Research in Innovation,
Culture and Technology, Brunel University

Silverstone, R. and Hirsch, E. and Morley, D. (1991) 'Listening to a long


conversation : An ethnographic approach to the study of information and
communication' Cultural Studies, vol.5 no.2 May pp. 204-227
Silverstone, R. and Hirsch, E (eds.1992) Consuming Technologies: Media and
Information in Domestic Spaces London: Routledge
Silverstone, R. (1989) Television: Text or Discourse? Science as Culture 6
Silverstone, R. (1994) Television and Everyday Life London: Routledge
Silverstone, R. and Haddon, L. (1996) 'Design and the domestication of information
and communication technologies: Technical change and everyday life' in Mansell, R.
and Silverstone, R. (eds.) Communication by Design; The Politics of Information and
Communication Technologies Oxford: Oxford Univ. Press pp. 44-74
Simon, H.A. (1987). "Making management decisions: The role of intuition and
emotion". Academy of Management Executive, 1, pp. 57-64.

Simon, H.A. (1996) The Sciences of the Artificial, 3rd ed. Cambridge, MA: MIT Press

Simonsen, J. (1996): 'Involving Customer Relations in Contextual Design - a Case


Study' in J. D. Coelho et al., (Eds.): Proceedings of the 4th European Conference on
Information Systems, Lisbon/Portugal, July 2-4 1996, pp. 1153-1161.

Simonsen, J. and F. Kensing (1994): 'Take users seriously, but take a deeper look:
Organizational and technical effects from designing with an ethnographically
inspired approach', in Trigg et al., (Eds.): Proceedings of the Third Biennial
Conference on Participatory Design, October 27-28,
Simonsen, J. and F. Kensing (1997): "Using Ethnography in Contextual Design",
Communications of the ACM, Vol. 40, No. 7, July, pp. 82-88.
Simonsen, J. and F. Kensing (1998): "Make Room for Ethnography in Design",
ACM-SIGDOC Journal of Computer Documentation, Vol. 22, No. 1, February, pp.
20-30.
Smith, J. A., Harré, R. & Van Langenhove, L. (1995). 'Introduction' in J. A. Smith,
R. Harré and L. V. Langenhove (eds.), Rethinking Psychology. London: Sage.
Smythe, D. (1981) Dependency Road: Communication, Capitalism, Consciousness
and Canada Norwood, NJ: Ablex
Sobchack, V. (1996) 'Democratic franchise and the electronic frontier' in Sardar, Z.
and Ravetz, J.R. (eds) Cyberfutures: Culture and Politics on the Information
Superhighway London: Pluto pp.77-89
Sørensen, K.H. (1993) 'Adieu Adorno: The moral emancipation of consumers' in
Berg, A.J. and Aune, M (eds.) Domestic Technology and Everyday Life, Proceedings
from COST A4 Workshop in Trondhiem, Norway Oct. 28-30 1993 pp.157-169.
Spradley, J. P. (1979) The Ethnographic Interview. New York: Holt, Rinehart and
Winston.
Spender J.C., and Grant, R.M. (1996). "Knowledge and the firm: Overview".
Strategic Management Journal, 17, Winter, Special Issue, pp. 5-9.
Spigel, L. (1992) Make Room for TV Chicago: University of Chicago Press
Sproull, L. and Kiesler, S. (1991) Connections Cambridge, MA: MIT Press
Star, S. L. (1989). The Structure of Ill-Structured Solutions: Boundary Objects and
Heterogeneous Distributed Problem Solving. In Gasser, L. and Huhns, M.N. (eds.)
Distributed Artificial Intelligence. Vol. II. San Mateo, CA: Morgan Kaufmann
Publishers, Inc. pp. 37-54.
Star, S.L. (1995) Introduction. In S.L.Star, ed. Ecologies of knowledge: Work and
politics in science and technology. Albany, NY: SUNY Press.
Steadman, P (1979) The Evolution of Designs. Cambridge: Cambridge University
Press.
Stevenson, N. (1995) Understanding Media Cultures: Social Theory and Mass
Communication London: Sage
Stewart, C. M. (1992). 'Innovation is in the mind of the user: A case study of voice
mail.' In Gattiker U. E. (Ed.), Technology-mediated communication, 3 pp. 151-185.
Berlin: Walter de Gruyter.
Schick, K.D. and Toth, N. (1993) Making Silent Stones Speak: Human Evolution and
the Dawn of Technology London: Weidenfeld & Nicolson
Straub, D. and Wetherbe, J. (1989). 'Information Technologies for the 1990s: An
Organizational Impact Perspective' Communications of the ACM, 32(11): pp.1329-
1339.
Staudenmaier, J. (1985). Technology's storytellers: Reweaving the human fabric.
Cambridge, MA: MIT Press

Stinchcombe, A. (1968) Constructing Social Theories. New York: Harcourt, Brace


Suchman, L. A., (1983) 'Office procedures as practical action.' ACM Transactions on
Office Information Systems 1: pp.320-328.
Suchman, L.A. (1988). Plans and Situated Actions: The Problem of Human/Machine
Communication. Cambridge, UK: Cambridge University Press.
Suchman, L. A. (1992) 'Technologies of Accountability: On Lizards and Airplanes.'
In Technology in Working Order: Studies of Work, Interaction and Technology. G.
Button, (ed.)London: Routledge. pp. 113-126

Suchman, L. A. (1995). 'Making work visible' Communications of the ACM, 38(9),


pp. 57-65
Suchman, L.A. and Trigg, R.H., (1991). 'Understanding practice: Video as a medium
for reflection and design.' In J. Greenbaum and M. Kyng (Eds.), Design at work:
Cooperative Design of Computer Systems, Hillsdale, NJ: Lawrence Erlbaum
Associates.
Suchman, M.C., (1995) 'Managing legitimacy: Strategic and Institutional
approaches.' Academy of Management Review, 20: pp.571-610.
Sweeney, G. (1996). "Learning efficiency, technological change and economic
progress". International Journal of Technology Management, 11(1-2), pp. 5-27.
Szent-Gyorgyi, A. (1971) 'Looking back' Perspectives in Biology and Medicine, 15,
pp.1-5.
Takeuchi, H. & Nonaka, I. (1986) The new new product development game.
Harvard Business Review, 64(1): pp.137-146.
Talbot, S. (1997) 'Do Computers Kill People?' NetFuture: Technology and Human
Responsibility #37 O'Reilly & Associates
http://www.oreilly.com/people/staff/stevet/netfuture/1997/Jan0897_37.html#1d
Tatsuno, S. (1993) 'Power to the people' Design (November) London: The Design
Council
Tauber, E.M. (1974) 'How market research discourages major innovation' Business
Horizons 17, pp.22-26.
Tauber, E.M. (1975) 'Why concept and product tests fail to predict new product
results,' Journal of Marketing, 39, pp.69-71.
Tauber, E.M. (1977) 'Forecasting sales prior to test market' Journal of Marketing, 41,
pp.80-84.
Tauber, E.M. (1981) 'Utilization of concept testing for new product-forecasting:
traditional versus multiattribute approaches', in Wind, Y., Mahajan, V. and Cardozo,
R.N. (eds) New Product Forecasting Lexington, MA: Lexington Books, pp.173-185.
Taylor, C. (1977) 'Interpretation and the sciences of man' In Dallmayr, F.R. &
McCarthy, T.A. (eds.) Understanding Social Inquiry Notre Dame, IN: University of
Notre Dame Press pp.101-131.

Taylor, F.W. (1911). The Principles of Scientific Management. New York: Harper

Teece, D.J. (1987). "Profiting from technological innovation". In D.J. Teece (ed.).
The competitive challenge, Ballinger, Cambridge, MA. pp. 185-219.

Teece, D.J. (1988). "Technological change and the nature of the firm". In G. Dosi,
C.Freeman, R.Nelson, G. Silverberg & L. Soete (eds.). Technical Change and
Economic Theory, Pinter, London pp.256-281.

Teece, D.J. (1990). "Contribution and impediments of economic analysis to the study
of strategic management". In J.W. Fredricson (Ed.). Perspectives on Strategic
Management, Harper, New York, pp. 39-80.

Teece, D.J., Pisano, G., and Schuen, A. (1997). "Dynamic capabilities and strategic
management". Strategic Management Journal, 18(7), pp. 509-533.
Thackara, J. (1988) (ed) Design After Modernism, Thames & Hudson, London.
Tharp, M. and Scott, L. M. (1990). 'The Role of Marketing Processes in Creating
Cultural Meaning' Journal of Macromarketing, Fall, pp.47-60.
Thomson, A.A., Strickland, A.J. and Fulmer, W.E., (1969) 'Readings in Strategic
Management,' Plano, Business Publications p.5.
Thompson, C. J. , Locander, W. B. and Pollio, H. R. (1989). Putting Consumer
Experience back into consumer research: The philosophy and method of existential-
phenomenology. Journal of Consumer Research, 16 pp.133-147.
Thompson, C.J., Pollio, H. R., & Locander, W. B. (1994). 'The spoken and the
unspoken - A hermeneutic approach to understanding the cultural viewpoints that
underlie consumers expressed meanings.' Journal of Consumer Research, 21 pp.432-
452.
Toulmin, S., Rieke, R. and Janik, A. (1984) An Introduction to Reasoning New York:
Macmillan

Trist, E. L. & Bamforth, K. W. (1951), 'Some Social and Psychological


Consequences of the Longwall Method of Coal-Getting', Human Relations, Vol. 4,
pp. 3-38.
Turkle, S., (1984) The Second Self: Computers and the Human Spirit. London:
Granada.
Turkle, S. (1995). Life on the screen. New York: Simon & Schuster.
Urban, G.L., Hauser, J.R. and Dholakia, N. (1987) Essentials of New Product
Management Englewood Cliffs, NJ: Prentice-Hall

Utterback, J. (1979) 'The dynamics of product and process innovation' in Hill, C. and
Utterback, J. (eds.) Technological Innovation for a Dynamic Economy New York:
Pergamon
Utterback, J. (1994) Mastering the Dynamics of Innovation Boston, MA: Harvard
Business School Press
Von Bertalanffy, L. (1949). 'The concepts of systems in physics and biology' Bulletin of
the British Society for the History of Science, 1, pp.44-45

Von Foerster, H. (1984). 'On constructing a reality.' In P. Watzlawick (Ed.), The
invented reality New York: Norton pp. 41-61
Von Hippel, E. (1978) 'Successful industrial products from customer ideas.' Journal
of Marketing, 42(1): pp.39-49
Von Hippel E. and Tyre, M.J. (1996) 'The mechanics of learning by doing: Problem
discovery during process machine use' in Technology and Culture V. 37 No. 2 April
1996
Vygotsky, L.S. (1978) Mind in Society Cambridge, MA: Harvard University Press
Vygotsky, L.S. (1986). Thought and language. The MIT Press, Cambridge, Mass.
Wallman, S. (1984) Eight London Households London: Tavistock
Walsh, V. (1996) 'Design, innovation and the boundaries of the firm' Research
Policy, 25, pp.509-529.
Weber, M. (1947) The Theory of Social and Economic Organization, tr. by A. M.
Henderson and Talcott Parsons, (ed.) New York, Oxford University Press.

Walkerdine, V. (1990) "Video Replay: Families, Films and Fantasy." Reprinted in


Manuel Alvarado and John O. Thompson (eds.), The Media Reader. London: BFI pp.
339-357

Wartofsky, M. (1979). Models, representation and scientific understanding. Boston:


Reidel.

Wasserman, N.H. (1985) From Invention to Innovation London: Johns Hopkins Press

Wasserman, S., & Faust, K. (1994). Social network analysis: Methods and
applications. Cambridge: Cambridge University Press

Webster, F. (1994) Theories of the Information Society London: Routledge

Weiser, M. (1991) 'The Computer for the twenty-first century' Scientific American
265 (3) pp.94-104.
Weizenbaum, J. (1976) Computer Power and Human Reason: From Judgement to
Calculation New York: W. H. Freeman.
Wellman, B. (1988b). Structural analysis: From method and metaphor to theory and
substance. In B. Wellman & S. D. Berkowitz (Ed.), Social structures: A network
approach Cambridge: Cambridge University Press. pp. 19-61

Westrum, R (1991) Technologies & Society : The Shaping of People and Things
Belmont, CA:Wadsworth Pub. Co.

Wheatley, M. J. (1992) Leadership and the New Science: Learning about
Organisation from an Orderly Universe. New York: Berrett-Koehler

Wheatley, M.J. and Kellner-Rogers, M. (1996) A Simpler Way San Francisco: Berrett-
Koehler

Wheen, F. (1985) Television London: Century


Whiteside, J. and Wixon, D. (1987a) 'The dialectic of usability engineering' in
Bullinger, H.J. and Shackel, B. (eds.) Proceedings of Human-Computer Interaction –
INTERACT '87 North-Holland: Elsevier
Whiteside, J., Bennett, J. and Holtzblatt, K. (1988) 'Usability Engineering: our
experience and evolution,' in Handbook of Human Computer Interaction (Helander,
M. ed.) North-Holland: Elsevier Science Publishers.
Wiener, N. (1948/1961). Cybernetics: Or control and communication in the animal
and machine. Cambridge, MA: MIT Press.

Wiklund, M., (1994) Usability in Practice: How Companies Develop User-Friendly


Products. Cambridge, MA: AP Professional.
Williams, R. (1974) Television: Technology and Cultural Form. London: Fontana
and Collins
Wilson, P. J. (1988) The Domestication of the Human Species London: Yale
University Press
Winograd, T., & Flores, F. (1986). Understanding computers and cognition: A New
Foundation for Design Norwood NJ: Ablex.
Winston, B. (1986), Misunderstanding Media, London: Routledge and Kegan Paul
Wittgenstein, L. (1953) Philosophical Investigations Oxford: Blackwell
Wixon, D, Holztblatt, K. and Knox, S. (1990) 'Contextual design: an emergent view
of system design' in Proceedings of CHI'90 Seattle New York: ACM pp.329-336.
Woolgar, S. (1991) 'Configuring the user' in J. Law (ed.) A Sociology of Monsters
London: Routledge, pp.57-59
Woolgar, S. (1996) 'Technologies as cultural artefacts' in Dutton, W.H. (ed.)
Information and Communication Technologies: Vision and Realties Oxford: Oxford
Univ. Press pp.87-102.
Yin, R.K. (1989) Case Study Research London: Sage
Zuboff, S., (1984) In the Age Of the Smart Machine: The Future of Work and Power.
New York: Basic.

Appendix 1 - The Cambridge Trial Participants

Who was on the trial?


As outlined in the previous chapter, Om faced a recruitment problem with Phase two
of the trial. However, this was solved partly by recruiting via schools in particular
areas of Cambridge. It was anticipated that by giving schools authorware, family
households would be attracted to participating through being able to have a 'gateway'
into their children's school and their schoolwork.

In July/August 1996 there were 66 households in total, and these were interviewed
by telephone by NOP, who administered a 10-item questionnaire. The following data
are based on their findings. 31 of these households were family homes with children.
In all there were 210 people involved in the trial; 56 of these were children
under the age of 16 (average age 3.5 years). There were 11 children over 16.

The average age of the chief income earner was 37 years old. Of these (n=66) 29
belonged to demographic group AB, 33 to C1C2 and four to group DE.
Most of these described their households as light (26) or medium (24) television
viewers as opposed to heavy viewers (16).

Perhaps unsurprisingly, most of the chief income earners watched television 7 days a
week, regardless of whether they were light, medium or heavy users, and regardless
of demographic. However, the amount of time spent viewing television
each day was lower in the higher demographic group AB (approx. 3 hrs a day) than in
group C1C2 (approx. 3.75 hrs a day) and DE (approx. 5 hrs a day). A quarter of the
families with children (of any age) watched between 2 and 3 hours a day.

PC ownership
Approximately half the trialists (37; 56%) owned a home PC (not a games console).
Most owners were in the higher demographic group AB (20; 69%), against 16 (48%)
of C1C2 and one person from the DE group. There seems to be some relation between
PC ownership and weight of television viewing: 73% PC ownership in the case of light
television viewers compares with 25% PC ownership in the case of heavy television
viewers.

Weight of TV viewing

                  Light       Medium      Heavy
TOTAL             26          24          16
Yes, own a PC     19 (73%)    14 (58%)    4 (25%)
No                7 (27%)     10 (42%)    12 (75%)

There was roughly an even split in PC ownership/non-ownership (45%/55%) between
those households with children. Part of this was due, of course, to the households
meeting the selection criterion (which was dependent on PC ownership).

The information derived from this study is weak in the sense that it illustrates little of
the dynamics of household affairs above and beyond what pertained to the question
items.

The sample for the qualitative interviews


From the 66, an initial sample of 11 households was chosen for the qualitative research.
In terms of household composition these were:

F002 2 adults, both working. No children at the address.


F010 Couple, 2 children (1 boy 1 girl). She - solicitor, he - local Councillor
F011 Couple. 3 older children 1 of which is studying.
F016 Couple, 2 children.
F018 Young couple, not married, both working
F019 Single male, working.
F025 Couple, both elderly & retired
F028 Couple, both elderly & retired.
F033 2 adults, single, studying
F040 Couple, 1 child age 13
F071 Couple, Both working.

N.B. The number denotes the identity of the household by sys-log.

This sample was supposed to reflect the criteria as defined by Laura Helm (NOP)
and Will Collin/Jon Wilkins (BMP DBB) (outlined in Chapter 9). There was another
household added to the list.

The sample was recruited by a London-based fieldwork recruitment agency used by
NOP - Phoenix Fieldwork Ltd. The agency was responsible for contacting those
households which were allocated to the qualitative study, asking them to participate
in our project, and arranging the times for the interviews. In the case of my interviews
they had failed to confirm the interviews on the day, so on arrival at one of my
households at the appointed time the interviewee was found to be out. The other
interviews, although completed, had to be rearranged from the scheduled times. This
was because people were not available due to their own rearranged plans, family in
hospital, and one household busy making dinner for their lodgers.

The 12 households were divided equally between the three participating
research teams - myself, NOP and BMP DBB. We each had a discussion guide/checklist
which was drafted by BMP DBB. This had emerged from a sharing of issues (see
previous chapter) and from substantive work I had done in the way of questionnaire
development. The intention was that this would be implemented in an open-ended way
to promote discussion of the use of the system.

In each case, all of the family members present were presented with the questions. Answers
were spontaneous and sometimes negotiated and/or contested. The guide provided
some structure for the conduct of the interview. This was of importance in
this study due to the use of multiple interviewers. Interview guides provide some
element of standardisation across interviewers, whose approaches, manner,
appearance, communication abilities etc. may vary (Robson, 1993, p.236). Most
households had been contacted by Om in April 1995, and connected in September
1995.

CASE STUDY 1 - F040


Two adults and one daughter (aged 14 at the time of interview). The father and
daughter were interviewed. A two-TV-set family, the daughter had her own set in
her room, although this had only been there for six months. They described
themselves as heavy viewers, using both cable and video-recorded programmes,
although the father stated that TV does not intrude into their lives as they "tend not to
be driven by it." There was some contention between them in respect to what the
daughter actually used and which programmes she viewed. They did not watch
soaps. Their main motivation for signing on to the trial was to get the free movie
channel as well as being curious; they wanted "to see what is going to be involved".

They had understood the term "video-on-demand" to mean "You pick up a film
[such as from the video store] that [or when] you missed the programme . . . The
impression was given that you if you missed a programme after a couple of hours it
would be on ITV you just (click) and put it back on the interactive thing."

The father felt that he had "got hold of the wrong end of the stick", in that he
anticipated that the i-Tv service would have permitted the pulling up of programmes
he may have missed. Of course, due to the content limitations on the trial, such a
facility was not available; however, he "had watched the news when something big
has happened and we have missed it." The daughter had watched all of the survival
programmes. They had listened to a "couple of the radio programmes but they don't
seem to change very often so it's the sort of thing you go in there and you click it
through and you think you watch the screen and you get a radio programme and it
feels odd." Scaling the service against the Internet it was valued as a "1 or 2" (out of
10) by both father and daughter.

The father's main use was now "90% Internet". If he really wanted to use Internet he
felt he would probably "hire a PC and get the proper gear and do it properly." In
respect to paying for services the father "hadn't given it a thought," but felt that if the
service included the Internet he would pay for it (the £5 suggested by the interviewer,
and possibly £10). There was some contention around the daughter's use of the
Internet facility. The father saw that she could not access the Internet through the
system, but she insisted that she could but could not "get anything I want to." They
had managed to access a pornography site which should have been filtered. This
painted quite a different picture from the father's description of himself "charging
through the Internet". They stated that he now accessed the system about once a week,
predominantly to use the Internet; the daughter, on the other hand, used it quite often
when it first arrived and then "completely forgot about it." The daughter described
the mother as using it "occasionally about the same as me", which was "About
twiceish" a month. In respect specifically to the Internet facility, the daughter
thought that the mother "hadn't actually found the time to go through it." The
father, however, disagreed, saying that his wife had accessed it "hardly at all."

The father was not a Nat West customer, however he was interested in what Nat
West were doing (he had previously worked for Barclays). Again, this was driven by
him being "just intrigued with what they were doing and how they were doing it."
The daughter felt that she was not even drawn to access the Nat West service
"because the bit at the beginning when he's (the actor) sitting in the chair and it's so
irritating."

Their reaction to the notion of advertising was favourable. They had associated the
banner ads on the WWW to be that which was paying for the service. The father
could not understand why the BBC "doesn't have adverts." The advantage of being
able to "zap" through the ads was only seen as a slight benefit to the father, who was
not bothered by them. The daughter recognised that the adverts were sometimes "a
lot better than the programmes." As regards target marketing, the father
rationalised the process, associating it with junk mail, and criticised its weaknesses in
relation to properly asserting an individual's status and circumstance. As for on-line
market research, they found the on-line questionnaire a "fairly laborious process." He
also felt that he was "just trying to feed good lines through."

In terms of usability the father found it a "Piece of cake . . . Very easy, very slow."
He was "tempted to say you can lose interest, but I won't, when you are waiting for it
to happen." However, he felt that "too slow is pushing the case but it is slow but I
wouldn't say too slow." He saw that this was relative, and depended on "what
you are used to."

They did not subscribe to movies and sports channels as they were not interested (he
felt they only watched in the region of one movie a month). His job involved
evening work and demanded flexibility in terms of time. He did not know when he
was going to be in in the evening, and if he were out in the evening he would be
unsure of what time he would return.

In respect to his view of the future of i-Tv, he saw that they (those managing the trial)
would have to sort out a lot of "bugs" and "get rid of things." They would have to
speed up the operation of the system, as people are used to "instantly entering." They
would have to raise the content level. He saw the problems with copyright and
performance rights as the main obstacle to the system's success. He felt that once
VOD and the Internet were working then they could take time to develop the other
services. He also noted that much of the local intelligence that was supposed to come
through the i-Tv was actually on the Internet. In terms of hardware he would like to
have an add-on alpha-numeric keyboard (for the Internet).

CASE STUDY 2 - F011


The second case study was a family home which comprised three children, a
mother (who was at home - a full-time carer for her children) and a father. Two of the
children were present for the interview. They were a four-television family (one set was
recognised by the mother as broken): one set in the living room, with a VCR, and
one in each of the kids' rooms (3 bedrooms).

Their patterns of viewing were mainly the children in the afternoons, with the
children's programmes. The mother generally watched in the early morning and in
the early evening with soaps. The father's chief viewing time was from 7pm
onwards till about 10pm, when he often went to bed (he rises at 5.30am). He was
particular and would watch strategically. The mother would sometimes video
material in the morning (mainly soaps such as Sons and Daughters and Neighbours) to
watch after 10pm, or would watch a late film. The children also watched the Cartoon or
Disney Channels. The children got to dominate the TV on Saturdays (through the
day).

They evidenced a strong strategic position in their viewing using the VCR. The
mother could have up to "two hours to watch of videos a day" (mainly pre-recorded
soaps). Due to her obvious heavy use of video material she welcomed the notion of
having on-demand soaps. They subscribed to cable and had the movie option,
although they were pensive with regards to how much value it was for them. They
would sign up to the movie option in winter, using their video recorder to record a "lot
of films", which they would then watch late at night in the summer (when they spend
more time sitting outside and going for walks): "That's why we tape more because
then we can choose when we want to watch it." They also taped things for the kids as
well. They had possessed a VCR "for years" and in fact could not "remember when
we haven't had one really." The mother felt that TV tends to "rule you a lot" and that
she "had to record these certain programmes."

The father saw that his wife was "addicted" to certain programmes. He also saw a
similar effect in his son; "It's good control, I mean, like him. Half the time he doesn't
sit down but soon as he comes out of school you put Disney Channel on and he'll sit
glued to it until whenever - teatime." However, it was recognised that "Half the time
it's on and nobody's watching it." The father put this down to a "habit."

They related that they use the radio channels while they were doing things around the
house. The father's main leisure time activity was fishing; "usually go at least twice a
week. I've watched fishing programmes not to get information of them because, I
know quite a lot, you know I fish, I'm a match fisherman, I fish matches every week .
. .I don't spend my life fishing or anything . . . But it is my passion" He alluded to the
fact that watching fishing was "not for information" but rather for enjoyment "I mean

432
you are sitting there watching that float go down and hope it goes down because [he
knows what it feels like]."

Both had completed the on-line questionnaire. They both felt that this method of
answering questionnaires was easier to administer.

In respect to news the father used teletext over i-Tv; "I think at the moment - call it
laziness or what - but if that plug is switched off on the interactive, rather than go up
there, turn that on, get my pin number out and put that in, I can only just flick a
switch on this and I have got teletext on."

They referred to the schools link as being "very good" when it is all up and running.
She thought that it would be a big part of schools. Their child had used it at school
but "he says he's used it, but he hasn't really come home and said what he has been
doing on it." However, even though his school was listed, there was no access as yet.
They had not had much contact with the schools via traditional means.

They drew from the promotional video the impression that you could "call up any
programme you want and whatever . . . the way it came across to me was it was like
hiring a video but not paying for it in that sense because you could watch what you
like, watch what you want, stop, go back to it . . . Watch some of the film and go
back to it which is exactly what we do [already with the video] . . . , call up any
programme you want and whatever" It had also mentioned something about the
children and computer games.

The banking services were unattractive, as they were not with Nat West. In respect to
ads, the father did not want advertisements which interrupted programming. He
liked quizzes, and the thought of a quiz on a subject he liked, particularly if
there were a chance of winning something, appealed to him: "If you had a chance of
winning something, yeah, I think that would influence a lot more people."

Both felt that i-Tv carried potential for the future; "If it does everything it says it's
going to do then yes - because like the games - it replace - needing your, I mean at
one stage we had got 4 different games consoles and yes, something like that."
[mother] "I mean - 5 [games consoles] we had - but as I say I think if there was a lot
more scope of programmes on, yeah I again, I hardly ever go to a video shop or I
don't go to a video shop to hire a video but there are a lot of things I would like to
watch, not only film-wise, it's great I like things like wildlife - any programmes like
that."

CASE STUDY 3 - F019


The next household was a three-person shared household; the interviewee was a
young single male. The household viewing activities were relatively insular, with no
communal living room. However, there was a television in the kitchen, which was
black and white and was used like a "stereo . . . when the house is really quiet it's like
having somebody else there in the room isn't it." The interviewee's television was the
only TV connected to cable. The TV was a 25" model, with PROLOGIC sound. His
use of television was an hour or two in the evening (after 6pm, perhaps 10.30pm -
11pm weekdays) and about half an hour in the morning. He made several references
that pointed to the fact that his main use for television was for relaxation, although
he generally felt that this was time "wasted." He had about 5 or 6 programmes that he
would strategically watch per week.

He did not know what his plans were from "one night to the other, " as he tried to
"get out as much as I can." This is especially true in summer, and more generally at
the weekends. He had just recently bought a SONY playstation and this was now
absorbing more of his leisure time. His use of hired pre-recorded videos was rare, as
he did not "often get time to just sit and watch a film." He rated cinema above
television for new movie releases. However, television was considered to play a
major role in his everyday life; "It is something - it's like a connection to the outside
world and that's why in the morning the first thing I do is switch the TV on and listen
to the news and then if there is anything happening . . ."

His main motivation for going on the trial was "Because I just thought it offered so
much opportunity to try anything. I am quite, I'm really into technology . . .I think it
is really exciting what was actually offered potentially." These expectations were
dramatically boosted by the literature. This view, tempered by actual use, had been
jaded by the lack of content, but he remained very positive regarding its potential,
which he saw as "incredible."

With respect to educational material he hoped for content which would add to
lifestyle activities: how to play sports better, cooking, DIY.

He noted that from the time the installation date was suggested it took a further 6 or 8
months to actually connect; "I thought the admin. at the beginning seemed really
good but then when the practice that came along it was just like a bit shaky."

His expectations for the shopping were not so articulated; "I wasn't really sure
because I didn't really understand how the shopping would actually work . . .I just
sort of waited to see how it would work or not as the case may be."

He also expected networked games; "that could link to whoever else was on the
network on the interactive TV at the time so you could actually play against other
people." He felt his television viewing routines would change if the i-Tv system
worked properly. He would still watch his 5 or 6 programmes through the week, but
the "spare time instead of just sitting there and just aimlessly flicking through or just
watching a programme then I would be using the interactive TV instead." He felt
that targeting his viewing, through the choices available with i-Tv would make his
viewing time more "productive."

He found the service slow. The general usability of the machine was considered
good, however not all of the services had been reliable. He claimed that he would not
know how to access local information from teletext; however, he associated the web
site with being something specific to Cambridge. He accessed local information 2 or 3
times a month, mainly to locate local intelligence such as cinema box office phone numbers.

While aware that i-Tv was still in its developmental stage, he felt that it was nevertheless a part
of the future; "Yeah, I think it has got a really big input into the future and it has got
so much potential - you know it is quite sort of WOW you know. What could
happen it's just unbelievable but it's just the reality at the moment is a bit sad really.
Yeah I think yeah it has got to be some form of interactive you know who does it or
how we do it - there has got to be some."

Compared to the Internet he found that "It's a lot more user friendly system. It's like
everything you put down well what do you think of the interactive and you say well
if you put some stuff on it, it will tell you. If everything was on it, it would be superb
. . . it's [i-Tv] probably better because it's a much smaller - I mean interactive
[Internet?] is so huge that trying to meander round it is just really difficult. Where
they have gone and made small little area of little titles and so intrinsically they have
got a much easier task to make it easy to use."

The main problem was that they "promised so much and yet didn't have the 10% of
what it promised which tends to give you a bad [feeling]." This respondent also 'read'
meaning into the lack of programming: "They are obviously having financial problems or are
they just - or they have got bad management or are they actually having problems
and about to collapse or you know." He had a girlfriend in London who had insight
into the financial affairs of Om and indicated that they were having difficulties.

CASE STUDY 4 - F018


Interviewed were a cohabiting couple in their early 30s - no children - who have
student lodgers who board with them. The television is switched on around 3.30pm,
and switched off around 11pm till midnight. In the mornings it is generally switched
on around 6 till 6.30am for news (man) and again from 7 till 7.45 for the Big
Breakfast (lady). The man preferred news and weather, sports and these
documentaries, survival programmes and "that kind of thing," while the woman
preferred viewing was soaps. They both watch quite a few movies as well. They
receive the Movie Channel (provided by the trial) and they had a large collection of
pre-recorded videos over here. Before the trial they would only subscribe to the
movie channels on cable during Christmas. They also subscribed to sports. They
also video tape a lot of material. Since they had lived together they have not bought
as many pre-recorded videos as they did before. They normally tape films and
comedies. They had two TVs in the house, one in the living room and one in their
bedroom. The living room is shared with the lodgers and the bedroom TV is used for
the woman watching soaps. The man also used the bedroom TV to help him go to
sleep.

Their main motivation for coming on to the trial was curiosity; "I was just interested
to actually see what this new concept could actually do and I liked the ideas of
watching, because one of things is nature programmes and there is a massive
selection of that which is my cup of tea." In respect to their anticipation of using the
system, the woman remembered being told "that you could actually decide what you wanted when
you wanted to watch. But we really honestly, until they actually brought the box in,
we didn't really know what it was going to involve." Services such as shopping were
anticipated as "doing the shopping [via the TV] you can place your order and then
pick it up . . . I quite fancied that rather than travelling round Tesco's." The man
thought that this "really does appeal." This was primarily due to them not having a
car.

The woman felt a little reticent about ordering from on-line catalogues, however she
had a catalogue and this was the only place "we could buy some of the furniture from
that we have got. Yeah we try not to use it excessively . . ." They had not used the
interactive news "it's just local news that comes on . . . I mean when you have got
Cable TV and you have got that Style News CNN news on constantly and it's as easy
to flick on to news than it is to go into interactive TV. Isn't it?" They had had some
kids over and they had used the children's material, "but they get bored with it after a
while because it's just the same thing all the time."

They were ambivalent about TV advertising: "any sort of TV has got to be paid for
somehow." The woman thought there could be a place for classified adverts: "So like
in the local paper where you have got different headings for different things like jobs
vacant, accommodation to let that sort of thing." Ads which offered cash prizes,
discounts etc. would be welcomed.

Generally they felt that there was a poor choice of programming available through
the service. They wanted to see quality programming (such as the detective series
Inspector Morse); "I mean that's the thing if there were things like a selection of
Morses, Taggarts and things like and you can just go in select one of that well I mean
that would be brilliant. I mean that's better than having Cable TV if you can watch
what you want when you want it."

They had not even noticed the on-line market research, and there was no mention of
web access or educational programmes.

i-Tv was viewed as making "the video shops redundant" (woman) or as set to "wipe them out"
(man). But the trial was recognised as having to illustrate its potential: "But you
really need to see what they can offer as a package. It's not like you know you got
biscuits it's not like you can have a nibble and you don't know what else is inside you
know. You have got to see . . . You have got to eat the whole bit really and then you
can - I mean if it was as it is now and I had option to buy it - I wouldn't buy it."
However, he followed up by saying "If the package improved, the selection
improved and then it would be a real - or we could then afford to dump Cable and
take out interactive TV. . . I wouldn't even need to be pushed on it. I'd get rid of
Cable."

It was clear that entertainment would lead the field for pay i-Tv; "I don't think it
would be worth spending the money to have interactive TV just to save you going to
Tesco's" they considered that cable as it is cost a lot of money. They anticipated that
i-Tv could be prohibitive in cost "You think how much pay off to Cable in a month.
If you have a full package with Cable that's £32 a month. It's probably more than
that now - I think it's about £35 a month . . . mean if you then paid out for interactive
TV on top - that's a lot money. We think you can then pay it like we do it - rental for

436
a TV and video - I mean before you pay for interactive TV it's about £60." The man
still agreed that cable would go in favour of i-Tv, for comparative scope of
programming "Cable had the same and interactive had the same but with the same
package but with a facility that you can use on interactive - the selection programme
- you know being able to select what you want when you want - then Cable would
go."

In terms of usability it presented no obvious problems apart from the issue of slowness.
The woman alluded to an awareness of the problems of shared viewing experiences
of interactive media. She felt it also strange to listen to the radio through her
television, "It seems to be a waste of a screen really doesn't it?"

The man had shown friends the system, "we were showing our friends it yesterday
and they think it's marvellous. Marvellous concept." He felt that the banking service
would be useful, "because there is times we want to transfer money into our other
accounts and we wouldn't have to go down to the bank to do it. Because we both
have a joint account and Tracy has her own account and sometimes we switch money
there to pay bills you see. I mean we wouldn't have to go down and that would be
ideal." . However, they were not Nat West customers.

They both felt that i-Tv would be a desirable service, if only it had a greater selection
of programmes. The woman made reference to satellite and cable when they first
arrived in the market, "You didn't know what that was going to involve." Apparently
when Cambridge Cable first came out it was going to offer pay-per-view, but this
had still not materialised.

CASE STUDY 5 - F010


Household 5 was a family home. The mother and father were present. There were
two (male) children. They were a one-set home. The man considered himself a TV
addict, much more than his wife. They consciously try and "limit the amount of
hours the TV is on. I am one of these people who come and turn the telly on and go
off and do something and sort of snatch part of it but you tend to find with small
children if you put the TV on they will sit in front of it and not do anything else so
we tend to restrict in what they can watch other rather restricting ourselves what we
watch within the actual hours." The mother was also vocal regarding the children's
viewing "They are only allowed to watch an hour - an hour of TV at night time and
that's it - nothing in the morning." The mother was also very selective about what she allowed
to be viewed, believing that cable tended to show more 'violence' than
traditional broadcast channels.

The father felt that i-Tv would serve to economise on the subscription to the movie
channel. He suggested that i-Tv had impacted on their TV viewing when they first
had it. They had explored it to find out what was on; "I was quite disappointed
actually that once you had actually gone through it and watched everything . . .there
were no new services come up - there is all these great ideas - I know it is only a trial
but I don't feel a great deal has happened within the year or so."

They were not Nat West customers, and had "so many accounts and everything sort
of it's too much hassle to change [banks] . . . with all direct debits it a lot of hassle to
change on the direct debits." However, they remained interested in what Nat West
had to offer through the i-Tv service: "You can pay your direct debit bills so that would have
been something that we would have used because it is a lot easier than having to
stand in a Post Office and queue." News was of limited use due to its restricted
content. The husband added: "There was talk of Tesco's doing their wine-list or
something I believe and buying [inaudible] - I don't think that has happened
yet has it?"

Advertising was seen as a way in which "they would have more revenue coming in
and maybe they are looking to do more research into the Internet, I mean into the
interactive TV. . . There is nothing wrong with advertising if you are watching a
particular topic or whatever - I don't see any reason why you shouldn't have adverts
on that relates to that topic. Probably be quite happy to watch that. But I wouldn't
like to see normal sort of rubbish that you get on mainstream TV where it is cars and
drinks and what have yous." Generally, they felt that advertising should not intrude,
but should rather be kept to a separate advertising, commercial and retail section.

They were generally amenable to market research type surveys through the system:
"I mean if somebody who had made (?) sort of food or washing powder or whatever
was to do like you said want to know what your personal habits were and what sort
of washing powder you wanted to buy for whatever reason and in return for doing it
you perhaps got a coupon through the door or something - then yeah we would do
that wouldn't we? . . . If the adverts are grouped together, certainly I know you will
get fed up with new car adverts and that sort of stuff, but if you were thinking of
buying a new car then it would be nice to think you could just punch something up
on your... and you would get a selection of car adverts or something with information
where you could write off you know to get more information." They would value
having more factual information about a product.

Access to the Internet was recognised as severely limited by not having a keyboard.
The Cambridge web site was the one through which they accessed news. Again, a
keyboard was cited as being of use to access the service further. The wife, however,
had not used the web service at all, but nevertheless saw it as a desirable
possession. They lacked a direct comparison of the web service against the
performance of a PC; "It's very slow loading up - quite a lot of the things - I know
there was a lot of the picture and stuff is does take a long time to come through. I
don't know with the normal computer takes as long to load as it does on here or if it
is the way it has been presented here it is taking longer."

The mother saw that the system was "not very children friendly. I mean Tom is used
to using a keyboard at school - he is used to using computer - so that's not like an
alien to him." In respect to education the mother, felt that the programmes used for
arithmetic and maths had obviously been bought from the US or Australia as the
accents were not English. She felt this would confuse her children. Again they felt
the choice available was limited (maths, English and science), they felt he should be
able to do geography.

The father also suggested that the remote control was not very "user-friendly", a view the
mother shared. The father saw that "The way the buttons were you
have to be very precise to actually get a response from the TV in relation to it. It's
not like an ordinary video or cable where you just sort of push it in - doesn't have to
be directly pointed at the TV - it's very sensitive as to how it works or not." In
operation they found that it was slow to start up. They also made mention of the way
in which it kept shutting down "or suddenly switches off." They felt as if it was
getting better.

They did not use the interactive radio service, mainly because it did not carry the
stations that they listened to. Any radio listening was a "kitchen" or "upstairs"
(bedroom?) based activity, "used for food making, tea on or washing up whatever
you tend to be doing," or when "getting ready in the morning."

The box had also functioned as a talking point when they had guests: "You had
friend round and you had friends round and they said what's that box there and she
explained what they stick a code number in and people will come and sit here and
play with it and go through all the things of course. So it has not just been exposed
to us it has been exposed to friends, family you know, other neighbours have been in
the habit of coming in playing with it so it has had quite an audience hasn't it."
People's reaction to it had been "quite positive." This was put down to the fact that they had
"genuine" computers and were web literate. However, it was felt that if they actually
lived with the machine their reaction would be "similar" to their own (critical?).

This opened up a line of thinking regarding the diffusion of domestic media
technologies via informal networks: "Well we were keen because when you hear this
idea it is like - I can remember a video first coming out about sort of 1980 and you
wanted someone that had got a video - you went round their house to see what they
had. It was when we first had interactive games and that. We had the original space
invaders - it was something really new and it was exciting - it didn't do a great deal
but it was something different and it's the same thing here. You know you think this
is novel technology now in 15 year's time we will look back and every home will
have one and we were first to actually try it out. Well that's probably going to be the
case probably in 15 year's time or whatever it will be everywhere but at the moment
at the stage it's working at isn't enough to sustain interest."

They would not subscribe to i-Tv in its present state, although the wife thought it
"inevitable" that it would be a part of the future. However, they felt that "Whether
you would actually buy a set-top box or it would be in conjunction with someone like
Cambridge Cable where it was provided as a separate service you could access that
way I am not sure."

CASE STUDY 6 - F011


The next household comprised a mother and her daughter. The father and a 17-year-old
son were not present. They also took in boarders (students). There were multiple
televisions: one in the living room, and four in various other rooms. The
living room set was the communal television. The mother hardly watched television at all,
her daughter "sometimes." Viewing was dependent on "who is in and who is
around." They very rarely watched it as a family. There was some disagreement
between the daughter and son over who watches and who does not. This could be specific
(i.e. he prefers sports and does not like the daughter's preferences for soaps and chat
shows) or unspecified (any video she has chosen).

They did not watch many of the satellite channels, except for the daughter, who watched
MTV. The girl was very strategic with her viewing and accessed a guide, but at times
when viewing non-directively she would "go from [channel] 1 downwards." The
mother only viewed news and occasionally "old films, really old films." The mother
was particularly scathing of much TV content. She never hired videos, but the
daughter hired them "every few weeks." They had only acquired the video machine
in the last few months and were "still getting used to it."

Neither the mother nor the daughter had much to comment on the use of the system, as
the main user was the son, who could not be present at the interview. However, it is worth
noting that it was the son and daughter who had advocated joining the trial. The
mother had actively resisted, and there was some contention regarding the father's
point of view (the mother saying that he was resistant as well, concerned about how
much of the television the children would watch; the daughter saying he embraced
the idea of having it for the "sports and stuff"). There was no evidence to suggest that
the father actually used the system.

The mother's reticence was explained by her wider beliefs about the de-humanisation of
social interaction through screen-based activities: "No it's not technology. It's
something about the ability to lose human contact and if you really wanted to you
could actually ultimately, I suppose, do everything in front of the screen and never
talk to another human being. He [the son] started communicating with somebody on
the computer at school and they arranged to do some swaps of things and work
together or something like that and it was how shall I recognise you, you know,
where shall we meet da, de, da, de, da and when it actually came to it they found that
they had been actually sitting next to each other for the whole term and had never
spoken to each other."

CASE STUDY 7 - F002


The next household was a one-set home, with a video. They were cable
subscribers. They rented on average one or two videos a week. They recorded
programmes often. They preferred films, news and documentaries, but did not like
games shows. The woman saw that TV was used only "sometimes" in the morning or
lunch time. The man saw that due to work commitments they often had to work in
the evening and the television would be switched on around 9 p.m., for "an hour or
so before going to bed." At weekends, they often invite neighbours in to a "sort of
television session" where they will all watch a pre-recorded tape.

The main motivation for entering the trial was that "you would be able to choose to
watch the programmes that you wanted to watch, when you wanted to watch them. I
mean the thing that I was personally interested in was the Tesco supermarket
shopping . . . providing that having ordered your purchases you can receive them
reasonably quickly. I mean if you had to buy it today and then be around tomorrow
at a certain time and say the delivery van turns up any time tomorrow morning that
wouldn't be useful at all." The man added: "First thing is I don't like going shopping to
supermarkets . . . The idea is that shopping is a pain and they should be able to make
it easier and faster." He then stated that on the service you would need "to
have pictures of everything," to which his wife retorted that she did not "know how they are
going to organise that."

They had not explored every aspect of the system; education and the web went
unexplored.

The banking option was interesting but they did not have a Nat West account,
however he remained open regarding opening an account if real benefit was obvious
through the functionality and service; "I looked at it to see what you could do and the
feature that seemed to be interesting to me was [that] you could automatically pay
your bills rather than send a cheque or go to the bank or set up a standing order. . . If
my bank was on I would use it . . [if] it was interesting enough to use it if I had it but
not so much to go to the trouble of opening a NatWest account because you the
longer you are in the bank then your credit record is OK. It is an awful fuss to
change a bank account."

The man expressed that he had expected CDs and music to be more available on-line,
"rather than it actually being just a selection of BBC radio recordings of
programmes. We don't listen to the radio at home I only listen to it in the car. Well
that's actually not true we do listen to the local stations at home quite often. We only
listen to specific programmes."

They experienced few problems operating the system "once you got the hang of it."
They had noted that the 'film' slows down occasionally and it is not a perfect picture.
They were the only interviewees to remark on the quality of the MPEG encoding. They also
remarked on the audio quality: "I get the impression there is a loss of sound quality. I
mean we have just got a hi-fi which we can actually connect to the video so you can
connect to there on the interactive and you can play the hi-fi through that. We hadn't
tried it out on the radio to see what we had got very good stereo so. And it feels a bit
strange to see the television because when you switch on the radio you get a little
picture of sky . . . A waste to have the television on just for the radio." Prompted on
how interactive TV had impacted his viewing, the man replied: "I used to come out of garden
and it's nearly time for the news and you are kind of bound by the clock but now I
don't bother I just come home and switch on and you have . . . The interactive thing
and I can watch it when I want to . . . it has given me more control over my
television."

The functionality of being able to fast forward and reverse was considered useful: "the
video facilities that is reverse and pause - that is good." They also referred to the
news updating. Apparently it was updated every week day, but not at the weekends.
The last recordings were on Thursday, "so on Sunday you still get Thursday's
programmes." It was also noted that the programmes were BBC orientated while it

441
used to be ITV. This was important because the man "It used to be but I personally
like watching ITN. It depends some ITN news programmes are better than BBC1 so
I watch News at Ten as opposed to the 6 o'clock news. I watch the news at 5.40 on
ITN. I think that is better than BBC. I think at the moment it is just BBC that it on
here."

Regardless of his reservations the man still thought that i-Tv was part of the future.
"Yes, I would pay money to have one if I could watch all the programmes that I
watch when I want to. We don't at the moment pay for satellite television." They had
taken on cable as part of the trial, but did not subscribe to movies; "I mean those are
the ones that are worth paying for but we haven't bothered paying for them because
rather than watch films on cable television we would rather pay to get a video out.
So the only reason for getting Sky Movies is because you get the latest releases but
they are available on video so why pay 20 pounds a month to see 50 films of which
you might want to watch three and you could pay whatever it is, 6 pounds to get all
three from the video club."

The main advantage was "Apart from watching programmes when you want to I
think high street shopping, banking and all the sort of things that are very difficult to
do outside normal working hours."

CASE STUDY 8 - F025


This household comprised a couple with no children; the wife was absent from the
first part of the interview. They were a two-television household, with the main living
room set providing the cable service supplied as part of the trial. They
never rented pre-recorded videos. The wife was an "addict" of the news. The man
preferred comedies, and also liked films, which his wife did not like. He had been
watching these on cable. They also liked nature programmes. He saw that "In the
summer months you get more or less distracted from television." He remarked how
they had been connected to the Internet: "I am not able to make full use of it I am
afraid but it's... lucky to be able to get that I suppose. But there again I think we are
privileged in being able to be the first people in the world to be able to get it through
our television as opposed to a modem."

In respect to his anticipation of what i-Tv would be like he was unsure: "I didn't even
know the meaning of the word interactive. People kept spouting this out and I
seemed to be asking what does the interactive mean. But I know now - it's something
which you can react on." He had glanced superficially at services such as education,
and being a Nat West customer he found the banking service of some interest and
use: "I couldn't see how you get cash out of the bank through the cable but apart from
that I can see it did function quite well. Quite often, almost every day I used to tune
in to see what my balance of account was . . . It was bang up to date. You get it right
up to the previous day and I shall miss that really. Because getting your statement of
account only twice a month - it's a bit outdated. I have been looking forward to
getting home shopping. But that hasn't been forthcoming. I am wondering now
whether it ever will be." He deduced that the withdrawal of the service was an
indication that home banking was becoming defunct: "As home banking is becoming
defunct - that hasn't proved successful."

With respect to the radio he expressed some interest but was discerning regarding the
availability and sound quality of the radio via the television: "I rather tended to think
they were a bit superficial. You see you don't get the quality of reception that you
get on the hi-fi and I tuned in to the radio programme, the music programmes,
particularly the middle of the road music, not classical, not pop."

He felt that it was easy to use, "once you get a hang of it." He was intrigued by the
Internet: "It's, as I say, that has revived an interest in the thing as a whole and I
should like to be able to use it to the full advantage which I hope possibly only to [do]
after attending the open evening in September."

He felt that i-Tv would be part of the future; "Yes I do. When we went to the open
evening last autumn I think it was, they showed the system they would use if, for
instance, you want to book up to see a play at the theatre. They have got a plan of
the seats and you were able to choose the seats you wanted and sent it through the
line and you would be able to pick up your tickets. I don't know how you would pay
for them. Presumably you only got them. But that sort of thing seemed very useful
to me and it hasn't come on-line yet but I think it has a future. Certainly has a
future."

CASE STUDY 9 - F040


Household 9 was a family home. The interviewees were a father and daughter.
They were a multiple-set family - a television in the parents' bedroom, one in the
children's room, one in the kitchen and one in the lounge, which was the main set
and the one connected to the STB. There were two videos, one in the
parents' room and one in the living room. They were used mainly for pre-recorded
videos, and very rarely to record; they perhaps recorded two or
three videos every two or three weeks, mainly by setting the timer. It was
reckoned that the STB should have been connected to the kids' TV as their set was watched
more than the parents'. They indicated that they had the movie channel, but apart
from that they still watched "BBC1, BBC2, ITV and Channel 4 and rarely do we
watch Sky, Sky 1."

The children mostly watched the cartoon channel. The father was doing an Open
University course so he did not "get a lot of chance to watch TV." He was also a
councillor and was interested in various levels of political activities: "Yeah I am very
interested in international politics and then national and then local."

The household had been featured on the cover of the Acorn User Magazine and
interviewed. They had a computer at home and it had a modem. They had a
connection to the Internet. Questioned on how he found it to use he reiterated that he
barely used it: "I have barely used it because you can't get a lot out of it anyway from
the access that they give you." He felt that an average person would not be interested
in what the web had to offer: "you get bored after about 10 minutes, you know,
bringing it up and reading the television screen . . . the average person isn't going to
be interested scanning through lots of pages of crap basically - that's what it is you
know. I want to see something moving in front of me and something happening.
That kind of thing."

He was not attracted at all by shopping on-line: "I think if I - all right take me for
instance - I don't shop anyway. My wife does all the shopping - very rarely do I go
down the shop . . . she goes shopping for clothes. I would doubt - she's had
catalogues here which is basically the same sort of thing isn't it - I would doubt very,
very much that she would throw away actually going down into town to have a look
and try the gowns and whatever it is for like looking at it on a screen or whatever."

He saw that having programme-on-demand would have a major influence on the way
in which they watched television: "I would say it would have a big influence on us if
that was the case. Well then we would choose movies from them. We would choose
movies and soaps and documentaries and educational stuff from them then . . .I
mean I am doing, as I say, an Open University degree and they have got some small
bits of Open University on it and that would be nice if I could call up bits from the
Open University to watch - that would be absolutely marvellous you know. To sit
down and have a catalogue of what you can actually see." The man was very
attracted to being able to access new archives; his wife was less excited by the
proposition, but was enticed when it was suggested that she could "dial a topic I
suppose," such as gardening.

He felt that interactive adverts would be " . . . fashionable. It will be very
fashionable. I would expect on-line to, when they market, when they sort of market
their programmes, or when they sell their programmes, the advertising would come
with it and it would be up to the advertisers to make their part of it interesting." His
wife said she thought it was to be paid for by subscription, she was surprised that
there would be adverts on top of that. The father felt that he would expect adverts
above and beyond subscription; "you would have situations where there would be
certain programmes where they would have to make a charge, depends on how good
they are. You know there might be some really good plays for instance that had been
on, that kind of thing."

The father had some views on on-line questionnaires: "If I was watching it, I like the
idea, I think you have got to, people don't like to hang about too much do they, they
will sit and watch something and then sort of go away from it. It might be an idea to
have maybe one or two questions, not go into any great detail, but I mean you people
are supposed to know how to work all this and that."

The man had a Nat West account but felt that: "Not really I don't think it really
interacts with television properly. The service which is, I found, most useful to me
was being in the home banking but I heard today that they were discontinuing that."

Asked what triggers him to use the systems he replied that "Just sense of duty I think
to earn my cable television."

CASE STUDY 10 - F071

The household was a family home with a mother, a father and a young daughter. The
mother had taken up media studies the year before. The interviewee was the mother.
Television viewing was usually in the evenings, although she watched the news at
lunchtimes. They had a video which was broken at the time. They had not been
encouraged to fix it due to receiving the movies option as part of the trial: "we have
no need to get the videos out but yes we would do really which is why I would be
really keen for the video on demand to actually be up and working because I think I
would definitely use that." The main channel viewed was SKY1. The woman preferred
American dramas, while the father preferred action films. She reckoned that they watched
a couple of films a week.

Television was used for relaxation and "to keep the kids quiet." One of the major
pitfalls however was that "the good programmes tend to be on at a time when I can't
watch them . . . I really don't sit down till about 9 o'clock. Some of the situation
comedies are on like half seven, eight o'clock which I probably enjoy more. I think
they should definitely go ahead and move the 10 o'clock News." She said that she did
not "get a chance to sit down and read," and preferred to sit and watch the news. Her
partner related that she gets tired if she sits down with a book.

Her original motivation to participate in the trial was to get the free movie channel,
however they were also interested in finding out more about it "as well." Her
participation in a media studies course also prompted her interest. One of her essays
had been on new technologies and TVs. Her anticipation of the system, compared with
actual use, was that they had expected to do more with it: "That's what we were led to believe
anyway . . . I was expecting a video." "If the home banking was up and running I
would definitely use that." She banked with Barclays. Nevertheless, since she ran a
business from home "it would be a good thing to have that (banking services)
through the television . . . I'm a fitness instructor so I mean it's not a vast amount of
money I'm dealing with but it's enough to bank once a week and check-up on - check
balances and things."

She felt that shopping would be an asset to her lifestyle: "I mean I have spoken to
friends about it and think a lot of people's initial reaction is 'I really enjoy going
round the shops and looking at it all but for the once a week chore shopping, you
know the things that you have to get every week, to take that out of it - yes.'"

In respect to education she felt that response and speed was important: "I've got a PC
upstairs that's far quicker - yet when it gets on to that level when you really have got
the combination of the computer and a television together then that would be good
but at the moment the kids will have their files of the stories that they have got on
there but once you have done it a couple of times . . ."

They had been intrigued when the system went on-line, "looked at everything." They
had only accessed the local news in the first sessions, as well as fishing programmes:
"it's like watching paint dry isn't it?" The radio was only used once as well. In respect
to the usability of the system: "No it was really user friendly. The length of time perhaps
that you have to wait being used to computers that are a lot quicker." She thought
that it took as long as teletext to access a page.

They had not "really used" the interactive system. "I can see the potential but as it
stands at the moment, the trial, we've probably used as much as we are ever going to
use it because it hasn't really changed that much. The refresh of programming and
content was a major issue to whether they thought the novelty of the system would
wear off "I mean on how often they change the games on there. I mean obviously if
they kept the same games - yes they would get fed up with it but I mean if they
changed them.

CASE STUDY 11 - F028


The final household was NOP11. This was a couple. The husband watched mainly
sports and news. He was also into drama. They were a two-set home, the main set in the
living room and another in the bedroom. There was no change in their viewing
preferences, bar that of accessing more movies with the option that came with the i-
Tv system. They would not have taken it independently as it was too much money.
They watched SKY Movies "perhaps twice a week."

They tended to watch late at night: "mostly what we watch on the Movie Channel is
late night anyway, after 10 o'clock." The man had "quite a collection" of videos which
he was keeping: "Most of those were westerns actually which I collected for my
own, you know." He did not watch a lot of i-Tv "because I watched the programmes I
was interested in and that is, to be quite blunt with you, I am not very happy with it
over that . . .There are 6 fishing programmes on there - John Wilson, Go Fishing and
the same programmes have been on . . . You can't keep watching things over and
over and over again."

The main motivation for joining the trial was that "they sent letters round asking they
wanted people in certain areas if they would do it and well they wanted people who
have got children or grandchildren etc., so that's why we did it. We thought it would
be something to try you know." His wife said it "sounded interesting . . .it said that if
you want to watch a certain film which had been on and you had missed it you could
go back and watch it - no . . . Yesterday's news is on but for Godsake who wants to
watch yesterday's news when you have seen it. Radio programmes - if I wanted to
listen to radio programmes I should listen to it when it was on. I mean that's how I
look at it." In respect to the banking he had never successfully accessed that due to
not receiving a pin number. And the children's programmes "I think they are little bit
- one or two of those are a little bit complicated for the children because you couldn't
leave them to watch it."

The man stated he found it awkward to work, but when prompted by the interviewer
whether it was easy or awkward he replied: "No it's not too bad. It's quite simple
really. Just to watch basic programmes, change from one programme to another it's
quite easy." He now only accessed the system to see if they had changed the fishing
programmes. In his opinion they should refresh programmes "at least once a
week . . . . Because it gives everybody then a chance to see those who wouldn't have
the same leisure time you know. I think it's a good thing for education. I think if it
got into schools for education of the school that sort of thing. Basically at the
moment, a person like myself, it's no use to me really whatsoever unless they are
going to change the programmes."

The idea of on-line catalogue-style shopping was not of interest to the man, and he did not
view the weekly supermarket shop as being of use to his wife: "she definitely
wouldn't buy weekly stuff out of it. I mean you get QVC on that, you know, but I
occasionally watch that you know if there is nothing else on or if I see something on
that catches my eye but I wouldn't ever buy anything from it." He felt there was not
enough coverage of local matters: "there is a lot goes on in Cambridge that people
even in Cambridge don't know about. There is a lot goes on in your locality that
people in your locality don't know about. I mean I know this for facts in my country
and western dances at the Arbury Community Centre there are people living 100
yards away didn't know they were on . . .Everybody likes learning about their own
locality better than they do anything else so you know. What's going on in the next-
door-neighbours is much more interesting than what's going on in Arabia or
somewhere like that I can assure. It's true - it is true that." He felt that an archive-
style news facility would be of use especially for children "I think that could be OK
because I mean if anybody was, specially for children, that would be a very good
thing for kids at schools."

On the subject of advertising, he felt that a combination of subscription
and advertisements would pay for the service. He was attracted by targeted adverts,
but not without reservation: "the problem is the car and if you went off fishing you
would be getting the same old stuff all the while wouldn't you. No I think you would
have to have what they throw at you."

He felt that the system had a future; "I don't think it will be a damp squib . . . They
have got to alter it a little bit but quite a lot of it actually. There is some of it, I think
I can definitely see it being part of the future, definitely."

He was in favour of on-line questionnaires: "Oh yeah that would be easy. Good idea
actually - I think they would get more feedback that way than any way. It would be
instant wouldn't it. You would get one or two that would. The odd ladies who have
got nothing else to do but, you know bored with housework . . . having a quiz survey
about what you like about this breakfast food etc. and what you... I don't think people
would do them unless it's a bored housewife who has got nothing else to do."
However, when prompted as to what kind of reward he would consider reasonable for
filling in a 5 minute questionnaire, he replied that he would probably not do it at all.

Appendix 2. Two Trials

In the sparse literature on the subject of i-Tv, two trials are widely cited: the
Warner-Amex QUBE trial conducted in the late 1970s and early 1980s, and the
Time-Warner Full Service Network (FSN) trial conducted over the period 1993-1997
(i.e. DeFleur and Ball-Rokeach, 1989; Neuman, 1991; and Carey, 1994).471 Both came
to show failure in pre-empting a variety of factors - human, ergonomic and content-related, as
well as technology issues. However, at their onset there was a distinct belief, shown
by the faith evident in the significant investment each drew, that they were indeed
a proper representation of the technical capabilities and trends of the time,
that their technologies and services would appeal to latent consumer demand, and that
each represented a route to creating viable commercial propositions.

471 The FSN trial ran almost concurrently with the Cambridge i-Tv Trial outlined in this book.

QUBE

The Warner Communications and Amex Cable QUBE system was launched in
Columbus, Ohio, on 1st December 1977. The QUBE system offered subscribers pay-
per-view movies and the ability to take part in on-line polls and feedback. However,
QUBE as a commercial proposition failed. No more than 20 per cent of subscribers
used the interactive programmes, and the trial folded with reported losses of $30
million (DeFleur and Ball-Rokeach, 1989). It was found that those users who did
interact, would only do so for a few weeks and then return to more (traditional)
passive modes of viewing. One of the reasons which has been cited for this was that:

"The behavior of the average television viewer is largely culturally enforced.


It should not be surprising if most households react hesitantly to two-way
television. The preceding 40 years of experience have engendered a different
set of expectations." (Neuman: p.113)

In other words, the suggestion is that the QUBE trialists found it difficult to react due
to their socialisation into using passive forms and formats of media. It was suggested
that QUBE was so expensive that Warner kept the true cost secret even within its
executive ranks. Carey (1994) cites the QUBE terminal in homes costing
approximately $200, or four times the cost of standard cable decoders at that time.
QUBE equipment at the cable head end added approximately $2 to $3 million in
plant costs:
"In addition, it was both expensive and difficult to maintain the upstream or
return data path from homes, which introduced reliability problems for the
interactive service. Production costs and interactive program design presented
further obstacles. Compared to broadcast network programming, budgets for
QUBE programs were very low. "Interactivity" with low production values
could not compete with network programming. Moreover, because they had
little previous experience in designing interactive programs, QUBE producers
were starting from scratch." (Carey, ibid)

Carey is suggesting that the economics of interactivity made for a lower quality of
experience compared with traditional broadcast programming. Also of significant
note was that Warner-Amex did not know how interactive television would work,
and hired an executive to "dream up a variety of new applications." (Neuman, 1991:
p.111) Neuman also notes that interactive programming did not generate enough
revenue to support its locally produced two-way information. Nevertheless it did
perform as a marketing tool distinguishing its service from other cable operators
bidding at the time for big-city franchises. Also, QUBE whetted the interest of
American Express Co., which bought 50 percent of the Warner cable business in
American Express was eager to tap the potential of electronic commerce, like
banking and shopping, from the home. Warner-Amex halted QUBE in 1984 in large
part because of the losses its parent company, Warner Communications, incurred
with its Atari subsidiary, when video game sales fell well below forecasts.

Carey notes that QUBE demonstrated that, if the cost of promoting and processing
pay-per-view orders were reduced, then pay-per-view programming was potentially
viable. It also introduced a number of interactive formats that have since evolved and
been adopted as components in cable and broadcast programming. Its interactive
programming included American football play-off games; on-line polling; shopping
and interactive video games. In fact, both MTV and Nickelodeon trace their roots to
QUBE's experiments.
" . . . interactive media must be developed in a viable economic and technical
context. Even with these elements in place, producers must learn to create
with the new medium, and audiences should not be expected to change their
media habits overnight." (Carey, ibid.)

QUBE was followed in the early 1980s by two high-profile videotext projects,
Viewtron from Knight-Ridder and Gateway from Times-Mirror Co. Although all
three are generally referred to as "failures," each led to high-profile services that are
now extant in traditional and new media. And they were the precursors of the on-line
services Prodigy, CompuServe and America Online, which in turn preceded the
Internet boom.

Also, with the emergence of the eighties video-game-playing generation, attitudes and
behavioural responses to interactivity may be quite different. The video-game-playing
generation are the first people en masse to interact with information held
within the rectangular domain of the screen. It may be assumed that they differ from
their older counterparts in that they have already been socialised into interaction.

Full Service Network

In a giant ballroom of the Sheraton Orlando North Hotel on December 14, 1994,
Warner (this time as Time-Warner) launched a further attempt at creating interactive
television - the Full Service Network. Its stated objectives were to fine tune the
technology for a high-speed digital system; to learn about consumer preferences; and
to provide a platform for consumer development and for testing products. The FSN
was to exploit recent developments in digital technology to offer a range of services
'on-demand', including the delivery of video. The trial was originally intended to last
18 months, and its target populations were 4,000 subscribers (1994), 500,000
subscribers (by 1996) and 750,000 subscribers (by 1998). It ended with the firm
claiming success but being, as it had been about QUBE, extremely conservative in
publishing its findings.472 The total cost of the trial to the company is estimated by
some at up to $100m (which overshadows Acorn Online Media's start-up costs of
£1.84m and Acorn's cited intention to invest £13m in the trial over the period 1994 to
1997).

472 Information regarding the performance of QUBE only 'leaked' out when former employees
informed on the trial's success (Neuman, 1991).

Six months later, in June 1995, the FSN was in fewer than 36 homes. Time-Warner
then took it upon itself to accelerate installation towards the end of 1995, deploying
large installation crews, with dozens of marketing representatives and FSN executives
pitching in as customer "trainers", before the company announced the completion of
4,000 installations.

It came to be found, however, that customers' willingness to pay for on-demand
movies was way below a level that would make the construction of a broadband
network anything like a sound business proposition.473 The FSN existed within a market
dominated by a range of films available on the existing cable and satellite channels.
There was also the imminent promise of even more channels to come with digital
TV, as well as the cheap and easy availability of video rental. In addition, the
FSN needed to equip subscribers' homes with a $5,000 Silicon Graphics computer as
an STB - which had to be given away free to subscribers along with other peripheral
devices. The trial ended in 1997 with the widespread suggestion that the key problem with
digital interactive services such as video-on-demand was that, like their analogue
predecessors, they lacked economic viability.

473 ACTS BULLETIN http://www.uk.infowin.org/ACTS/ANALYSYS/GENERAL/ACTSBULL/may-97.htm#time.

Appendix 3. Sys-logging

