
Applied Ontology 12 (2017) 5–32 5
DOI 10.3233/AO-170175
IOS Press

Ontological distinctions between hardware and software
William D. Duncan a,b
a Roswell Park Cancer Institute, Buffalo, NY 14203, USA
b National Center for Ontological Research, University at Buffalo, Buffalo, NY 14214, USA
E-mail: wdduncan@gmail.com

Abstract. There is a wide range of positions regarding the ontological nature of computer hardware and software. Moor [The
British Journal for the Philosophy of Science 29 (1978), 213–222] argues that there is no significant ontological distinction
between the two; Suber [The Journal of Speculative Philosophy 2 (1988), 89–119] argues that computer hardware is a kind of
software; Colburn [The Monist (1999), 3–19] defines software as a special kind of entity he calls “concrete abstraction”, and
Turner [Minds and Machines 21 (2011), 135–152] classifies software as a specification. In this paper, I examine the positions
of each philosopher, and based on this examination, define ontological categories that account for computer hardware and
software. As a result, clear distinctions emerge between computer hardware and software: A software program is a specification
that consists of one or more programming language instructions and whose concretization is embodied by an artifact that is
designed so that a physical machine may read the concretized instructions, whereas hardware is an artifact whose functions are
realized in processes that directly or indirectly bring about the result of some calculation.
Keywords: Ontology, philosophy of computer science, computer hardware, software
Accepted by: Nicola Guarino

1. Introduction

The purpose of this paper is to define ontological categories that account for computer hardware1 and
software. However, in order to define these categories, we must first understand what differences, if any,
exist between hardware and software. We take hardware and software to be distinct kinds of things, but
giving precise criteria for how to distinguish them is more difficult than you might think. Consider the
following four naïve criteria (of my own making) for distinguishing hardware and software:
1. Software is easily modifiable, whereas hardware is not.
2. Software is portable, whereas hardware is not.
3. Hardware executes the instructions contained in software.
4. Software consists of sets (or sequences) of instructions that are executed by a computer.
Each criterion has problems. Criterion 1 does not account for the fact that with the proper equipment
and expertise, hardware is modifiable. Moreover, the further qualification that software is intended to
be modified (whereas hardware is not) does not salvage criterion 1, since many commercial software
products are not intended to be modified. As presently stated, criterion 2 also does not work, for many of
the physical components in a computer system can be removed and placed in another computer system.
1 Hereafter, I will often simply use the term “hardware” instead of “computer hardware”.

1570-5838/17/$35.00 © 2017 – IOS Press and the authors. All rights reserved

Criteria 3 and 4 seem more promising. But, criterion 3 fails because the purpose of some software
programs, such as the Java Virtual Machine (JVM), is to execute other software programs. And criterion 4 fails because
many hardware systems have instructions hard-coded in them. Thus, hardware, like software, can consist of sets (or sequences) of instructions.
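To make the failure of criterion 3 concrete, consider a toy interpreter: a software program whose job, like the JVM's, is to execute other programs. The sketch below is purely illustrative; the instruction set and example program are invented here, not drawn from any source.

```python
# A toy stack-machine interpreter: software that executes other software.
# The opcodes ("PUSH", "ADD", "MUL") are invented for illustration.

def run(program):
    """Execute a list of (opcode, argument) pairs on a stack machine."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack[-1]

# The program below computes (2 + 3) * 4; note that it is handed to
# run() as ordinary data.
prog = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(prog))  # → 20
```

Here one piece of software (run) does the executing, and another (prog) is executed, so "executes instructions" cannot by itself mark the hardware side of the divide.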

1.1. Further criteria for distinguishing hardware from software

The shortcomings of criteria 1–4 prompted me to consult a number of authoritative sources, such as
computer science textbooks and encyclopedias. A full listing of my findings is given in the Appendix.
Since many of them are similar to 1–4, it is not necessary to discuss all of my findings in detail. How-
ever, I did find a few discussions of hardware and software (listed in 5–12 below) that require further
consideration:
5. Software systems are abstract and intangible. They are not constrained by the properties of material,
governed by physical laws, or by manufacturing processes (Sommerville, 2011, p. 4).

6. The hardware of a computer consists of all the electronic components and electromechanical de-
vices that comprise the physical entity of the device (Mano, 1993, p. 2).
7. Hardware requires manufacture; has physical models to use in evaluating design decisions; is com-
posed of parts that wear out; and, because of its physical limitations, has a practical bound on
complexity (Blum, 1992, pp. 28–29).
8. Software is an integration of the program code with design and supporting documentations and
intermediate work products, such as original requirements, system specification, system design
and decision rationales, code configuration, test cases and results, maintenance mechanisms, and
user manuals (Wang, 2008, p. 15).
9. Software is an intellectual artifact that provides a solution for a repeatable computing application,
which enables existing tasks to be done easier, faster, and smarter, or which provides innovative
applications for the industries and everyday life (Wang, 2008, p. 15).
10. Software is a specific solution for computing in order to implement a certain architecture and
obtain a set of expected behaviors on a universal computer platform for a required application
(Wang, 2008, p. 17).
11. Hardware is used to refer to the specifics of a machine, including the detailed logic design and the
packaging technology of the machine (Hennessy & Patterson, 2003, p. 9).
12. Hardware and software are logically equivalent (Tanenbaum, 1990, p. 11).
Although some of 5–11 do not explicitly mention a criterion for distinguishing hardware from soft-
ware, it may reasonably be inferred that the proposed criterion does provide a means for distinguishing
the two. For instance, criterion 5 states that software systems are intangible. Since we normally associate
hardware with concrete objects, the intangible nature of software is a plausible means of distinguishing
it from hardware. At a minimum, 5–11 present criteria that may be taken to be an essential property of
either hardware or software, and this raises the question of whether this property is an adequate distin-
guishing criterion. Let us examine each criterion.
In criterion 5, software is defined as being abstract and intangible, and (presumably) hardware is not.
This characterization of software, however, raises some puzzling issues. First, there is an issue regarding
the ontological status of those physical objects that we sometimes refer to as software. If software is non-
physical, then how can physical objects such as compact disks, floppy disks, and punch cards also be
regarded as software? Second, there is the issue of determining how non-physical software can interact
with physical hardware. A plausible response to these issues is to hold that, although software is non-
physical in nature,2 the instructions of software are contained (or encoded) in some physical medium.
This response, though, relies on the notion of software being composed of instructions, and, as noted in
criteria 3 and 4 above, this is not an adequate distinguishing criterion.
Next, let us consider the physical aspects of hardware discussed in criteria 6 and 7. Criterion 6 identi-
fies hardware as being the physical components of a computing system, but does that imply that software
is non-physical? If the answer to this question is ‘yes’, we run into the problems discussed in criterion 5
above. And if the answer is ‘no’, we again encounter the difficulties in criteria 3 and 4. Thus, criterion 6
is not suitable for distinguishing hardware from software.
Criterion 7 is interesting in that it picks out properties of hardware that are the result of its being
physical in nature. But do these physical properties necessarily belong only to hardware? Software goes
through a manufacturing process when it is distributed in some physical medium (e.g., a DVD), and
such physical media degrade over time. Furthermore, most software is intended to run on some physical
machine, and this places practical limits on software programs. Although it is possible to design software
that cannot be executed on any existing physical machine, this may turn out to be of little practical value,
since we will not be able to empirically implement the software. This situation is similar in many respects
to being able to design hardware that is impossible (given our current level of technology) to build. There
may be theoretical value in engaging such activities, but, in both the hardware and software cases, there
are physical limits on what can be implemented.
In criterion 8, we see the idea that software is program code plus a number of other things such as
documentation, system specification, user manuals, etc. I have already discussed why computer pro-
grams (understood as sequences of instructions) can also be part of hardware. So, this leaves us with the
question of whether the documentation, system specification, and the like differentiate software from
hardware. However, I see no reason why this would be the case, since hardware can also have a number
of other supporting documents associated with it.
Criteria 9 and 10 express the idea that software is capable of running on multiple machines. But, given
that instructions can also be in hardware (see criteria 3 and 4 above), a more detailed explanation is
needed in order for criteria 9 and 10 to distinguish software from hardware.


According to criterion 11, hardware refers to the technologies, such as the logical design and pack-
aging technology,3 that are used to construct a machine (or class of machines). However, this is not
adequate for distinguishing hardware from software. As already noted, a software instruction may be
hard-coded into a machine, thus making it subject to some of the same technologies as hardware. For
example, software can be distributed on flash drives, which are composed of integrated circuits.
Finally, let us consider criterion 12. By stating that “hardware and software are logically equivalent”,
Tanenbaum means that “[a]ny operation performed by software can also be built directly into the hard-
ware and any instruction executed by the hardware can also be simulated by the software” (Tanenbaum,
1990, p. 11). In other words, any instruction that can be coded as a programming language expression
can in principle be implemented as a physical device, and any physical device that carries out a calculation
can in principle be implemented in a software program whose code specifies instructions for carrying
out the operations of the physical device. I agree with Tanenbaum on this point, and many of my
criticisms of the previous criteria 5–11 have reflected this. Tanenbaum, however, goes on later to make the
following stronger claim:
the boundary between hardware and software is arbitrary and constantly changing. Today’s software
is tomorrow’s hardware, and vice versa (p. 12).4
This implies that there really is no distinction between hardware and software, and one of the main
points of this paper is that this view is mistaken. Thus, while I agree with Tanenbaum about the logical
equivalence of hardware and software, I disagree that this should be taken to mean that there is no
distinction between hardware and software. The details of my position are explained below.

2 It is important to note that there are a number of different senses of what it means for something to be intangible or abstract.
For some examples of this as it relates to software, see Eden and Turner (2007).
3 The packaging technology refers to ways in which the integrated circuits and other components are connected together
(Tummala, 2001, p. 12).
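Tanenbaum's equivalence claim can be given a small illustration of my own: binary addition, an operation ordinarily wired into hardware as logic gates, simulated entirely in software. The gate wiring below follows a standard ripple-carry adder; the function names are invented for this sketch.

```python
# A hardware operation simulated in software: binary addition built from
# the same XOR/AND/OR gates a circuit schematic would use.

def full_adder(a, b, carry_in):
    """One-bit full adder, wired exactly as the standard gate diagram."""
    s = a ^ b ^ carry_in                         # sum bit (two XOR gates)
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit (AND/OR gates)
    return s, carry_out

def add_8bit(x, y):
    """Add two 8-bit integers bit by bit, as a ripple-carry circuit does."""
    result, carry = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_8bit(100, 27))  # → 127
```

The same calculation could equally be fixed in silicon; the logical equivalence Tanenbaum describes is exactly this interchangeability of implementation.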

1.2. Philosophical discussions about hardware and software

It should by now be clear that distinguishing between hardware and software is not a straightfor-
ward task. Naïve criteria, such as criteria 1–4, are easily subject to counterexamples, and more complex
criteria, such as criteria 5–12, are often incomplete. These difficulties raise an interesting philosophical
question: Is there a real distinction between hardware and software? Some philosophers have argued that
there is not a distinction. Moor (1978) argues that the distinction between hardware and software should
not be given much ontological significance. Suber (1988) argues that hardware, in fact, is software. Other
philosophers have focused on the role that abstraction and abstract entities play in the understanding and
creation of computational artifacts. Colburn (1999) holds that software has both a concrete and an ab-
stract nature, and, thus, is best understood as being what he calls a “concrete abstraction”. Turner (2011)
argues that it is the abstract nature of specifications that are crucial in determining whether something is
hardware or software.
For my part, I find Moor’s and Suber’s positions implausible. As will be discussed below, their arguments
do not pay close attention to important metaphysical distinctions, and because of this, they draw
implausible conclusions. Colburn’s and Turner’s positions advance our understanding of computational
artifacts, but in order to rigorously apply them to hardware and software, an account needs to be pro-
vided of what kinds of things concrete abstractions and specifications are. In other words, we need an
ontological theory that (1) shows the deficiencies of Moor’s and Suber’s accounts, and (2) adds clarity
to the insights of Colburn and Turner. In this paper, I provide such a theory, and as a result of this, clear
distinctions emerge between hardware and software: A software program is a specification that consists
of one or more programming language instructions and whose concretization is embodied by an artifact
that is designed so that a physical machine may read the concretized instructions, whereas hardware is
an artifact whose functions are realized in processes that directly or indirectly bring about the result of
some calculation.
Before continuing, it is important to note that the amount of philosophical inquiry devoted to the
natures of hardware and software per se has been limited. Often, discussion of this topic occurs within
the debate of whether the brain is a computer. Thus, while there is a wealth of philosophical literature
concerning the computational nature (or lack thereof) of the brain, little, by comparison, has been written
that directly addresses the natures of hardware and software.

4 On page 11 of the same textbook, Tanenbaum contrasts hardware as being tangible and software as being intangible. This
seems contrary to the quoted passage.

2. Philosophical theses about hardware and software

2.1. The hardware-software distinction is pragmatic

In his “Three Myths of Computer Science”, Moor (1978) argues that the distinction between hardware
and software should not be given much ontological significance. Instead, he holds that we should take a
“pragmatic view of the software/hardware distinction” (p. 215).5
Moor’s argument is based on two assertions. The first is that a “computer program is a set of instruc-
tions which a computer can follow (or at least there is an acknowledged effective procedure for putting
them into a form which the computer can follow) to perform an activity” (p. 214). The second is that
computer programs are to be understood on two different levels: the symbolic level and the physical level
(p. 213). The symbolic level consists of the symbols used to represent some set of computer instructions.
The physical level consists of the various media on which computer instructions are physically stored
(such as floppy disks, CDs, magnetic tape, and so on). For example, suppose I write a computer program
that sorts a list of integers. This program will consist both of the symbols that represent the various
actions necessary to sort the list, and the medium in which the symbols are stored (or inscribed). It is
important to bear in mind that whether a computer can read my program is not an issue for Moor in
determining whether it is a computer program. I can scribble my program on a napkin or carve it in a
tree. As long as my program is written using symbols that can (in principle) be read and executed by a
computer, it is a computer program.
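Moor's two levels can be pictured with a short sketch of my own: the same program appears as a symbolic pattern (the source text) and as a physical-level stand-in (the bytes a disk or tape would actually hold). The program and names below are invented purely for illustration.

```python
# One computer program at Moor's two levels: symbolic and physical.

source = "def sort_ints(xs): return sorted(xs)"   # symbolic level: symbols

encoded = source.encode("utf-8")                  # a physical-level encoding
print(encoded[:3])   # → b'def'

# The symbolic level can be recovered from the encoding and executed.
namespace = {}
exec(encoded.decode("utf-8"), namespace)
print(namespace["sort_ints"]([3, 1, 2]))  # → [1, 2, 3]
```

The bytes here merely stand in for magnetic oxide or pits and lands; the point is that the one program has both a symbolic representation and a physical realization.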
When we distinguish between hardware and software, though, we often overlook either the physical
level or symbolic level of computer programs. For instance, in a computer system, software is often
thought of as the part of the system that contains the computer’s programs, whereas hardware is of-
ten “characterized as the physical units making up a computer system” (Chandor et al., 1970, p. 179)
(p. 215). In other words, software is often associated with the symbolic level that represents the instruc-
tions of a computer program, and hardware is often associated with the physical components necessary
to execute a computer program. This association of software with the symbolic level overlooks the physi-
cal level of computer programs, for it ignores two important facts about computer programs. First, many
early computers were programmed by throwing switches or setting wires. The computer programs of
these early computers were part of the hardware, not separate. Second, modern digital computers usu-
ally store computer programs internally, and these stored computer programs are part of the physical
structure of the hardware (p. 215).


The symbolic level of computer programs, on the other hand, is overlooked when we consider only the
physical level of computer programs. At the physical level, software is taken to be the part of a computer
system we can change. However, this distinction between hardware and software cannot be consistently
maintained. Depending on the context, the part of a computer system that a person can modify may
vary. For example, a person with expertise in circuit design may be able to modify a computer system’s
motherboard, a component we normally consider to be hardware.
Given that computer programs are best understood as having both symbolic and physical levels, Moor
advocates that we view the distinction between software and hardware as ‘pragmatic’:
since programming can occur on many levels, it is useful to understand the software/hardware di-
chotomy as a pragmatic distinction. For a given person and computer system the software will be
those programs which can be run on the computer system and which contain instructions the person
can change, and the hardware will be that part of the computer system which is not software. At one
extreme if at the factory a person who replaces circuits in the computer understands the activity as
giving instructions, then for him a considerable portion of the computer may be software. For the
systems programmer who programs the computer in machine language much of the circuitry will be
hardware. For the average user who programs in an applications language, such as Fortran, Basic,
or Algol, the machine language programs become hardware. For the person running an applications
program an even larger portion of the computer is hardware (p. 215).

5 Unless otherwise indicated, all references in this section are to Moor (1978).
In stating that the software/hardware distinction should be understood as pragmatic, Moor is focusing
on the practical activity of computer programming. The ways in which people perform this activity and
the levels at which they perform it are not uniform. According to him, what is considered a computer
instruction and which instructions can be modified is dependent upon the person doing the programming
and the programming being performed. Hence, we cannot clearly distinguish between hardware and soft-
ware, for what counts as hardware for one person may be considered software for another (p. 216). Thus,
Moor concludes that we should not read too much into the software/hardware distinction. Rather, it is
better to understand the hardware/software distinction as a pragmatic matter, for, “[t]his pragmatic view
of the hardware/software distinction makes the distinction both understandable and useful” (p. 215).
2.1.1. Critique of the pragmatic hardware-software distinction
Moor’s argument consists of an analysis of computer programs at both the physical and symbolic
level. My discussion of him will proceed in the same manner. First, let us consider the physical level.
Moor’s observation that, on the physical level, the ability to modify a computer program is dependent
on the computer programmer and the programming activity is insightful. In certain contexts, the ability
to modify something depends on the person who performs the activity. Consider two homeowners. One
is skilled in the art of home construction. The second is not. For the first homeowner, certain properties
of the house may be modifiable, while for the second these properties may be fixed. For example, the
first homeowner may be able to build an addition on to the house. In this particular context, then, the
distinction between what is and what is not modifiable depends on the homeowner who is performing
the activity.
Moor, therefore, is correct that for a given computer program some will be able to modify it and others
will not. However, this does not necessitate that what counts on the physical level as hardware or software
is solely dependent on whether a person can modify it. For example, a computer program can be burned
to a non-rewritable DVD. My inability to modify this particular computer program (without damaging the
DVD) does not necessitate that it is hardware. I can still perform a number of other activities associated
with software, such as installing it on multiple computer systems. In other words, part of the reason why
the computer program on the DVD is software is because it (the same computer program) can be copied
to multiple computer systems. The inability to modify (a particular copy) it does not necessitate that the
computer program is hardware.
Next, let us consider the symbolic level. Recall, Moor asserts that at the symbolic level of a computer
program what counts as a computer instruction is also dependent upon the practical activity of computer
programming. Thus, depending on the person involved in the activity, a computer program that is con-
sidered software by one person may be considered hardware by another. Again, Moor’s reasoning does
not hold in general. Houses, for example, also have a symbolic level of understanding in the form of
building plans. Building plans may include sections that, like computer programs, consist of instructions
that specify different activities for each person involved in the construction of a house. For example, one
section of a building plan will instruct the electricians how to install the electrical system, and another
section of the building plan will provide instructions to the brick masons on how to build the foundation.
However, this does not mean that the distinction between electrical systems and foundations is based
upon the practical activities of the electrician and brick mason. Similarly, although what counts as an
instruction will be different for the factory worker installing circuit boards and the systems programmer,
this may not necessitate that the distinction between a circuit board and some block of computer code is
based solely on what counts as an instruction for each person.
There are a number of responses to my counterexamples. First, it may be argued that I have been too
broad in my examples of house construction. However, I reply that the same criticism can be applied
to Moor’s examples of computer programming. He, after all, includes both the activities of replacing
circuits and typing lines of code under the umbrella of computer programming. Thus, Moor’s description
of computer programming suffers from the same deficiency as my example of house construction. In
order to enquire further, a more detailed description of computer programming is needed, and Moor
does not provide one.
Second, one can respond that my counterexamples are wrong. By comparing house construction to
computer programming, I am comparing apples to oranges. I agree that most, if not all, analogies break
down at a certain point. However, Moor has provided a very general description of the activity of com-
puter programming and the relation of this activity to the creation of hardware and software. Likewise,
I have provided a very general description of the activity of house construction. If my analogy breaks
down, it does so in the details. But, again, Moor has not provided adequate details in order to determine
where my analogy breaks down.
There are, of course, other responses, but it is not the purpose of this paper to address them all.
Rather, I suggest that the problem with Moor’s position is that in order to accept it, one has to work
out how to fit it into a general account of reality. Moor’s position is motivated by his consideration of
three main kinds of things: the symbolic level of computer programs, the physical level of computer
programs, and the activities related to each level. Within this limited ontology, he is not able to find any
significant ontological differences between software and hardware. This is an implausible result, and in
my counterexamples, I have argued that when Moor’s reasoning is applied to other aspects of reality it
does not necessitate the same conclusions.

2.2. Hardware is software


AU

In his “What is Software?”, Suber (1988) argues that hardware is software. His argument is summa-
rized as follows:
(P1) Software is a pattern that can be read and executed.
(P2) A pattern can be read and executed if it can in principle satisfy the physical and grammatical
conditions of readability and the requirement of executability.
(P3) All patterns can satisfy the physical and grammatical conditions of readability and the require-
ment of executability.
(P4) All concrete objects display patterns.
(C1) All concrete objects can satisfy the physical and grammatical conditions of readability and the
requirement of executability.
(C2) Therefore, all concrete objects are software.
(P5) Hardware is a concrete object.
(C3) Therefore, hardware is software.
Suber’s first premise (P1) is based on his assertion that software, at its most basic level, must be
represented as a pattern. In this assertion, it is important to understand that Suber is using pattern “in
a broad sense to signify any definite structure, not in a narrow sense that requires some recurrence,
regularity, or symmetry” (p. 91).6 Thus, whether in the form of magnetic oxide on a disk, pits and lands
on a DVD, or some other form, it is the pattern that represents the instructions contained in software.
This characterization of software, however, is not adequate for at least three reasons. First, it does not
specify whether software must have a material expression; second, it does not distinguish software from
noise; third, it does not distinguish software from data (p. 93). To address these deficiencies, Suber adds
(P2): the requirement that software “must in principle be capable of meeting the physical and grammat-
ical conditions of readability and the requirement of executability” (p. 101). In adding this requirement,
the first concern (whether software must have a material form) is addressed since the physical condition
of readability requires that the pattern must be in a form that a machine can read.
The second concern (how to distinguish software from noise) is addressed by the grammatical con-
dition of readability and the requirement of executability. Together, these conditions specify that there
must be “certain syntactic structures within the pattern” (p. 98) that can act as instructions to the machine.
This would seem to exclude patterns that have no discernable meaning. However, Suber is quick
to point out that although a pattern may not, at present, have a meaningful interpretation, “[a] very clever
person working backwards from an arbitrary series of bits could create language conventions that would
make the string a meaningful program that did something interesting” (p. 93). A pattern is “noisy” rel-
ative to some set of language conventions that the pattern may or may not fit, and since it is always
possible to give a pattern a meaningful interpretation, no pattern is noise from all perspectives (p. 94).
All noise is, thus, capable of becoming software. Suber dubs this conclusion the “Noiseless Principle”
(p. 94).
Lastly, the question still remains of how to distinguish software from data. Software gives instructions
to a machine, but, in some circumstances, the software itself can be treated as data. Depending on the
circumstance, the same pattern may be software, data, or both. For example, a compiler treats another
computer program as data, and the programming language LISP allows for a computer program to treat
itself, or a copy of itself, as data (pp. 107–108). The determination, then, of whether a pattern is software
or data is not due to some intrinsic quality of the pattern. Rather, it is determined by a pattern’s role. In
some cases, the role of a pattern may be as input to software. In other cases, the role of a pattern may be
as software (i.e., as instructions to a machine). As long as the physical and grammatical requirements of
readability are met, a pattern can perform either role. If a machine can read a pattern, the pattern may
be passively treated as data or actively read as one or more instructions. It is important
to note that as a consequence of Suber’s argument, data is ultimately a kind of software. Since data can
be read, a language may be constructed in which the patterns present in data act as instructions. In other
words, data, in virtue of having a pattern, meets the requirement of executability.
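Suber's role-based point can be illustrated in a language that, like LISP, lets programs be handled as data. The snippet below is my own sketch, not drawn from Suber; the string it uses is invented for the purpose.

```python
# The same pattern in two roles: first as data (characters to measure),
# then as software (instructions the machine carries out).

pattern = "print(sum(range(1, 11)))"

# Role 1: data — the pattern is merely something to inspect or copy.
print(len(pattern))   # → 24

# Role 2: software — the very same pattern is read as instructions.
exec(pattern)         # prints 55
```

Nothing intrinsic to the string changes between the two calls; only its role does, which is precisely Suber's claim about patterns generally.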
The third premise (P3) follows from two principles Suber calls the Sensible and Digital Principles.
The first states that “any pattern can be physically embodied” (p. 100). As Suber puts it:
[P]atterns that can be imagined can be drawn. Patterns that are conceivable but not imaginable (like
Descartes’ chiliagon or 1000-sided polygon) can be described in a notation that provides a complete
recipe for conception; and the notation can be drawn. If something cannot be conceived, it probably
does not deserve the name of pattern. And what is drawn is thereby given a physical representation
that can be read or decoded by suitably designed machine (p. 100).
6 Unless otherwise indicated, all references in this section are to Suber (1988).
Thus, since all patterns can be physically embodied, all patterns can satisfy the physical condition of
readability.
The Digital Principle states that any pattern can be represented as a digital pattern (p. 101). For exam-
ple, any analog pattern, such as a painting, can be digitized. Once digitized, the grammatical condition of
readability and the requirement of executability are met. For within a digital pattern, the necessary dis-
tinctions are present for constructing syntactic structures, and these syntactic structures make it possible
for the digital pattern to be read and executed. From the Sensible and Digital Principles, then, it follows
that all patterns can satisfy the physical and grammatical conditions of readability and the requirement
of executability.
Since Suber uses the term “pattern” in a broad sense, the fourth premise (P4) is rather uncontroversial.
Suber then deduces from (P3) and (P4) that all concrete objects can satisfy the physical and grammatical
conditions of readability and the requirement of executability (C1). Once (C1) is established, the last
premise (P5) and final conclusion (C3) are trivial. As Suber states, “Hardware, in short, is also software,
but only because everything is” (p. 102). Suber’s inference from (P3) and (P4) to (C1), however, is
flawed. The reason for this is examined in the next section.
2.2.1. Rejection that hardware is software
For all its ingenuity, Suber’s argument is not valid. The mistake in his reasoning most directly occurs
in the move from (P3) and (P4) to (C1):
(P3) All patterns can satisfy the physical and grammatical conditions of readability and the require-
ment of executability.
(P4) All concrete objects display patterns.
(C1) All concrete objects can satisfy the physical and grammatical conditions of readability and the
requirement of executability.
From the premise (P3) that all patterns can satisfy a certain property, and (P4) that all concrete objects
display patterns, it does not follow that all concrete objects can satisfy this property.7 To see this, consider
the following additional premise (P6) and conclusions (C4, C5):
(P1) Software is a pattern that can be read and executed.
(P6) A peanut butter sandwich is a concrete object.
(C4) Therefore, a peanut butter sandwich is software (from C2).
(C5) Therefore, a peanut butter sandwich is a pattern that can be read and executed (from P1 and C4).
Here, it is clear that a mistake has been made. A peanut butter sandwich is not a pattern. A peanut
butter sandwich possesses mass and provides sustenance, whereas a pattern per se is intangible. Thus,
from the premise that all concrete objects display patterns, it does not follow that all concrete objects are
patterns. Rather, the conclusion of Suber’s argument should be that all concrete objects display software
patterns.
So, how might a defender of Suber respond to my criticism? First, of course, one could charge that I
have not adequately represented Suber’s argument. To this I respond that even if I have misrepresented
some of the finer points of the argument, Suber’s conclusion that “everything determinate is software”
(p. 103) is clear. Thus, (C4) is still a valid inference, for from the assertion that everything is software, it
follows that a peanut butter sandwich is software.
7 Irmak (2012) makes the same point (see pages 58–59). However, he does not provide as much detail as I have. I discovered
Irmak’s paper after writing this section.
Second, one could hold that although (C4) sounds implausible, the argument is still, in fact, valid.
However, if one really wishes to hold to the view that everything is software, the question must be raised
as to how to distinguish the various types of software in the world. Suber’s argument not only asserts
that peanut butter sandwiches are software, but so are automobiles, shopping malls, and roller coasters.
By focusing solely on the nature of pattern and its role in defining software, Suber, like Moor, presents
an ontological perspective that does not mesh well with broader considerations about reality.
2.3. Software is a concrete abstraction8
Colburn (1999) argues that software is a concrete abstraction, that is, software has a dual nature that
consists of it being both concrete and abstract. Software, Colburn argues, is abstract in that it involves an
abstraction of content (p. 10). By this, Colburn means that abstraction in computer science, unlike ab-
straction in mathematics, does not eliminate empirical content. Rather abstraction in computer science
“enlarges the content of terms by bringing them to bear directly on things in a non-machine oriented
world” (p. 14). These abstractions of content are then what enable the computer scientist to create soft-
ware programs that can be implemented in physical hardware (p. 12). The implementation of software,
thus, creates an object that is both concrete, since it is stored in physical memory, and abstract, since what
the program describes is an abstraction of content (p. 14). Hence, Colburn’s use of the term ‘concrete
abstraction’.
For example, consider the following LISP code for finding the largest integer in a list:
(setf mylist '(1 4 2 12 6 7 9))
(setf largest (first mylist))
(loop for i in mylist do
  (if (> i largest) (setf largest i)))
This code contains instructions for setting the value of the variable largest to the value of the first integer
in mylist, iterating over the list, and updating the value of largest if a larger value (in mylist) is found.
For instance, if mylist holds the values 1, 4, 2, 12, 6, 7, 9, the value of largest will be 12 after executing
this code.
In this code, a significant amount of abstraction is at work. At the machine level, a computer consists
of a number of physical components, such as transistors. Instructions are input into the machine by
changing the physical states of the components, such as changing the charge held by a transistor, and
these changes (i.e., the input of the instructions) cause other changes to propagate through the machine.
But, a computer programmer does not necessarily have to be mindful of how the physical machine
works. The patterns of physical states that serve as instructions to the computer are represented by
the abstract descriptions formed by the statements made using the programming language (p. 12). The
programming language statements, however, do not eliminate the need for a programmer to reason about
how a machine would execute the statements. Rather, instead of thinking about how a particular machine
would execute the statements, the programmer can reason about how any machine could execute the
8 Unless otherwise indicated, all references in this section are to Colburn (1999).
statements.9 This move from thinking about a particular machine’s operations to those operations that
hold across computing machines in general is what Colburn means by an “enlargement of content”.
If we wished, we could modularize the behavior of this code by constructing a procedure, let’s call it
find-largest-integer, in which the list is given as a parameter:
(defun find-largest-integer (parameter)
  (setf largest-integer (first parameter))
  (loop for i in parameter do
    (if (> i largest-integer) (setf largest-integer i)))
  largest-integer)
The largest integer in the list would then be found by calling this procedure:
(setf mylist '(1 4 2 12 6 7 9))
(setf largest (find-largest-integer mylist))
In calling the procedure find-largest-integer, more detail is abstracted away from the process of how to
find the largest integer. For example, a programmer can use the find-largest-integer procedure without
knowing in which direction the list is searched (i.e., first to last versus last to first), or if the procedure
uses a loop or recursion to iterate over the list. This abstraction (i.e., find-largest-integer) does not elim-
inate the content of the procedure for finding the largest integer in a list. Rather, Colburn contends that
the content of this procedure has been enlarged so that we can now see a clear separation between how
procedures are defined and how they are called (pp. 13–14).
2.3.1. Critique of software being a concrete abstraction10
Colburn’s notion of software as a concrete abstraction is troubling on three counts. First, it is not clear
how abstraction in mathematics and computer science differ. Mathematics is often used to model reality.
For example, a statistical model can be used to predict the spread of an infectious disease. On Colburn’s
description of mathematical abstraction, a model of this sort would be aimed at “eliminating” empirical
content. I fail to see how this is the case. The statistical model, in fact, gives us better insight into how the
disease spreads. These insights allow us to investigate new ways to prevent and treat infectious diseases.
Thus, contra Colburn, the statistical model “enlarges” (in Colburn’s sense) the empirical content rather than
eliminating it. Perhaps it could be argued that mathematical models of this sort employ computational ab-
straction, but this would only make the distinction between computational and mathematical abstractions
more confusing.
Second, Colburn’s notion of abstraction in computer science is too dependent on the use of program-
ming languages. That is, he relies too heavily on the fact that we have designed programming languages
in a manner such that they can be executed on a physical machine. What he does not take into account is
that, in principle, we can also develop the means to execute mathematical symbols on physical machines.
9 I am assuming, of course, that it is possible to translate the programming language statements into a form that the machine
can process.
10 Unless otherwise indicated, all references in this section are to Colburn (1999).
For example, suppose we wish to define the summation of the first ten positive integers (i.e., 1 to 10).
We can use the sigma notation to express this operation as follows:
∑_{i=1}^{10} i
We may also define this operation using the following LISP code:
(loop for i from 1 to 10 summing i)
An important difference between the LISP code and the sigma notation is that the LISP code can be
translated, by means of a compiler (a kind of computer program), into machine-readable instructions.
However, this difference obtains only because some programmer (or team of programmers) has taken the
time to develop the compiler. There is no reason in principle why such a program could not be developed
for translating the sigma notation. Moreover, it is possible to develop programming languages for which
no compiler has yet been developed. Do statements in such programming languages also abstract in the
same way as programming languages for which compilers have been developed? Or are they more like
mathematical notations that in principle can be translated into machine code, but just have not been so
translated? The answers to these questions are not clear.
Last, even if we grant Colburn’s claim about abstraction in computer science, his description of the
ontological status of concrete abstractions is troubling. Instead of giving a clear ontological account
of concrete abstractions, he likens them to the pre-established harmony theory of mind (p. 15), and
specifically to the view that the immaterial mind and the physical brain do not causally interact. Rather,
according to this theory, mental and physical actions run in parallel in virtue of some kind of harmonious
relationship that guarantees that the actions of each are correlated. In a similar fashion, the abstract
content of software and the physical medium of hardware are seen by Colburn as acting in concert. But
what does this say about the ontological status of software? If we take software to have some kind of
dual nature, as Colburn contends, then we are left wondering about the relationship between the abstract
features of software and their physical implementations. Perhaps Colburn is pointing to the fact that we are
capable of building physical artifacts that behave according to our abstract specifications. But, if this is the
case, then is there anything unique about how this obtains in the domain of software? After all, the same
phenomenon occurs when we build all sorts of artifacts, including bridges and skyscrapers. Software
programs thus seem to be a particular case of a more general phenomenon. Colburn leaves too many of
these questions unanswered for his account of software to be ontologically satisfying.
2.4. Software is a specification11
Turner (2011) argues that the nature of computational artifacts is determined by specifications. A spec-
ification, for Turner, is something that has “correctness jurisdiction over an artefact” (p. 147). By “cor-
rectness jurisdiction”, Turner means that the specification places “empirical demands on the physical
device” (p. 144). If the physical behavior of the machine does not meet the demands of the specification,
then the physical machine is defective in some way. For example, if I build a physical implementation of
11 Unless otherwise indicated, all references in this section are to Turner (2011).
a stack and the device does not allow me to add and remove items from the top of the device, my device
is defective relative to the specification of a stack.
In order for specifications to have correctness jurisdiction over physical machines, there are a number
of important features that specifications must have. First, specifications must be abstract. The reason for
this as pointed out by Kripke (1982), and expanded on by Turner (2010, 2011), is that our justification
for asserting that a physical machine has malfunctioned is not the machine itself. Rather, argues Turner
(my emphasis added):
Whether a machine malfunctions is not a property of the machine itself but is determined by its
specification. Indeed, this normative role of specification makes it evident that a specification not
only can be viewed as the definition of an abstract device, but must be so viewable. Physical devices
could not themselves act as definitional mediums. Otherwise, we could not provide a stable notion
of correctness: what is true of the physical device today might not be so tomorrow (p. 141).
A potential point of contention in this passage is Turner’s use of the term ‘abstract’. After all, the
history of philosophy is filled with debates about what it means for an object to be abstract. However,
there is general agreement that whatever abstract objects are, they are causally inert and can only be
investigated through conceptual or mathematical means (p. 143).
Second, specifications must have what Turner calls a “logical core” (p. 138). Since specifications
are causally inert, their correctness cannot be established by empirical tests. Rather, a specification’s
correctness is established only by mathematical analysis (p. 143). To see this, consider the following
specification for a device that Turner calls “C” consisting of a store and two operations on the store
(p. 136):
1. The store has named locations that hold numerical values. Each location holds at most one value.
2. A Lookup operation extracts the content of named locations.
3. An Update operation provides the means of changing the contents of a location.
4. When the location’s contents are changed, the contents of the new location should be the only thing
changed.
This specification has a logical core that permits us to reason about C’s characteristics and potential
behaviors, such as the need to specify what to do when the location we are examining is empty (p. 137).
If we wished to make our reasoning about C more precise and explicit, we could define the specification
using some logical language (such as first-order logic). This would allow us to build a logical theory
about C.12 The important point to make here is not which language (formal or informal) is used to express
the specification, but that the abstract nature of a specification permits us to subject the specification
to logical (or mathematical) analysis.
The third key feature of specifications is that they play a practical role in the construction of physical
machines (p. 140). A specification, says Turner:
[points] beyond itself in that it is taken to tell us what to build or construct. A specification provides
a description of the proposed artefact. In turn, this provides substance to their logical role which is
to provide a criterion of correctness or malfunction for purported artefacts. In this role, its function
is not mathematical or formal but practical (p. 140).
12 The details of doing this are beyond the scope of this paper, but the interested reader may consult the original paper or
Turner’s 2009 book Computable Models.
For example, if an engineer uses the specification for C (above) to build a machine, the specification
tells her what components she needs to construct and what behavior the machine should have. That is, the
components of the machine must be capable of storing values, looking up values, and updating values.
The proper functioning of the machine is then determined by the specification for C. For instance, if the
Lookup operation for this machine fails to retrieve the contents at the specified location, something has
gone wrong. And the reason this counts as a malfunction is because it fails to meet the demands of the
specification for the machine.
Last, whether something counts as a specification depends on our giving it correctness jurisdiction
over something else (p. 147). A definition or description does not by itself have normative force. It is our
taking a definition or description to be a standard of correctness that provides this. For example, consider
the following LISP code:
(pprint "hello world")
When executed on an appropriately configured machine, the output of this code is straightforward: “hello
world” is printed to the screen.13 Now, suppose that a machine is not capable of running LISP code, but
can execute C++ code. We may then implement the LISP “hello world” code in C++ as follows:
#include <iostream>
int main() { std::cout << "hello world"; }
Whether the C++ program correctly implements the LISP program is determined by the correctness
conditions provided by the LISP code, namely, that the C++ program contains instructions for printing
the words “hello world”. The important point to grasp is that the LISP program is a specification for the
C++ program because we take the LISP program to have the requisite normative force. In this example,
the situation can easily be reversed. The C++ program can be taken to be the specification for the
LISP program.
With these four key features of specifications in mind, let us apply Turner’s notion of specification to
hardware and software. An example of how the specifications relate to hardware has been supplied in the
specification for C. When taken as a specification for a physical machine, C defines a device that stores,
retrieves, and updates values stored in particular memory locations. Many computing devices have com-
ponents that perform similar operations, although their specifications are more complex than C. For
instance, the architecture for a central processing unit (CPU) defines, among other things, the instruction
set for storing values and accessing memory, the number of registers, and the kinds of data types that the
CPU can process.
Now let us consider software. There are two cases I wish to consider. The first is when something
places demands on what behaviors the software should encode. Again, the specification for C can serve
as an example. If we take C to be a computer program instead of a physical machine, then this specifi-
cation plays a critical role in determining the content of the software. For example, if the software does
not contain code for retrieving values from memory locations, then the software is defective relative to
C’s specification.
13 Strictly speaking, the string “hello world” is sent to the standard output of the computer system. In some cases, this is not
the computer system’s monitor. However, for purposes of this paper, we will assume that the computer system’s monitor is the
standard output.
The second case is when the software makes empirical demands on a physical device. A software
program consists of instructions written in some programming language. Because of how program-
ming languages are designed, the instructions expressed using a programming language are computable,
meaning that they can (in principle) be carried out on a Turing machine (or an equivalent formalism).
When loaded onto a physical machine, the software then becomes a specification for the machine that
executes the instructions. For example, the LISP “hello world” code (above) requires that in order for
a physical machine to correctly execute it, “hello world” must be printed to the screen. A physical ma-
chine that fails to execute this code is defective in some way. This could be because the machine is
physically damaged or, perhaps, the machine does not have other supporting software installed, such
as a LISP interpreter, that is needed to execute the instructions. What is important in the context of the
current discussion is that the malfunction occurs because the standards of correctness given by the LISP
code could not be met. The code is a specification against which the proper behavior of the machine is
determined.
Before closing this section, let us consider a possible objection to the requirement that specifications
be abstract. Suppose an automobile enthusiast wishes to replicate an antique car. In this case, is the
physical object (i.e., the original car) serving as a specification for the replica? Although it is tempting
to answer “yes” to this question, a little reflection shows that an abstract notion of specification is still
at work. Consider the following two points. First, what happens if the original car changes (after a copy
has been made)? Does the replica become defective because it no longer mimics the original physical
object? The answer to this is “no”. Rather, we recognize that the replica is still correct relative to the
original version of the automobile. This original version no longer physically exists, but what does exist
is our abstract representation of this original version. And from the standpoint of taking this abstract
representation as the specification, the replica remains correct. Thus, the original physical automobile
did not in and of itself act as a specification for the replica. For, if this were the case, physically changing
the original automobile would invalidate the replica.
Second, we must ask whether the original automobile can malfunction. If its engine starts leaking oil
does this count as a defect? The answer to this is, of course, “yes”. The reason this counts as a defect
is because we understand how the engine should operate, and this normative judgment is based on our
abstract understanding of a properly functioning car engine. If our determination of malfunction was
based solely on the physical automobile itself, we could not determine that the leaking of oil was a
defect. Thus, even in the case of copying physical objects, it is our abstract understanding of these
objects that acts as a specification.
2.4.1. Critique of software being a specification
I find Turner’s arguments convincing. His notion of specification has strong intuitive appeal (i.e.,
specifications giving correctness conditions), and he avoids having to make sense of Colburn’s concrete
abstractions. He sidesteps the issue of what, ontologically speaking, an abstract object is, but he makes
clear that such abstract objects are accessible. Although abstract specifications are not physical, they
have a logical core that permits us to subject them to logical investigation. The question of whether his
notion of abstract objects depends on the physical world is left open. Colburn’s notion of abstract objects
is also left open, but since there is no account of correctness, it is very different from Turner’s.
A shortcoming of Turner’s theory, however, is that in order to make ontological distinctions between
hardware and software, we need an ontological theory, and Turner does not give us one. That is, although
Turner has shown us the important role that specifications play in our understanding of computational
artifacts, he has not provided us with a detailed account of what kinds of entities hardware and software
are. Presumably, hardware is a kind of physical entity, but so is software when considered as the magnetic
patterns that are stored on a hard drive or the pits and lands on a DVD. This ontologically blurs the
distinction between the physical entity that encodes programming language instructions and the specifi-
cation that determines the correct execution of a program’s instructions. In order to make this distinction
clear, an ontological account needs to be given for specifications, hardware, and software.
3. Ontologies of software
Thus far, my review of the relationship between hardware and software has focused mainly on review-
ing broad metaphysical accounts of hardware and software. I have not been concerned with providing a
systematic account of the categories under which hardware and software fall and how these categories
are related. In other words, I have not been focused on reviewing ontological accounts of hardware and
software. My reason for this is that in order to develop a systematic account of hardware and software,
we need to first have a solid understanding of the metaphysical issues surrounding them. However, there are a
handful of ontologies for hardware and software. These ontologies include: Lando et al. (2009), Oberle
et al. (2009), Malone et al. (2014), Irmak (2012), and Wang et al. (2014). Although each of these on-
tologies provides an interesting systematic account of software, providing a detailed
analysis of them would be too time consuming. Instead, I will focus on some commonalities that these
ontologies share.
One common theme that these ontologies of software share is that they recognize the software’s in-
structions (in the informational sense of ‘instruction’) as being distinct from the particular code used
to express the instructions. For instance, Lando et al. (Section 3.1) defines a computer program as both
an “Artifact of Computation” and “Computer Language Expression” (Lando et al., 2009, p. 6). A com-
puter program is a “Computer Language Expression” because of the syntactic nature of programming
languages used to write computer code, and an “Artifact of Computation” because the computer lan-
guage expressions tell agents (machines or humans) how to compute. Oberle et al. makes a distinction
between code as a kind of information object, and the realization of the code in some concrete object
(Oberle et al., 2009, p. 386). Malone et al. define software as an entity that implements an algorithm
and is encoded in some programming language (Malone et al., 2014, pp. 5–6).14 The algorithm, which
is a primitive, provides the instructions expressed by the programming language statements. Irmak ar-
gues that software is an abstract artifact. Software is an artifact because some agent creates it for some
purpose. But, software is also abstract because of its non-spatio-temporal properties, with particular
program files being concrete spatio-temporal objects that encode software (Irmak, 2012, p. 58). Lastly,
Wang et al. incorporates Irmak’s notion of software as an abstract artifact and adds more rigor to Irmak’s
idea by making use of Baker’s constitution relation (Wang et al., 2014, pp. 5–6). A computer program,
for Wang et al., has one or more proper functions, and, thus, a computer program an artifact. Computer
code, on the other hand, is not necessarily an artifact because lines of computer code may be created
by chance (Wang et al., 2014, p. 5). Computer code created in this manner lacks a proper function,
and therefore is not an artifact. By making distinctions between an abstract computer program and a
particular encoding of it, Wang et al. and Irmak can distinguish between the instructions of the abstract
computer program and the programming language code that is used to express the computer program’s
instructions.
I agree with the need to distinguish between an abstract set of instructions and the various ways we can
encode these instructions. Thus, although I have disagreements with the ontological approaches defined
14 Other relations are defined, but these seem most essential.
in the above ontologies, I view the distinction between software instructions and encodings as a common
strength these ontologies share. In the remainder of this section, however, I will ignore the artifactual
nature of software programs (i.e., as material entities that bear encoded instructions), concentrating
rather on software code. Concerning the latter, there are two areas in which these ontologies fall short.
The first is that the ontologies are either silent about the distinction between hardware and software code,
or they do not develop the framework necessary to make the distinction. For example, Oberle et al. and
Wang et al. distinguish between software and the physical objects that bear inscription of software code.
However, they do not further develop this distinction, and are therefore subject to Moor’s contention that
the distinction between hardware and software is not ontologically significant.
Second, most of the above ontologies do not recognize the crucial role that specifications have in
defining the notion of software code. Wang et al. do mention software program specifications.
However, they do not fully explore the normative dimension of specifications, and do not seem aware
that in Turner’s theory a software program’s code is a specification. Rather, they define a specification as
being separate from the software program (Wang et al., 2014, p. 5). With these considerations in mind,
let us now turn to giving an ontological account of hardware and software that gives adequate attention
to specifications.
4. Building an ontological account of hardware and software
4.1. Need for ontology
Above, I have argued that part of the shortcoming with the positions of Moor, Suber, Colburn, and
Turner is that they are not ontologically rigorous. Moor’s and Suber’s treatments of hardware and software do
not hold up well against more general considerations of reality. Colburn’s description of software as a
concrete abstraction is difficult to understand, and although Turner provides insight about the relation-
ship between abstract specifications and concrete computing machines, more detail is needed in order to
ontologically differentiate between hardware and software. If my criticisms are to have force, however,
I need to provide an ontological account of the kinds of entities hardware and software are and how they
are related.
For this account, I will be working with Turner’s position that software is a kind of specification.
This will require an analysis not only of what kind of entity a specification is, but also a number of
related entities, including artifacts and Turner’s notion of “correctness jurisdiction”. Once this analysis
is complete, we will then be able to formulate definitions that account for hardware and software.

4.2. Analysis of specifications

To begin constructing this ontological framework, let us start by analyzing specification. Recall, a
specification for Turner is an entity that holds correctness jurisdiction over an artifact (Turner, 2011,
p. 197). This means there are at least three entities that need to be ontologically accounted for:
1. The artifact that is governed by the specification.
2. The notion of correctness jurisdiction.
3. The thing that holds the correctness jurisdiction over the artifact.
Let us look at each in turn.

4.3. Artifacts

The philosophical literature on artifacts is quite rich.15 Thus, providing a detailed analysis of artifacts
would be a lengthy undertaking that would not necessarily serve us well. Rather, we will restrict the use
of the term ‘artifact’ to those entities that are:16
1. Physical in nature, that is, spatially extended entities that have shape (and often a mass, although
not necessarily).
2. Intentionally modified by an agent in order to serve in achieving some end.
3. Recognized within some community as having been created to serve in achieving a particular end.
The first of these restrictions entails that certain abstract entities that are sometimes referred to as
artifacts, such as algorithms and scientific theories, are not within the present scope of artifacts under
discussion. This is a somewhat more narrow sense of ‘artifact’ than considered by Turner.
The second restriction limits our discussion of artifacts to objects that were intentionally created for
some purpose. A hammer is a material object that has been modified so that we can pound nails. A hard
drive is a material object that has been modified so that it can store information in a magnetic medium.
An object created for one purpose may be appropriated for some other use, for instance a hard drive may
serve as a doorstop, but this is not the intended use of the object.
The third restriction addresses the social dimension of artifacts. An agent may modify a physical
object to serve an immediate need. For example, I may sharpen the end of a stick so that I can more
easily cook a hotdog over a campfire. However, the purpose of an object modified in this immediate way
does not necessarily gain widespread recognition within the agent’s community. Someone else finding
my sharpened stick may think that the stick was used as a tent stake. Artifacts, in general, have a wider
scope of applicability. We recognize that pliers are good for gripping and keyboards are used for typing
because the intended uses of these objects have been communicated to us by others within the community.
The process by which an object comes to be understood by the community as being designed to serve
some purpose is a complex issue worthy of investigation in its own right. However, for now, it will have to
suffice for us to rely on our intuitions about what kinds of objects have this status.

4.4. Correctness jurisdiction as a normative role

Next we turn our attention to Turner’s notion of correctness jurisdiction. To discern what kind of entity
this is, I will draw upon some basic distinctions made by the Basic Formal Ontology (BFO) (Arp et al.,
2015). BFO makes a distinction between dispositions and roles. Both dispositions and roles are a kind of
realizable entity. A realizable entity in BFO is an entity that is realized (made manifest or exhibited) by
certain kinds of corresponding processes (Arp et al., 2015, p. 98). For example, the disposition of water
to boil is realized by the process of heating some portion of water to the requisite temperature. The role
of being an employee (at a company) is realized when a person performs his or her work duties.
The distinction between dispositions and roles lies in how they are grounded. Dispositions are inter-
nally grounded, meaning that the behavior exhibited by a disposition is based on the physical makeup of
the entity that possesses the disposition (Arp et al., 2015, p. 101). Because of this, an object cannot cease
to possess a disposition without it being physically changed. Some portion of water, for instance, cannot
15 Prominent examples in the literature include Dipert (1993), Millikan (1999), Baker (2007), and Houkes and Vermaas
(2010).
16 This is mostly in line with the conception of artifacts discussed in Dipert (1993).
lose its disposition to boil without its physiochemical structure being changed. We can, of course, have
some control over the conditions in which a disposition is expressed. For example, we can change the
air pressure surrounding some portion of water and thereby change the temperature at which the water
will boil. But this does not remove the water’s disposition to boil.
Roles, on the other hand, are externally grounded, meaning that an entity comes to possess a role due
to something that obtains or happens external to it (Arp et al., 2015, p. 98). A person, for instance, is an
employee at a company (i.e., takes on the role of being employed at a company) because of an agreement
reached between the person and the company. The person agrees to perform certain kinds of work, and
the company agrees to compensate the person for this work. Since roles are externally grounded, they
are not based on the physical makeup of the entities that possess them. This entails that an entity may
lose a role without it necessarily being physically changed. For example, a person may cease to be an
employee due to the company going bankrupt. But the loss of this role (i.e., being an employee) does
not necessitate a physical change in the person.
With this distinction between disposition and role in mind, let us turn now to Turner’s notion of
correctness jurisdiction. As noted by Turner, if an entity x has correctness jurisdiction over another
entity y, then x makes empirical demands on y (Turner, 2011, p. 144). That is, x has normative force
over the construction or behavior of y, and x has this force because of conditions external to it, namely
that we treat x as having this kind of authority (Turner, 2011, p. 147). The language used by Turner
suggests that his notion of correctness jurisdiction falls within BFO’s role category. But the similarity
runs deeper than language. It is metaphysical too. An entity can lose its correctness jurisdiction without
being physically changed. For instance, a blueprint has correctness jurisdiction over how a building is
constructed. But, if the blueprint is discovered to contain a mistake (e.g., the dimensions of a room are
incorrect), it will no longer serve as a normative guide for constructing the building. The blueprint itself,
however, does not necessarily physically change. It may simply be set aside while another blueprint is
drawn up. In other words, the blueprint’s loss of correctness jurisdiction over the construction of the
building did not require a physical change to the blueprint itself.
In a similar manner to the blueprint, software programs have normative force over the behavior of
the physical machines that execute them. A machine reads the instructions encoded in the software and
carries out the instructions accordingly. If the machine fails to execute the instructions, we determine
that the machine has malfunctioned. But, if the software is found to contain a mistake (i.e., a “bug”) we
no longer take the machine to be malfunctioning. Rather, the software loses its correctness jurisdiction
over the expected behavior of the machine.17 The software program remains physically unchanged, but
its normative role in governing the behavior of the machine changes. The machine is behaving badly,
but only because the software program is giving it bad instructions.
Thus, we have arrived at an ontological account of Turner’s notion of correctness jurisdiction. It is a role,
which I will simply call a ‘normative role’, in which an entity is given (or granted) the authority to govern
how a physical device is constructed or behaves. There is, however, one additional adjustment that needs
to be made to this account. Turner’s explication of correctness jurisdiction focused specifically on the
relationship between specifications and physical devices. This focus on physical devices is too limiting,
for not every specification is clearly aimed at governing a physical device. For instance, a blueprint
is a specification for the dimensions of a building, but a building is not a physical device in the usual
sense. Moreover, we can have specifications of specifications. For example, a specification for a software
17 The question of whether a software program holds correctness jurisdiction over a machine is not always satisfied by a
simple “yes” or “no” answer. Often, many features of a software program run as intended, with the incorrect instructions
affecting only a subset of the program’s intended functionality.
program may state what kind of operating system the software should be compatible with. To address
this, I will broaden the term ‘normative role’ so that it applies to entities in general, not just physical
devices. That is, entities possessing normative roles may determine the correct structure or behavior of
a wide range of entities, including but not limited to physical devices.

4.5. Relating specification and normative roles

In the previous section, I argued that Turner’s notion of correctness jurisdiction is a normative role.
If we take this to be the case, the next issue to be addressed is how specifications and normative roles
are related. In order to determine this, we need to be careful to distinguish between the information
that tells us the requirements that are enforced and various patterns of symbols or markings that we
use to represent this information. As an example, consider a word processing program that has been
installed on a typical laptop computer. Within the laptop, the software program exists as a series of
magnetic patterns on its hard drive. The structure of these magnetic patterns is such that they represent
detailed information about how to perform certain operations, and when these magnetic patterns are
read (or input) into other components in the computer system, the components within the laptop behave
accordingly. In other words, the magnetic patterns encode instructions in a format that the computer
system has been designed to read and execute. The software program’s instructions, however, are not
identical with the magnetic patterns that encode the instructions. A particular instruction may exist in a
number of different ways. It may exist as a magnetic pattern on a hard drive, pits and lands on a DVD,
a series of holes in a punch card, or ink on a sheet of paper. A particular encoding of an instruction,
such as printed words in a textbook, however, exists only if the object that the encoding is part of exists.
Thus, there are two different senses of ‘instruction’ that we have to be mindful of. The first sense is that
of instructional information, and this information can exist simultaneously in many different kinds of
media. The second sense is that of the particular physical patterns that are created on a physical object
by encoding the instructional information. In order to keep these senses from getting confused, I will
use the term ‘instruction’ to refer to the informational sense of the term, and the term ‘concretization’ to
refer to the patterns that encode information.18
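To make the distinction vivid, consider a small sketch (my own illustration in Python, not part of the paper’s apparatus; the instruction string and the choice of encodings are hypothetical): one and the same piece of instructional information can be given several physically distinct encodings, each of which could be created or destroyed independently of the others.

```python
import base64

# One instruction in the informational sense: a single piece of
# instructional information, independent of any particular medium.
instruction = "PRINT 'hello world'"

# Three distinct concretizations of that same instruction: different
# patterns, each of which encodes the same information.
as_utf8_bytes = instruction.encode("utf-8")           # e.g., magnetization on a disk
as_hex = as_utf8_bytes.hex()                          # e.g., ink on a sheet of paper
as_base64 = base64.b64encode(as_utf8_bytes).decode()  # e.g., text in an email

# The patterns themselves differ...
assert as_utf8_bytes != as_hex.encode() != as_base64.encode()

# ...but decoding any one of them recovers the same instructional information.
assert bytes.fromhex(as_hex).decode("utf-8") == instruction
assert base64.b64decode(as_base64).decode("utf-8") == instruction
```

Deleting any one of the three encodings leaves the other two, and the information they encode, intact; this is the sense in which the instruction is not identical with any particular pattern that encodes it.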
This distinction between instructions and concretizations may tempt one to equate the relation be-
tween these entities with the relation between universals and instances. However, this is not the case.
Although a particular concretization may encode information, the concretization is not itself an instance
of information. To see this, consider a hypothetical programming language P and a concretization that
encodes an instruction written in P to print the words “hello world”. Now suppose that P ceases to exist
as a programming language. In such a case, the patterns that encoded the instruction would continue to
exist even though the instructional information no longer existed. If the relationship between an instruction
and its concretization were the same as the relationship between universals and instances, this could not
happen, since instances exist in virtue of the universals they instantiate.
With this distinction between instructions and concretizations in place, we are now in a position to
consider how normative roles and specifications are related. There are two options available: (1) the
concretization of an instruction bears the normative role, or (2) the instructional information bears the
normative role. Of these, I submit that (1) is the more viable option. My reasons for this are as follows.
First, as noted, instructions can exist in a number of different formats. If we allow that instructions may
make empirical demands on a device, then it opens up the possibility that a particular instruction may at
18 The term ‘concretization’ is found in the Basic Formal Ontology (BFO) (Smith et al., 2013, p. 63).
the same time both have and not have a normative role. For example, a particular instruction encoded on
a computer’s hard drive makes empirical demands on the other components of the computer system. But
the same instruction painted on a canvas as a work of art does not make such demands. The difficulty of
making sense of how the same instruction could both have and not have (at the same time) a normative role
in determining the behavior of a physical object makes instructions an implausible candidate for having
such a role. Second, the non-physical nature of information makes it unclear how an instruction could be
placed in the circumstances necessary for it to come to have a normative role. In other words, if roles are
externally grounded entities, this means that there is something external to the entity bearing the role that
gives rise to the creation of the role. However, it is difficult to make sense of something being external
to an instruction, for it is unclear what the boundary conditions would be for demarcating an instruction’s
internal and external boundaries.
Concretizations, on the other hand, do not give rise to these kinds of difficulties. Since a concretiza-
tion occupies physical space, it cannot simultaneously exist in different locations, and at least at the
macroscopic level of reality, we can determine boundary conditions between spatial entities. Thus, it is
concretizations of instructions that bear normative roles, for the spatial nature of concretizations allows
us to clearly determine which entity bears the role. The pits and lands on a particular DVD, for instance,
have correctness jurisdiction over a machine because of their normative role in governing the activities
of the computer system. In time, DVDs, like punch cards and floppy disks, will cease to be used as the
means for loading information into a computer system. When this happens, the DVD’s pits and lands
will no longer convey instructions to a computer system, and therefore they will lose their normative role
in governing the behavior of a machine.
A point of contention here is that in having the concretizations bear normative roles, I am open to the
criticism of having the physical object be the standard of correctness. That is, I am open to Turner’s crit-
icism (discussed above) regarding the abstract nature of specifications. However, this criticism confuses
the instructions with the normative role borne by the concretizations. The instructions are still abstract in
Turner’s sense of the term. We can subject the instructions to mathematical analysis. These instructions only
become authoritative when their concretizations take on a normative role. This role is then realized (or
exhibited) when the behavior or structure of some physical device is evaluated according to the information
communicated by the concretizations. The role itself, however, does not have instructional information.
Lastly, before leaving this section, a potential confusion regarding the distinction between an arti-
fact’s function and its role needs to be addressed. Artifacts, as discussed above, are material entities that
have been selected to serve in the attainment of some end. In many cases, the process of designing
an artifact involves selecting and enhancing an entity’s dispositions so that the manifestation of these
dispositions results in behaviors that serve the designer’s goals. For example, in designing a hammer,
certain materials are chosen because of how they react when striking another object, and the shape of
the hammer enhances this behavior. A disposition of an artifact that has been selected and, possibly,
enhanced in this way, I shall simply call an artifact’s function.
A computer system consists of a number of physical components. Each of these components has one
or more functions that contribute to the operation of the machine. For example, a hard drive has the
function to store and retrieve information encoded in a magnetic medium. It does this by reading and
changing the polarization of regions on the magnetic platters that are part of the hard drive. The hard
drive’s function to manipulate magnetic patterns in this way, then, contributes to the overall operation of
the machine by providing a store where instructions can be saved and retrieved.
The functions of an artifact, however, are distinct from the role an artifact may have. A hard drive’s
function is to store and retrieve magnetic information. But whether the electronic signals that result from
the hard drive reading the magnetic patterns from its platters should result in certain other operations
being performed by the components of the computer system is a separate issue. Typically, we hold that a
properly constructed computer system will behave according to the instructions that have been encoded
on the hard drive. But this is because the hard drive has a certain role within the computer system as a
whole.

4.6. Programming language specifications and software instructions

In the preceding sections, I argued that concretizations which encode instructions possess norma-
tive roles. If this account is going to play a part in our ontological account of software, some account
needs to be given of how these concretizations are related to the programming languages used to encode
the instructions. Generally, we think of a programming language as an artificial language that we use
to tell a computer (or machine) what to do. Unfortunately, since this way of thinking about program-
ming languages appeals to our understanding of what a language is, it does not ontologically tell us all
that much unless we also provide some metaphysical account of what a language is. Rather than delve
into the thorny issue of what a language is, I will focus only on programming language specifications.
These specifications give the syntactic rules for creating valid programming language expressions and
the semantics for determining the operations that should be carried out when executing a programming
language expression. For instance, the LISP expression “(+ 1 2)” is valid because it follows the syntactic
rules provided by the LISP programming specification.
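The same point can be illustrated against an actual language specification. The sketch below (my own, not part of the paper) uses Python’s standard `ast` module, which checks a string against the grammar laid down in the Python language reference:

```python
import ast

def is_syntactically_valid(expression: str) -> bool:
    """Return True if the string is a well-formed expression according to
    the syntactic rules of the Python language specification."""
    try:
        ast.parse(expression, mode="eval")
        return True
    except SyntaxError:
        return False

# "(1 + 2)" satisfies Python's syntactic rules, just as "(+ 1 2)"
# satisfies LISP's; "(+ 1 2)" is not a well-formed Python expression.
assert is_syntactically_valid("(1 + 2)")
assert not is_syntactically_valid("(+ 1 2)")
```

Which strings count as valid is settled not by any machine but by the language specification the parser implements; the parser merely applies its rules.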
Programming language specifications fall under Turner’s notion of ‘specification’, for they have au-
thority in determining whether the syntax of a given program is valid and whether a system designed
to execute the programming language expressions is behaving correctly. Moreover, in the account I am
advocating here, it is the concretizations of programming language specifications that possess normative
roles. For example, the Java Language Specification19 is a particular concretization that specifies the
syntax and semantics for creating and using Java expressions.

5. Defining an ontology for hardware and software

5.1. Definitions

Thus far, our examination of Turner’s notion of ‘specification’ has resulted in the need to define the
following entities: artifact, realizable entity, role, normative role, disposition, function, instruction, and
concretization. Taking into consideration the above discussions of these entities, we define them as
follows:
1. artifact: a material entity that has been intentionally selected by some agent to serve in attaining
some end and is recognized in some community as having been created for some purpose
2. realizable entity: an entity that is realized (made manifest or exhibited) when its bearer participates
in certain kinds of processes
3. role: an externally grounded realizable entity that characterizes how its bearer behaves in certain
situations
4. normative role: a role that characterizes its bearer as having the authority to govern what the correct
structure or behavior of an entity should be
19 https://docs.oracle.com/javase/specs/jls/se8/html/index.html
5. disposition: an internally grounded realizable entity that characterizes how its bearer behaves in
certain situations
6. function: a disposition that has been selected either by intentional design or nature
7. instruction: information that tells how to do something or perform some operation
8. concretization: a pattern that is displayed by a material entity and encodes information
Using definitions 1–8 as a basis permits us to define the terms ‘specification’, ‘programming language
specification’, ‘programming language expression’, and ‘programming language instruction’ as follows:
9. specification: information whose concretizations bear some normative role
10. programming language specification: a specification that provides:
a. the syntactic rules that an expression must satisfy in order to belong to a particular program-
ming language
b. the semantics for determining the computational operations that result from executing such
expressions
11. programming language expression: information which is such that the syntax of its symbols is
governed by some programming language specification
12. programming language instruction: a programming language expression whose concretization
bears a normative role that is realized by processes that govern the behavior of an entity perform-
ing symbolic manipulations in virtue of the programming language specification used to form the
programming language expression
Finally, based on definitions 1–12, we can now define the terms ‘software program’ and ‘hardware’ in
the following manner:
13. software program: a specification that consists of one or more programming language instructions
and whose concretization is embodied by an artifact that is designed so that a physical machine
may read the concretized instructions
14. computer hardware: an artifact whose functions are realized in processes that directly or indirectly
bring about the result of some calculation

5.2. Discussion of definitions

Definitions 1–14 lay out an ontology that accounts for software programs and computer hardware. In
the definitions, we find a number of distinctions. The first is that a software program is a specification,
whereas hardware is not. This entails that a concretization of a software program bears a normative role,
and this role permits the software program (in virtue of its concretization) to make empirical demands
on hardware. For example, when a software program is concretized in a machine’s electronic memory,
and the CPU fails to execute the software program’s instructions, we determine that the CPU is malfunc-
tioning because its behavior does not comply with the instructions expressed by the concretizations of
the software program’s instructions.
The second is that although a software program is a specification, the programming language instruc-
tions must be encoded in a medium that the hardware can read and execute. This addresses the artifactual
nature of software programs. Not every concretization of a programming language expression is a soft-
ware program. Computer code, for instance, that is printed on T-shirts or coffee mugs does not meet
the conditions necessary for being a software program. For, although in theory it is possible to design a
machine to read the programming language instructions, these entities were not specifically designed
for this purpose.
The third is that hardware possesses functions, whereas the software program’s instructions do not.
The function of an arithmetic logic unit (ALU), for instance, is to carry out mathematical operations
on integers encoded as electronic signals, and the ALU has this function because of how it has been
physically constructed. In contrast, programming language instructions are a type of information. Since
information is not a kind of material entity, it cannot possess functions, in the sense of a function being
a selected disposition. This, however, raises an interesting issue of whether the concretization (of pro-
gramming language instructions) can bear a function. It is clear that in many cases such concretizations
do not possess functions. For example, if the instructions are printed on a sheet of paper, this concretiza-
tion does not possess functions for carrying out the instructions. But, consider the case in which the
programming language instructions are modeled as a series of logic gates. Is this a concretization of the
instructions? Does it possess the function to carry out the operations that result when electronic signals
are passed through the logic gates?
To answer these questions, we must draw a distinction between the patterned way in which the logic
gates are connected and the logic gates themselves. The pattern of how the logic gates are connected is
the concretization of the program language instructions, and as such does not possess functions. This
pattern is distinct from the physical logic gates that do in fact route and modify electronic signals.
To see this distinction, consider a case in which a series of logic gates is correctly connected (i.e., the
pattern of connections is correct), but one of the logic gates in the series is defective. In such a case, the
defective logic gate is the reason for the malfunction, not the pattern. If the concretization of the program
language instructions (i.e., the pattern of connected logic gates) and the logic gates (themselves) were
identical, it could not be the case that the pattern was correct while the logic gate was defective. Thus, in
cases in which program language instructions are modeled as physical components, such as in a series
of logic gates or circuits, it is the pattern of the components that is the concretization of the instructions,
and this pattern does not possess functions to carry out calculations. Rather, it is the functions of the
components that possess the capacity to carry out the calculations.
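The argument can be sketched in code (a hypothetical illustration in Python; the gate functions and the half-adder wiring pattern are my own): the pattern of connections is a piece of data, distinct from the gates it names, and the same correct pattern can be realized by working or by defective components.

```python
# Gate components: these actually transform signals.
def and_gate(a: int, b: int) -> int:
    return a & b

def xor_gate(a: int, b: int) -> int:
    return a ^ b

def defective_xor_gate(a: int, b: int) -> int:
    return 0  # physically broken: always outputs 0

# The wiring pattern of a half adder (the concretization): which kind of
# gate feeds which output. It is the same whether the gates work or not.
HALF_ADDER_PATTERN = {"sum": "xor", "carry": "and"}

def run_half_adder(gates: dict, a: int, b: int) -> dict:
    return {out: gates[kind](a, b) for out, kind in HALF_ADDER_PATTERN.items()}

# Same correct pattern, different components:
good = run_half_adder({"xor": xor_gate, "and": and_gate}, 1, 1)
bad = run_half_adder({"xor": defective_xor_gate, "and": and_gate}, 1, 0)

assert good == {"sum": 0, "carry": 1}  # working gates: 1 + 1 = 10 in binary
assert bad == {"sum": 0, "carry": 0}   # defective gate: sum should have been 1
```

The malfunction in the second run is attributable to the defective gate, not to `HALF_ADDER_PATTERN`, which is identical in both runs.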


Before closing this section, there are three points to make about the definitions. First, not every term
used in the definitions is defined, for example, the terms ‘information’, ‘pattern’, ‘encode’, ‘calculation’,
and ‘material entity’. The lack of definitions for these (and other) terms, however, is not damning. Any
rigorous ontological analysis of software and hardware will encounter these terms (or similar terms),
and, thus, I have chosen in my analysis to leave these terms as primitive or undefined.
Second, I do not give an account of computation. My reason for this is similar to the point in the
previous paragraph. Any rigorous analysis of hardware and software will involve some understanding
of computation. In this analysis, I have chosen to let our understanding of computation operate as a
primitive notion. Engaging in a detailed analysis of computation would involve providing an account for
a number of abstract entities, such as Turing machines and algorithms, and this would be too large a
task for the present work.
Third, and perhaps unexpectedly, we now have a means to make sense of Colburn’s description of
software as being a concrete abstraction. A software program is abstract in the sense that instructions
are abstract, but the concretizations of the software’s programming language instructions are encoded
as part of concrete entities. Thus, by making this distinction, we do not have to commit ourselves to
software programs being at the same time both abstract and concrete.
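As a summary of this section, the taxonomy behind definitions 1–14 can be sketched as a small class hierarchy (a hypothetical Python illustration of my own; the class names follow the definitions, but the encoding itself is not part of the paper’s formal apparatus):

```python
# Subclassing encodes the is-a relations among definitions 1-6.
class Entity: ...

class MaterialEntity(Entity): ...

class Artifact(MaterialEntity):
    """Definition 1: a material entity intentionally selected to serve
    some end and recognized as such within a community."""

class RealizableEntity(Entity):
    """Definition 2: realized when its bearer participates in processes."""

class Role(RealizableEntity):
    """Definition 3: externally grounded; can be lost without physical change."""

class NormativeRole(Role):
    """Definition 4: authority to govern correct structure or behavior."""

class Disposition(RealizableEntity):
    """Definition 5: internally grounded in the bearer's physical makeup."""

class Function(Disposition):
    """Definition 6: a disposition selected by design or by nature."""

# Hardware bears functions; a software program's concretization bears a
# normative role. The two branches of the taxonomy do not overlap:
assert issubclass(NormativeRole, Role) and not issubclass(NormativeRole, Disposition)
assert issubclass(Function, Disposition) and not issubclass(Function, Role)
assert issubclass(Artifact, MaterialEntity)
```

The disjointness of the Role and Disposition branches mirrors the paper’s claim that a role can be lost without physical change while a disposition cannot.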

6. Conclusion

The goal of this paper was to investigate the ontological nature of software and hardware and to de-
termine whether there is any clear criterion for distinguishing the two. In pursuit of this, I reviewed
the positions of some notable philosophers in this debate, and I developed an ontological account of
computer hardware and software programs. In this account, hardware and software turned out to be very
different kinds of entities: A software program is a specification that consists of one or more program-
ming language instructions and whose concretization is embodied by an artifact that is designed so that a
physical machine may read the concretized instructions, whereas hardware is an artifact whose functions
are realized in processes that directly or indirectly bring about the result of some calculation. Thus, in
contrast to the positions of Moor and Suber, we have good reasons for distinguishing the two. Further-
more, this account provides us with an ontological framework for understanding Colburn’s notion of
‘concrete abstraction’ and Turner’s notion of ‘specification’.

Acknowledgements

I would like to thank the following individuals for their helpful comments: Amnon Eden, Jeff Gower,
Pierre Grenon, Mark Holliday, Werner Kuhn, William Rapaport, Barry Smith, Raymond Turner, Neil
Williams, Alex Cox, Mark Jensen, and Alan Ruttenberg. I am especially grateful for the comments
given by Nicola Guarino during the journal review of this paper. I would also like to thank the National
Center for Geographic Information and Analysis Integrative Graduate Education and Research Training
Program (NCGIA IGERT) at the University at Buffalo and the International Research Training Group
(IRTG) at the University of Muenster for providing funding while I was working on earlier versions of
this paper.

Appendix. Definitions of hardware and software

This appendix contains a list of definitions and discussions concerning the nature and/or differences
between hardware and software. Items (1)–(4) are of my own creation; (5)–(8) are taken from standard
English dictionaries; (9)–(12) are taken from dictionaries specific to computing; (13)–(14) are from the
Encyclopedia of Computer Science; and (15)–(30) are taken from textbooks.

1. Software is modifiable whereas hardware is not.
2. Software is portable whereas hardware is not.
3. Hardware consists of the various physical components that execute the instructions in software.
4. Software consists of sets of instructions that are executed by the physical components of the com-
puter.
5. Hardware: Major items of equipment or their components used for a particular purpose as: The
physical components (as electronic and electrical devices) of a vehicle (as a spacecraft) or an ap-
paratus (as a computer).20
6. Software: The entire set of programs, procedures, and related documentation associated with a
system and especially (most commonly) a computer system.21
20 Merriam-Webster’s Collegiate Dictionary, 11th edition.
21 Ibid.
30 W.D. Duncan / Ontological distinctions between hardware and software

7. Hardware: The physical components of a system or device as opposed to the procedures required
for its operation; opposed to/opposite of software.22
8. Software:
a. The programs and procedures required to enable a computer to perform a specific task, as op-
posed to the physical components of the system.
b. The body of system programs, including compilers and library routines, required for the oper-
ation of a particular computer and often provided by the manufacturer, as opposed to program
material provided by a user for a specific task.23
9. Hardware: The physical portion of a computer system, including electrical/electronic components
(e.g., devices and circuits), electromechanical components (e.g., a disk drive), and mechanical (e.g.,
cabinet) components.24
10. Software: A generic term for those components of a computer that are intangible rather than phys-
ical. It is most commonly used to refer to the programs executed by a computer system as distinct
from the physical hardware of that computer system, and to encompass both symbolic and executable
forms for such programs.25
11. Hardware: All electronic components of a computer system including peripherals, circuit boards,
and input and output devices.26 O
12. Software: A computer program; a set of instructions written in a specific language that commands
the computer to perform various operations on data contained in the program or supplied by the
user.27
13. Very early in the development of computers, people referred to the actual physical components –
the tubes and relays, the transistors and wires, the chassis – as the computer hardware. The word
software was then coined to describe the non-hardware computer, in particular the programs that
were needed to make the computers perform their intended tasks (Rosen, 2000, p. 1599).
14. Unlike hardware, software is not tangible. Software, although held in a physical medium, say on
a disk storage unit, is composed of programs and data arranged in logical, not physical structures
(Hellerman, 2000, p. 431).


15. The hardware – the central processing unit (CPU), the memory, and the input/output (I/O) devices –
provides the basic computing resources. The application programs – such as word processors,
spreadsheets, compilers, and web browsers – define the ways in which these resources are used to
solve the computing problems of the users (Silberschatz et al., 2002, p. 4).
16. Hardware consists of tangible objects – integrated circuits, printed circuit boards, cables, power
supplies, memories, card readers, line printers, and terminals – rather than abstract ideas, algo-
rithms or instructions. . . . Software, in contrast, consists of algorithms (detailed instructions telling
how to do something) and their computer representations – namely, programs (Tanenbaum, 1990,
p. 11).
17. Hardware and software are logically equivalent (Tanenbaum, 1990, p. 11).

22 Oxford English Dictionary, accessed March 29, 2011.


23 Ibid.
24 Dictionary of Computing. 4th edition. New York, Oxford: Oxford University Press. 1996.
25 Ibid.
26 Prentice Hall’s Illustrated Dictionary of Computing. Jonar C. Nader (Ed). New York: Prentice Hall. 1998.
27 Ibid.

18. Programming is now much easier. Instead of rewiring the hardware for each new program, all we
need to do is provide a new sequence of codes. Each code is, in effect, an instruction, and part
of the hardware interprets each instruction and generates control signals. To distinguish this new
method of programming, a sequence of codes or instructions is called software (Stallings, 2010,
p. 68).
19. The hardware of a computer consists of all the electronic components and electromechanical devices
that comprise the physical entity of the device. Computer software consists of the instructions
and data that the computer manipulates to perform various data-processing tasks (Mano, 1993,
p. 2).
20. Hardware is used to refer to the specifics of a machine, including the detailed logic design and the
packaging technology of the machine (Hennessy & Patterson, 2003, p. 9).
21. Programs written in machine code or assembler language are useful only for a specific type of
computer, for they describe the required computational steps in terms of the instruction set ap-
plicable to a specific machine. Conversely, the hardware structure of a machine is determined by
the instruction set which it must be able to execute. The software and hardware thus affect each

other through the instruction set and through the register configuration which the machine makes
available to the programmer (Silvester & Lowther, 1989, p. 267).
22. Hardware requires manufacture, whereas software does not (Blum, 1992, p. 28).
23. Hardware has physical models to use in evaluating design decisions, whereas software does not
(Blum, 1992, p. 29).
24. Hardware, because of its physical limitations, has a practical bound on complexity, whereas soft-
ware does not (Blum, 1992, p. 29).
25. Hardware is composed of parts that wear out, whereas software does not degrade from use (Blum,
1992, p. 29).
26. Software systems are abstract and intangible. They are not constrained by the properties of material,
governed by physical laws, or by manufacturing processes (Sommerville, 2011, p. 4).
27. Many people think that software is another word for computer programs. However, when we are
talking about software engineering, software is not just the programs themselves but also all associated
documentation and configuration data that is required to make these programs operate
correctly (Sommerville, 2011, p. 5).
28. Software is an integration of the program code with design and supporting documentations and
intermediate work products, such as original requirements, system specification, system design
and decision rationales, code configuration, test cases and results, maintenance mechanisms, and
user manuals (Wang, 2008, p. 15).
29. Software is an intellectual artifact that provides a solution for a repeatable computing application,
which enables existing tasks to be done easier, faster, and smarter, or which provides innovative
applications for the industries and everyday life (Wang, 2008, p. 15).
30. Software is a specific solution for computing in order to implement a certain architecture and
obtain a set of expected behaviors on a universal computer platform for a required application
(Wang, 2008, p. 17).
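Several of the definitions above – items (3), (4), and especially (18) – turn on the stored-program idea: a fixed physical machine whose behavior changes when it is supplied with a new sequence of instruction codes. The following sketch (my own illustration in Python; it is not drawn from any of the sources quoted above, and the opcode names are invented for the example) makes that division of labor concrete: the interpreter function stands in for the fixed "hardware", while the replaceable list of codes stands in for the "software".

```python
# Illustrative sketch only: the run() function plays the role of fixed
# "hardware" (it is never rewired), while the list of instruction codes
# passed to it plays the role of replaceable "software", as in item (18).

def run(program):
    """Interpret a sequence of (opcode, operand) instruction codes."""
    accumulator = 0
    for opcode, operand in program:
        if opcode == "LOAD":      # place a value in the accumulator
            accumulator = operand
        elif opcode == "ADD":     # add a value to the accumulator
            accumulator += operand
        elif opcode == "MUL":     # multiply the accumulator by a value
            accumulator *= operand
        else:
            raise ValueError("unknown opcode: " + opcode)
    return accumulator

# Changing what the machine does requires no change to run() itself,
# only a new sequence of codes -- the sense in which item (18) calls
# such a sequence "software".
doubler = [("LOAD", 21), ("MUL", 2)]
print(run(doubler))  # 42
```

On this toy picture, items (3) and (4) come out as the claim that `run` and `program` are different kinds of thing, while item (17), Tanenbaum's claim of logical equivalence, reflects the fact that any behavior expressible as a code sequence could instead have been wired directly into the interpreter.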
As a final note, I realize that this list is not representative of all discussions (and definitions) per-
taining to hardware and software. There are simply too many sources to permit such an undertaking.
However, these items are taken from sources concerned both with very general (e.g., the Oxford English
Dictionary) and domain specific (e.g., the Dictionary of Computing) aspects of hardware and software,
including a number of textbooks from the sub-disciplines of computer science: computer organization,

computer architecture, and software engineering. Thus, they represent a wide range of ideas concerning
the nature of hardware and software, which was the main intent of compiling this list.

References

Arp, R., Smith, B. & Spear, A.D. (2015). Building Ontologies with Basic Formal Ontology. Cambridge, MA: MIT Press.
Baker, L.R. (2007). The Metaphysics of Everyday Life: An Essay in Practical Realism. Cambridge: Cambridge University
Press.
Blum, B.I. (1992). Software Engineering, a Holistic View. New York, Oxford: Oxford University Press.
Chandor, A., Graham, J. & Williamson, R. (1970). A Dictionary of Computers. Baltimore: Penguin Books.
Colburn, T.R. (1999). Software, abstraction, and ontology. The Monist, 3–19. doi:10.5840/monist19998215.
Dipert, R.R. (1993). Artifacts, Art Works, and Agency. Philadelphia, PA: Temple University Press.
Eden, A.H. & Turner, R. (2007). Problems in the ontology of computer programs. Applied Ontology, 2(1), 13–36.
Hellerman, H. (2000). Computer system. In A. Ralston, E.D. Reilly and D. Hemmendinger (Eds.), Encyclopedia of Computer
Science (4th ed.). London: Nature Publishing.
Hennessy, J.L. & Patterson, D.A. (2003). Computer Architecture: A Quantitative Approach (3rd ed.). San Francisco: Morgan
Kaufmann.

Houkes, W. & Vermaas, P. (2010). Technical Functions: On the Use and Design of Artifacts. New York: Springer.
Irmak, N. (2012). Software is an abstract artifact. Grazer Philosophische Studien, 86(1), 55–72.
Kripke, S. (1982). Wittgenstein on Rules and Private Language. Cambridge, MA: Harvard University Press.
Lando, P., Lapujade, A., Kassel, G. & Fürst, F. (2009). An ontological investigation in the field of computer programs. In
J. Filipe, B. Shishkov, M. Helfert and L.A. Maciaszek (Eds.), Software and Data Technologies (pp. 371–383). Berlin,
Heidelberg: Springer.
Malone, J., Brown, A., Lister, A.L., Ison, J., Hull, D., Parkinson, H. & Stevens, R. (2014). The Software Ontology (SWO): A resource
for reproducibility in biomedical data analysis, curation and digital preservation. Journal of Biomedical Semantics, 5(25).
Mano, M.M. (1993). Computer System Architecture (3rd ed.). Englewood Cliffs, NJ: Prentice-Hall.
Millikan, R.G. (1999). Wings, spoons, pills, and quills: A pluralist theory of function. The Journal of Philosophy, 96(4), 191–206.
Moor, J.H. (1978). Three myths of computer science. The British Journal for the Philosophy of Science, 29(3), 213–222. doi:10.
1093/bjps/29.3.213.
Oberle, D., Grimm, S. & Staab, S. (2009). An ontology for software. In S. Staab and R. Studer (Eds.), Handbook on Ontologies
(pp. 383–402). Berlin: Springer. doi:10.1007/978-3-540-92673-3_17.
Rosen, S. (2000). Software. In A. Ralston, E.D. Reilly and D. Hemmendinger (Eds.), Encyclopedia of Computer Science (4th
ed.). London: Nature Publishing.


Silberschatz, A., Galvin, P.B. & Gagne, G. (2002). Operating System Concepts (6th ed.). New York: Wiley.
Silvester, P.P. & Lowther, D.A. (1989). Computer Engineering: Circuits, Programs, and Data. New York, Oxford: Oxford
University Press.
Smith, B., Almeida, M., Bona, J., Brochhausen, M., Ceusters, W. & Courtot, M., et al. (2013). Basic Formal Ontology 2.0.
Retrieved from http://purl.obolibrary.org/obo/bfo/Reference.
Sommerville, I. (2011). Software Engineering (9th ed.). Boston: Addison-Wesley.
Stallings, W. (2010). Computer Organization and Architecture: Designing for Performance (8th ed.). Upper Saddle River, NJ:
Prentice-Hall.
Suber, P. (1988). What is software? The Journal of Speculative Philosophy, 2(2), 89–119.
Tanenbaum, A.S. (1990). Structured Computer Organization (3rd ed.). Englewood Cliffs, NJ: Prentice-Hall.
Tummala, R.R. (2001). Fundamentals of Microsystems Packaging. New York: McGraw-Hill.
Turner, R. (2010). Programming languages as mathematical theories. In J. Vallverdú (Ed.), Thinking Machines and the Philoso-
phy of Computer Science: Concepts and Principles (pp. 66–82). Hershey, PA: Information Science Reference. doi:10.4018/
978-1-61692-014-2.ch005.
Turner, R. (2011). Specification. Minds and Machines, 21(2), 135–152. doi:10.1007/s11023-011-9239-x.
Wang, X., Guarino, N., Guizzardi, G. & Mylopoulos, J. (2014). Towards an ontology of software: A requirements engineering
perspective. Paper presented at the 8th International Conference on Formal Ontology in Information Systems, Rio de
Janeiro, Brazil.
Wang, Y. (2008). Software Engineering Perspectives: A Software Science Perspective. Boca Raton, FL: Auerbach Publications.
