Moral Robots
Making sense of robot ethics
More fun with finite state machines
February 14, 2019 · andy · AI and Society, Coder's corner, Feature, Philosophy-AI-Book
This is the second part of a post on finite state machines programmed in Prolog.
The first part is here. A short intro to Prolog is here. Have fun!
We begin with a very simple FSM that, like the first example in the previous post,
describes only the eating behaviour of the dog. But now we will do it in actual
code. We will use Prolog, because it is relatively easy to read, even for
non-programmers, and its expressiveness makes implementing finite state machines
particularly easy.
Our first FSM will not have any entry or exit actions, just two states and their
transitions. For a basic introduction to Prolog, see the post here.
Swish is a fully working Prolog environment that you can use directly in your
browser, without installing anything or creating an account. Just go to that page
and copy the code below into the window on the left. Any text between /* and */ is
a comment for the reader; it is entirely ignored by the Prolog interpreter.
/* The two states and their printable names. */
state( hungry, 'hungry' ).
state( full, 'full' ).

/* transition( FromState, ToState, Condition ). */
transition( hungry, full, eat ).
transition( full, hungry, wait ).

next( State ) :-
    state( State, Name ),
    print( Name ),
    read( Cond ),
    transition( State, Next, Cond ),
    next( Next ).

next( State ) :-
    /* This is called if the transition fails. Don't do anything. */
    print("No transition for that verb."), nl,
    next( State ).

start( Startstate ) :-
    next( Startstate ).
Let’s now look at this program in a little more detail.
The first four facts describe the states (hungry, full) and their transitions.
Every transition fact takes three arguments: the state it transitions from, the
state it transitions to, and the transition condition that triggers the
transition.
So:
next( State ) :-
state( State, Name ),
print( Name ),
read( Cond ),
transition( State, Next, Cond ),
next( Next ).
This is the main predicate that drives the state changes. For each state ‘State’
(remember, in Prolog capitalised words are variables!) this calculates the next
state of the state machine. The next state for state ‘State’ is computed thus:
Look up the state ‘State’ and bind its name to the variable ‘Name.’
Print the name of the current state ‘State’ so that the user can see where we are.
Read the user’s input from the keyboard. What the user enters will be the
transition condition out of the present state ‘State.’ So the user is expected to
type in something like ‘eat’ or ‘wait,’ which are our transition conditions.
Then find a suitable transition. Prolog will try to match the ‘transition’ goal
against the transitions we defined at the beginning of our program. If it can find
a transition from State to Next via the condition Cond, it will bind the next
state to the variable Next (which up to now was unbound, so it does not yet have a
value). If a transition matches the condition Cond the user entered, Next now
contains the new state.
In the last step we go to that new state by calling the next() predicate again,
but now with Next in place of State.
If there is no transition for the condition the user entered, the program fails at
this point and falls through to the second ‘next()’ clause in the code above,
which does nothing except output a warning message. So we stay in the same state
and try again.
Use this program by querying:
?- start( hungry ).
and then try to enter a few transition conditions when prompted, both valid ones
(eat, wait) and invalid ones (bark). You should now understand how the system
behaves.
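If you are more at home in an imperative language, the same machine can be sketched in Python. This is a translation, not the post's own code: the interactive read is replaced by a list of inputs so the example is self-contained, while the state and condition names mirror the Prolog version.

```python
# Transition table: (state, condition) -> next state.
# Mirrors transition(hungry, full, eat). and transition(full, hungry, wait).
TRANSITIONS = {
    ("hungry", "eat"): "full",
    ("full", "wait"): "hungry",
}

def step(state, cond):
    """Return the next state, or stay in the same state if no
    transition matches (the analogue of the second next() clause)."""
    if (state, cond) in TRANSITIONS:
        return TRANSITIONS[(state, cond)]
    print("No transition for that verb.")
    return state

def run(start, conds):
    """Feed a sequence of transition conditions through the machine."""
    state = start
    for cond in conds:
        state = step(state, cond)
    return state

print(run("hungry", ["eat", "bark", "wait"]))
```

As in the Prolog version, an unknown condition such as 'bark' prints a warning and leaves the state unchanged.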
/* States now carry an entry action as a third argument.
   The printable texts here are illustrative. */
state( hungry, 'hungry', 'The dog whines for food.' ).
state( full, 'full', 'The dog dozes contentedly.' ).

transition( hungry, full, eat ).
transition( full, hungry, wait ).

next( State ) :-
    state( State, Name, Action ),
    print( Name ), nl,
    print( Action ), nl,
    read( Cond ),
    transition( State, Next, Cond ),
    next( Next ).

next( State ) :-
    /* This is called if the transition fails. Don't do anything. */
    print("No transition for that verb."), nl,
    next( State ).

start( Startstate ) :-
    print("Toy dog simulator."), nl,
    next( Startstate ).
Use by querying:
?- start( hungry ).
One big difference is that now we also have entry actions: every state fact
carries a third argument, the action text that is printed each time the state is
entered.

A final point: the underscore “_” character in Prolog will match anything. A
fall-back transition into a ‘barking’ state can therefore be written like this
(the second line uses the underscore in the condition position):

transition( _, barking, bark ).
transition( _, barking, _ ).

After adding the second line, we don’t need the first one any more, because the
underscore for the transition condition also matches ‘bark’; the first of these
two lines is now redundant. Equally, the error case (the second next() predicate)
is never reached, because some transition will always succeed. Whatever the user
enters will trigger a transition instead of causing an error: any input that is
not understood simply leads to the fall-back state of barking.
Essentially, an FSM expresses a big and convoluted set of conditions. If the state
is S1 and condition C1 is fulfilled, then perform the entry action A2 of state S2.
If, instead, the state is S2 and condition C2 holds, perform A3. And so on. There
is no technical reason not to use one big conditional statement instead of an FSM.
But there is a good reason, and it has to do with how human programmers keep track
of all the possible conditions and actions.
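To make this concrete, here is a small Python sketch (not from the original post) writing the two-state dog both ways: as an explicit conditional and as a transition table. The behaviour is identical; only the organisation differs.

```python
def next_state_conditional(state, cond):
    # The "big conditional": every (state, condition) pair spelled out by hand.
    if state == "hungry" and cond == "eat":
        return "full"
    elif state == "full" and cond == "wait":
        return "hungry"
    else:
        return state  # no matching transition: stay put

# The FSM version: the same logic as data, one table entry per transition.
TABLE = {("hungry", "eat"): "full", ("full", "wait"): "hungry"}

def next_state_table(state, cond):
    return TABLE.get((state, cond), state)

# The two formulations agree on every input:
for s in ("hungry", "full"):
    for c in ("eat", "wait", "bark"):
        assert next_state_conditional(s, c) == next_state_table(s, c)
```

With two states the conditional is harmless; with dozens of states and conditions, the table stays readable while the conditional does not.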
Finite state machines are an organisational principle that we use to analyse and
describe complex behaviour in a way that reduces its apparent complexity by
dividing it into distinct “groups of behaviour”, the states, that can be treated
individually. In this way, we can work through implementing the complex behaviour
in software state by state, without being overwhelmed by the overall complexity of
the complete system.
What does all this mean for the programmer? As long as your simulation only reacts
to individual events and does not need to behave in different ways depending on the
history of the system and its context, a state machine will not provide any
advantage. But if you want to model different responses to the same input,
depending on the context of the interaction, then finite state machines can be very
helpful.
I hope you had fun playing with this program. You can surely find countless ways
to extend this little example and give our dog even more interesting behaviours.
One could, for example, try to model the dog’s mental states of trust in or fear
of the user, its reaction to cats and other environmental factors, or its
food-searching behaviour.