Full download Brain and Nature-inspired Learning, Computation and Recognition 1st Edition Licheng Jiao file pdf all chapter on 2024

You might also like

Download as pdf or txt
Download as pdf or txt
You are on page 1of 44

Brain and Nature-inspired Learning,

Computation and Recognition 1st


Edition Licheng Jiao
Visit to download the full and correct content document:
https://ebookmass.com/product/brain-and-nature-inspired-learning-computation-and-r
ecognition-1st-edition-licheng-jiao/
More products digital (pdf, epub, mobi) instant
download maybe you interests ...

The Nature of Physical Computation Oron Shagrir

https://ebookmass.com/product/the-nature-of-physical-computation-
oron-shagrir/

Medical image recognition, segmentation and parsing :


machine learning and multiple object approaches 1st
Edition Zhou

https://ebookmass.com/product/medical-image-recognition-
segmentation-and-parsing-machine-learning-and-multiple-object-
approaches-1st-edition-zhou/

Advances in Additive Manufacturing: Artificial


Intelligence, Nature-Inspired, and Biomanufacturing
Ajay Kumar

https://ebookmass.com/product/advances-in-additive-manufacturing-
artificial-intelligence-nature-inspired-and-biomanufacturing-
ajay-kumar/

Brain Plasticity and Learning: Implications for


Educational Practice 1st Edition Jennifer Anne Hawkins

https://ebookmass.com/product/brain-plasticity-and-learning-
implications-for-educational-practice-1st-edition-jennifer-anne-
hawkins/
Supramolecular Gels 1st Edition Tifeng Jiao

https://ebookmass.com/product/supramolecular-gels-1st-edition-
tifeng-jiao/

Computation in BioInformatics S. Balamurugan

https://ebookmass.com/product/computation-in-bioinformatics-s-
balamurugan/

Nature-Inspired Computing Paradigms in Systems:


Reliability, Availability, Maintainability, Safety and
Cost (RAMS+C) and Prognostics and Health Management
(PHM) 1st Edition Mohamed Arezki Mellal And Michael G.
Pecht
https://ebookmass.com/product/nature-inspired-computing-
paradigms-in-systems-reliability-availability-maintainability-
safety-and-cost-ramsc-and-prognostics-and-health-management-
phm-1st-edition-mohamed-arezki-mellal-and-m/

Portfolio Optimization with Different Information Flow


1st Edition Edition Hillairet Caroline And Jiao Ying
(Auth.)

https://ebookmass.com/product/portfolio-optimization-with-
different-information-flow-1st-edition-edition-hillairet-
caroline-and-jiao-ying-auth/

Nature Swapped and Nature Lost: Biodiversity


Offsetting, Urbanization and Social Justice 1st ed.
Edition Elia Apostolopoulou

https://ebookmass.com/product/nature-swapped-and-nature-lost-
biodiversity-offsetting-urbanization-and-social-justice-1st-ed-
edition-elia-apostolopoulou/
Brain and Nature-Inspired
Learning, Computation and
Recognition
Licheng Jiao
Xidian University, Xi’an, China
Ronghua Shang
Xidian University, Xi’an, China
Fang Liu
Xidian University, Xi’an, China
Weitong Zhang
Xidian University, Xi’an, China
Elsevier
Radarweg 29, PO Box 211, 1000 AE Amsterdam, Netherlands
The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States
Copyright © 2020 Tsinghua University Press. Published by Elsevier Inc. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or
mechanical, including photocopying, recording, or any information storage and retrieval system, without
permission in writing from the publisher. Details on how to seek permission, further information about the
Publisher’s permissions policies and our arrangements with organizations such as the Copyright Clearance
Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.
This book and the individual contributions contained in it are protected under copyright by the Publisher
(other than as may be noted herein).
Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden
our understanding, changes in research methods, professional practices, or medical treatment may become
necessary.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using
any information, methods, compounds, or experiments described herein. In using such information or
methods they should be mindful of their own safety and the safety of others, including parties for whom they
have a professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any
liability for any injury and/or damage to persons or property as a matter of products liability, negligence or
otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the
material herein.

Library of Congress Cataloging-in-Publication Data


A catalog record for this book is available from the Library of Congress
British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library
ISBN: 978-0-12-819795-0

For information on all Elsevier publications visit our website at


https://www.elsevier.com/books-and-journals

Publisher: Matthew Deans


Acquisition Editor: Glyn Jones
Editorial Project Manager: Naomi Robertson
Production Project Manager: Sruthi Satheesh
Cover Designer: Greg Harris
Typeset by TNQ Technologies
CHAPTER 1

Introduction
Chapter Outline
1.1 A brief introduction to the neural network 1
1.1.1 The development of neural networks 2
1.1.2 Neuron and feedforward neural network 3
1.1.3 Backpropagation algorithm 9
1.1.4 The learning paradigm of neural networks 11
1.2 Natural inspired computation 12
1.2.1 Fundamentals of nature-inspired computation 12
1.2.2 Evolutionary algorithm 12
1.2.3 Artificial immune system (AIS) 15
1.2.4 Other methods 16
1.3 Machine learning 18
1.3.1 Development of machine learning 18
1.3.2 Dimensionality reduction 20
1.3.3 Sparseness and low-rank 20
1.3.4 Semisupervised learning 22
1.4 Compressive sensing learning 24
1.4.1 The development of compressive sensing 24
1.4.2 Sparse representation 25
1.4.3 Compressive observation 26
1.4.4 Sparse reconstruction 26
1.5 Applications 27
1.5.1 Community detection 27
1.5.2 Capacitated arc routing optimization 29
1.5.3 Synthetic aperture radar image processing 32
1.5.4 Hyperspectral image processing 36
References 39

1.1 A brief introduction to the neural network


Over the years, scientists have been exploring the secrets of the human brain from various
perspectives, such as medicine, biology, physiology, philosophy, computer science,
cognition, and organization synergetics, hoping to make artificial neurons that simulate the
human brain. In the process of research, in recent years, a new multidisciplinary

Brain and Nature-Inspired Learning, Computation and Recognition. https://doi.org/10.1016/B978-0-12-819795-0.00001-3


Copyright © 2020 Tsinghua University Press.
Published by Elsevier Inc. All rights reserved. 1
2 Chapter 1

cross-technology field has been formed, called “artificial neural network” The research
into neural networks involves a wide range of disciplines, which combine, infiltrate, and
promote each other.
Artificial neural network (ANN) is an adaptive nonlinear dynamic system composed of a
large number of simple basic elementsdneurons. The structure and function of each
neuron are relatively simple, but the system behavior produced by a large number of
neuron combinations is very complex. The basic structure of an artificial neural network
mimics the human brain, and reflects some basic characteristics of human brain function.
It can adapt itself to the environment, summarize rules, and complete some operations,
recognition, or process control. Artificial neural networks have the characteristics of
parallel processing, which can greatly improve work speed.

1.1.1 The development of neural networks

The development of artificial neural networks has gone through three climaxes: control
theory from the 1940 to 1960s [1-3], connectionism from the 1980s to the mid-1990s
[4, 5], and deep learning since 2006 [6, 7].
In 1943, Warren McCulloch and Walter Pitts based on a mathematical algorithm
threshold logic algorithm created a neural network model [8]. This linear model
identifies two different types of inputs by testing whether the response output is positive
or negative. The study of neural networks is divided into the study of biological
processes in the brain and the study of artificial intelligence (artificial neural networks).
In 1949, Hebb published Organization of Behavior, and put forward the famous “Hebb
theory” [2] Hebb theory mainly argues that when the axons of neuron A are close to
neuron B and neuron A participates in the repeated and sustained excitement of neuron
B, both the neurons or one of them will change the process of growth or metabolism,
which can enhance the effectiveness of neuron A stimulating neuron B [9]. Hebb theory
was confirmed by Nobel Prize winner Kendall and his animal experiments in 2000 [10].
The later unsupervised machine learning algorithms are the variants of Hebb theory more
or less. In 1958, Frank Rosenblatt simulated a neural network model called the
“perceptron” which was invented on an IBM-704 computer [11]. This model can perform
some simple visual processing tasks. Rosenblatt believed that the perceptron would
eventually be able to learn, make decisions, and translate languages. In 1959, another
two American engineers, Widrow and Hoff [12], put forward the adaptive linear element
(Adaline). This was a change from the perceptron and one of the progenitor models of
machine learning. The main difference between it and perceptron was that the Adaline
neuron has a linear activation function, which allows the output to be of any value. In
Introduction 3

1969, Marvin Minsky and Seymour Papert found two major defects in the neural
network: first, the basic perceptron could not handle XOR problems [13]. Second, the
computing power of the computer was not sufficient to deal with the large neural
network. The study of neural networks was stagnant. In 1974, Paul Werbos proposed that
the multilayer perceptron be trained by a “back propagation algorithm” to overcome the
defects that resulted in the single-layer perceptron being unable to deal with an XOR
problem [14]. However, because neural network research was at a low level at that time,
this method did not attract much attention.
The neural network idea began to revived in the 1980s. In 1982, Hopfield [15] proposed
a novel neural network called the “Hopfield network.” The Hopfield neural network was
a kind of recurrent neural network, which combines a storage system and a binary
system. It introduced the concept of energy function for the first time so that the
equilibrium state of the neural network had a clear criterion method. But due to the
limitations of computing, for the rest of the 20th century, the popularity of support vector
machines and other simpler algorithms, such as linear classifiers, gradually exceeded the
neural network.
In 1998, LeCun proposed convolutional neural networks, called LeNet-5, which were
updated by back propagation and this method achieved a good result in a handwritten
digits database [16]. In the early 21st century, the computing power of the computer was
greatly improved with the help of GPU and distributed computing. The neural network has
since gained great attention and development. In 2006, Geoffrey Hinton [17] effectively
trained a deep belief network with greedy layer-wise pretraining. This technique was then
extended to many different neural networks by researchers, greatly improving the
generalization effect of the model on the test set. In 2012, Hinton’s group won Image Net
2012 [18]. Their image classification accuracy rate was far more than the second place.
The deep neural network algorithm has a great advantage over the traditional algorithm in
some areas.
In 2016, Alpha Go [19], an artificial intelligent software which was developed by Google
Deep Mind, beat the human top professional chess player. Its principle was to use a
Monte Carlo tree search method combined with two different deep neural networks. The
emergence of Alpha Go once again pushed the development of neural networks to the
peak.

1.1.2 Neuron and feedforward neural network

A neuron is a biological model based on the nerve cells of a biological nervous system. In
the study of biological nervous systems, the biological mechanism of the neuron can be
4 Chapter 1

represented by mathematics and a computational model based on the neuron obtained. The
neurons contain three parts: cell body, dendrites, and axons. The cell body is complexly
formed by many molecules. It is the energy supply area of neuronal activity, where
metabolic activities such as metabolism are carried out. Dendrites are the entry to receive
information from other neurons. Axons are the outlets for stimulating neurons to transmit
information. The synapse is the structure that enables communication between one neuron
and another and transmits information between them.
Neural networks are described on the basis of the mathematical model of neurons. The
model is represented by network topology, node characteristics, and learning rules. The
main advantages of neural networks are as follows:
(1) Parallel distribution processing
(2) High robustness and fault tolerance
(3) Distributed storage and learning ability
(4) The ability to fully approach the complex nonlinear relationship.
According to the characteristics of the neurons and the biological function, it is known
that the neuron is a processing unit of information with multiple inputs and a single
output. The processing of information is nonlinear, and we abstract it into a simple
mathematical model, as shown in Fig. 1.1.
The specific mathematical formulas are as follows:
8 Xm
<v ¼ xi wi þ b
(1.1)
: i¼1
y ¼ 4ðvÞ

Weights
Inputs
x w biases
b
Activation
function

x w
∑ ϕ y
. .
. output
. sum
. .

xm wm

Figure 1.1
The mathematical model of neurons.
Introduction 5

Typical activation functions include sigmoid function, tanh function, radial basis
function, wavelet function, ReLU function, softplus function, etc. The corresponding
formula is (1.2).
8
>
> 1
>
> sigmoidðxÞ ¼
>
> 1 þ ex
>
>
>
>
< ex  ex
tanhðxÞ ¼ x
> e þ ex (1.2)
>
>
>
> ReLUðxÞ ¼ maxð0; xÞ
>
>
>
>
>
: softplusðxÞ ¼ logð1 þ ex Þ

Neuroscientists have found that neurons have the characteristics of unilateral inhibition,
wide excited boundary, sparse activation, and so on. Compared with other activation
functions, the rectified linear unit function (ReLU) has biological interpretability. In
addition, the derivative of the softplus function is the logistics function. This is the smooth
form of a rectified linear unit. Although it also has unilateral inhibition and wide
excitability boundary characteristics, it has no characteristics of sparse activation.
Based on mathematical neuron model, neural networks can be divided into forward
networks (directed acyclic) and feedback networks (undirected complete graph, also called
cyclic network) according to the topology of the network connection. For the feedback
network, the stability of the network model is closely related to the associative memory.
The Hopfield network and the Boltzmann machine are of this type. The forward network
can be realized by the multiple compound of simple nonlinear function. The network
structure is very simple. The following is an introduction to the forward neural network.
Its network structure is shown in Fig. 1.2.
The corresponding mathematical formula is (1.3).

Hidden
Input
Output

Figure 1.2
Feedforward neural network with a single hidden layer.
6 Chapter 1
8 !
> Xm
ð1Þ
>
>
ð1Þ
h ¼4 ð1Þ
xi ,wi þ b ð1Þ
>
>
< i¼1
0 1 (1.3)
>
> Xn
>
> ð1Þ ð2Þ
: y¼4 @ hj ,wj þ bð2Þ A
ð2Þ
>
j¼1

where the input x˛ℝm, the hidden layer h˛ℝn, and the output y˛ℝK. w(1)˛ℝmn and
b(1)˛ℝn are the weight connection matrix and bias from the input layer to the hidden layer,
respectively. w(2)˛ℝnK and b(2)˛ℝK are the weight connection matrix and bias from the
hidden layer to the output layer. 4(1) and 4(2)are the activation function. In practical
applications, the training data set is assumed to be
8n oN
>
> x ðnÞ ðnÞ
; y
< n¼1
ðnÞ (1.4)
>
> x ˛ ℝ m
: ðnÞ
y ˛ ℝK
The model between the input and output is Formula (1.5).
0 ! 1
X
n Xm
ð1Þ ð2Þ
y ¼ Tðx; qÞ ¼ 4ð2Þ @ 4ð1Þ xi , wi þ bð1Þ wj þ bð2Þ A (1.5)
j¼1 i¼1

The parameter q¼(w(1),b(1);w(2),b(2)) is further optimized for the target (the loss term and
the regular term composition).
N   2 2  
1 X  ðnÞ  X  ðlÞ 2
min LðqÞ ¼ y  T xðnÞ ; q  þ l w  (1.6)
q N n¼1 F F
l¼1

The gradient descent method is used to solve the parameter q.


8 k
< q ¼ q a,Vqjq¼qk1
k1
>


> Vqj k1 ¼ vLðqÞ
(1.7)
: q¼q vq q¼qk1

With the increase in the iteration number k, the parameters will converge (indirectly
through the target function L(qk) to visualize the observation).
limqk ¼ q (1.8)
k/N

The reason for convergence is that the above objective function is convex. To optimize the
objective function, it can be directly solved by the closed form solution. However, when
the amount of data is large, storage and reading will be very time-consuming. Therefore, it
Introduction 7

is usually solved by random gradient descent (with batch processing). For the neural
networks with determined topology structure, Hornik et al. [20e22] proved that if the
output layer adopts linear activation function and the hidden layer adopts sigmoid
function, the single hidden layer neural network can approximate any rational function
with any accuracy.
When the number of layers of the network is more than one, it is called a multi hidden
layer feedforward neural network, or a deep feedforward neural network. Its structure is
shown in Fig. 1.3.
The topology of the deep feedforward neural network is the multi hidden layer, the full
connection, and the directed acyclic. Using the following notation, the model between the
input and output of the network is given.
The input layer is x˛ℝm, the output layer is y˛ℝs, and the output of the hidden layer is
written as
8 !
X
nl1
>
> ðlÞ
h ¼4 ðlÞ ðl1Þ ðlÞ
hi wi þ b ðlÞ
>
>
>
< i¼1

> l ¼ 1; 2; .L (1.9)
>
>
> hð0Þ ¼ x
>
:
hðLÞ ¼ y
Removing the input layer h(0)and the output layer h(L), the number of hidden layers is L1,
and the corresponding hyperparameters (the number of layers, the number of hidden units,
the activation function) are represented as:
8
>
> L þ 1/the number of layers
>
<
½n0 ; n1 ; n2 ; .; nL1 ; nL /dimensions of each layer (1.10)
> h i
>
> ð1Þ ð2Þ ðL1Þ ðLÞ
: 4 ; 4 ; .; 4 ;4 /activation function

where n0 ¼ m and nL ¼ s. The parameters to be learned are represented as

Hidden Layers
Inputs
Outputs

Figure 1.3
Feedforward neural network with multi hidden layers.
8 Chapter 1
8
> q ¼ ðq1 ; q2 ; .; qL Þ
<  
ql ¼ wðlÞ ˛ ℝnl1 nl ; bðlÞ ˛ ℝnl (1.11)
>
:
l ¼ 1; 2; .; L
The relationship between input and output is represented as
!
XnL  
ðL1Þ ðLÞ
y ¼ hðLÞ ¼ 4ðLÞ hiL ,wiL þ bðLÞ /writteng 4ðLÞ hðL1Þ ; qL
iL ¼1
! !
X
nL X
nL1
ðL2Þ ðL1Þ ðLÞ
ðLÞ ðL1Þ ðL1Þ ðLÞ
¼4 4 hiL1 ,wiL1 þb wiL þb
iL ¼1 iL1 ¼1
    (1.12)
ðLÞ ðL1Þ ðL2Þ
/written 4 4 h ; qL1 ; qL

¼/
   
¼ 4ðLÞ 4ðL1Þ /4ð1Þ ðx; q1 Þ/; qL1 ; qL /written f ðx; qÞ

In practical applications, the training data set is assumed to be


8n oN
>
> x ðnÞ ðnÞ
; y
< n¼1
(1.13)
>
> xðnÞ ˛ ℝm
: ðnÞ
y ˛ ℝs
The optimized objective function (the loss term and the regular term) is as follows:
min JðqÞ ¼ LðqÞ þ lRðqÞ (1.14)
q

where b
y n ¼ f ðxn ; qÞ and
8
>    2
>
>
> y n ¼ yn  b
l yn ; b y n F
>
>
>
>
>
> XN  
< LðqÞ ¼ 1 l yn ; b
yn
N n¼1 (1.15)
>
>
>
>
>
> X L XL  
>
> 2  ðLÞ 2
>
> RðqÞ ¼ kql k ¼ w 
: l¼1
F
l¼1
F

There are many forms of loss function l(,): energy function, cross entropy loss, and the
regularization term R(,) includes Frobenius norm (preventing overfitting), and sparse
regularization (simulating biological response characteristics).
Introduction 9

1.1.3 Backpropagation algorithm

In order to optimize the objective function Formula (1.14), first, we must determine
the convexity and nonconvexity of the function (Fig. 1.4). If the feasible region is
the convex set, the convex function defined on the convex set is convex
optimization. And the obtained solution does not depend on the selection of initial
value and is the global optimal solution. Usually, the optimization objective function
of a deep feedforward neural network is nonconvex, therefore the solution of
parameters depends on the setting of the initial parameters (there are many saddle
points and local extreme points in the feasible region). If the setup is reasonable,
you can avoid falling into the local optimal. In order to illustrate the
backpropagation algorithm (based on the gradient descent method), the following
method is described to update the parameters.
8
> ðkÞ ðk1Þ
<q ¼ q a,Vqjq¼qðkÞ
(1.16)
: Vqj ðkÞ ¼ vLðqÞ þ l vRðqÞ
>
q¼q vq vq
where a is the learning rate, and the specific parameters on each layer are updated as
8
>
>
ðkÞ
q ¼ ql
ðk1Þ
a,Vql jq ¼qðk1Þ
>
< l l l
 
> 
vLðqÞ vRðqÞ (1.17)
>
> Vq j ðk1Þ ¼ þ l
: vql ql ¼qðk1Þ vql ql ¼qðk1Þ
l ql ¼ql
l l

Backward

Hidden Layers
Inputs
Outputs

Forward
Figure 1.4
An illustration of the backpropagation algorithm.
10 Chapter 1
ðkÞ
where ql is the value to be updated for the lth layer in the kth iteration. And the error
propagation term is introduced for the solution of the gradient descent. According to the
chain rule, it is expanded to:

vLðqÞ vhðlÞ vhðlþ1Þ vhðLÞ vLðqÞ


¼ $ . $ (1.18)
vql vql vhðlÞ vhðL1Þ vhðLÞ
The error propagation term is written as:
vLðqÞ
dðlÞ ¼ (1.19)
vhðlÞ
With the further use of ql ¼ (w(l),b(l)), the corresponding derivatives of parameters for the
hidden layer output are represented as:
8   
> ðlÞ l1 T ðlÞ ðlÞ  ’
>
> vh ðlÞ v4 h $w þ b
>
> ¼ ¼ h l1
4ðlÞ
< ðlÞ ðlÞ
$
vw vw
   (1.20)
>
> ðlÞ l1 T ðlÞ ðlÞ
>
> vhðlÞ v4 h $w þ b  
>
: ¼ ¼ 1$ 4 ðlÞ

vbðlÞ vbðlÞ
where “$” is the Hadamard product. Formula (1.17) is the derivatives of the parameters on
the loss term, and the derivatives of the regular term are:
!
vRðqÞ v X L
vkql k2F
¼ kql k2F ¼ (1.21)
vql vql l¼1 vql

Usually, the constraints in the regular term are only for the weight matrix, and the bias is
not regular, so there are:
8  
 ðlÞ 2
>
> vRðqÞ vw 
>
>
>
< ðlÞ
¼ ðlÞ
F
¼ 2wðlÞ
vw vw
 2 (1.22)
>
>  
>
> ðlÞ
vw 
>
: vRðqÞ ¼
ðlÞ ðlÞ
F
¼0
vb vb
The process of optimizing the parameter ql of the lth hidden layer is mainly determined by
the gradient (first derivative) of loss item L(q) and regular item R(q) to the parameter ql.
The error propagating is realized by introducing the error propagation term [Formula
(1.19)]. Training of the feedforward neural network is divided into two steps. The first is
to calculate the output value of each layer in the forward propagation process according to
the current parameter value. The second is to backpropagate the error item of each layer
according to the difference between the actual output and the expected output.
Introduction 11

The partial derivatives of each layer’s output are combined to update the parameters.
Repeat the two steps until the network converges. When the network’s layer is deep, the
gradient error of parameters on each layer will gradually decrease from the output to the
input. (When it is closer to the output, the decline is greater. When it is closer to the input,
the decline is smaller and may be zero.). This makes the whole network difficult to obtain
better parameters by training. This phenomenon rejects global minima and saddle points of
the feasible region and makes object function tend to fall into the local optimum. This is
the vanishing gradient problem.

1.1.4 The learning paradigm of neural networks

The basic neural network still uses the paradigm of machine learning, that is, data, model,
optimization, and solving four parts. Machine learning emphasizes learning data features
based on the prior (including extracting and screening feature to get the discriminable
feature) and the classifier design. But model expression ability is limited by the
characteristics of learning. The advantages are that it can quickly optimize the objective
function by using a convex optimization algorithm or software. Its core is the pursuit of
speed and precision.
Compared with machine learning, a deep neural network reduces the dependence on priori
data. The representation ability of the model is increasingly deep and essential with the
deepening of layers.
(1) In the training stage, the labeled data are scarce and there are more parameters of the
model to be trained. This will lead to insufficient training or overfitting.
(2) The optimization objective is a nonconvex optimization problem. It depends on the se-
lection of the initial value. Choosing the proper initial value can avoid prematurely
falling into the local optimum and the obtained solution is close to the optimal solu-
tion. If the selection is not good, the network is prone to underfitting.
(3) When the backpropagation algorithm is used, the phenomenon of a vanishing gradient
problem can easily occur, which leads to inadequate training of the network model.
The difference in data is crucial to the deep neural network. For classification tasks,
stronger aggregation represents that the data belonging to the same class have greater
similarity. The common features are the main part, and the individual characteristics are
supplemented. The large sparsity between classes indicates that there is greater difference
between classes. That is, personalization is the main feature, and the common features are
supplemented. Using a deep neural network for feature learning, the multilevel
combination of hierarchical parameters will give the weight parameter a discriminable
characteristic. It emphasizes commonality in the class and pays attention to individuality
among the classes. The most satisfying model under the combination of parameters also
12 Chapter 1

indirectly indicates that the two factors mentioned above are contradictory and unified. In
essence, a deep neural network represents data in a hierarchical method. An advanced
representation is based on low-level representation. A complex problem is divided into a
series of nested and simple representation learning problems. For example, the first hidden
layer identifies the edge from some pixels and their adjacent pixels’ value in the image.
The second hidden layers integrate the edges to identify the outlines and corners. The third
hidden layers extract specific outlines and corners as abstract high-level semantic features.
Finally, a linear classifier is used to identify the target in the image.

1.2 Natural inspired computation


1.2.1 Fundamentals of nature-inspired computation

Bio-intelligence is a very important source of theoretical inspiration in artificial


intelligence research. From the perspective of information processing, the organism is an
excellent information processor, and its ability to solve problems through its own evolution
is also dwarfing the current best computer. In recent years, artificial intelligence
researchers have become accustomed to referring to the intelligent algorithms developed
by inspiration from natural phenomena as nature-inspired computation (NIC). Based on
the functions, characteristics, and mechanisms of organisms in nature, it studies the
abundant processing mechanisms contained in it, constructs corresponding computational
models, and designs corresponding algorithms and applies them to various fields. Natural
computing is not only a new hotspot in artificial intelligence research, but also a new way
of thinking for the development of artificial intelligence, and a new result of the
transformation of methodology. The research results include artificial neural networks,
evolutionary algorithms, artificial immune systems, fuzzy logic, quantum computing, and
complex adaptive systems, etc. Natural computing can solve many complex problems
which are difficult to solve by traditional computing methods. It has a good application
prospect in the fields of solving large-scale complex optimization problems, intelligent
control, and computer network security. This section focuses on evolutionary algorithms
and artificial immune systems.

1.2.2 Evolutionary algorithm

Evolutionary computation is a kind of adaptive artificial intelligence technique that


simulates the process and mechanism of the biological evolution to solve problems. The
core idea comes from a basic understanding that the process of evolution from simple to
complex and low level to high level is a natural, parallel, and robust optimization process.
The goal of this process is to achieve the purpose of optimization through the adaptability
of the environment, the “survival of the fittest” and genetic variation of the biological
population.
Introduction 13

Evolutionary algorithm (EA) is a kind of random search technology based on the above
ideas. They simulate the learning process of a group of individuals, each of which
represents a point in a given problem search space. The evolutionary algorithm starts from
the selected initial solution and gradually improves the current solution through an
iterative evolutionary process until the best solution or a satisfactory solution position is
found. In the course of evolution, the algorithm uses a method similar to natural selection
and sexual reproduction in a set of solutions to generate the next-generation solutions with
better performance indicators on the basis of the inherited superior genes.
The general steps for solving an optimization problem using an evolutionary algorithm
are:
(1) Give a set of initial solutions randomly;
(2) Evaluate the performance of the current set of solutions;
(3) If the current solution satisfies the requirements or the evolution process reaches a
certain algebra, the calculation will be terminated;
(4) According to the evaluation result of (2), select a certain number of solutions from the
current solutions as the objects of genetic operations;
(5) Perform genetic operations on the selected solutions, such as crossover, mutation, etc.,
to get a new set of solutions. Then go to (2).
The commonly used search methods fall into three categories: enumeration, analytics, and
randomization. Enumeration refers to enumerating all feasible solutions within a set of
feasible solutions in order to find the optimal solution. For a continuous function, it needs
to be discretized. However, many practical problems correspond to a large search space, so
the solution to this method is very inefficient. The analytical method mainly uses the
properties of the objective function in the solution process, such as the first derivative, the
second derivative, and so on. This method can be divided into two kinds of methods:
direct and indirect. The direct method determines the next search direction based on the
gradient of the objective function, so it is difficult to find the global optimal solution,
while the indirect method derives a set of equations from the necessary conditions of
extreme values, and then solves the system of equations. However, the derived equations
are generally nonlinear and their solution is very difficult. The random method introduces
random changes to the search direction during the search process, making the algorithm
jump out of the local extreme point with a greater probability during the search process.
Randomization can be further divided into blind randomization and guided randomization.
The former randomly selects different points in the feasible solution space for detection,
the latter changes the current search direction with a certain probability, and searches in
other directions.
EAs belong to a random search method, which adopt a random processing method in the
initial solution generation and the genetic operations such as selection, crossover, and
14 Chapter 1

variation. Compared with the traditional search algorithms, they have the following
differences:
(1) EAs do not act directly on the solution space, but use some kind of encoding represen-
tation of the solution.
(2) EAs start from a group of multiple points rather than one point, which is one of the
main reasons why they can find the global optimal solution with a large probability.
(3) EAs only use the adaptive information of the solution (i.e., the value of the objective
function) and weigh between increasing revenue and reducing overhead, while tradi-
tional search algorithms typically use derivatives.
(4) EAs use stochastic transition rules rather than deterministic transition rules.
In addition, the main features of EAs compared with the traditional algorithm are reflected
in the following two aspects.
Intelligence: The intelligence of EAs includes self-organization, self-adaptation, and self-
learning. When using EAs to solve the problem, the algorithm will use the information
obtained in the evolution process to self-organize the search after the coding scheme,
fitness function, and genetic operator are determined. This intelligent feature of EAs also
gives them the ability to automatically discover the characteristics and laws of the
environment based on changes in the environment.
Essential parallelism: The essential parallelism of EAs is manifested in two aspects. The
first is that EA is inherently parallel, that is, EA itself is well-suited for massive
parallelism; the second is the inherent parallelism of EA. EA uses the population method
for searching, so it can search for multiple areas within the solution space and exchange
information with each other.
The currently studied EAs are mainly divided into four types [1e11]: genetic algorithms
(GAs), evolutionary programming (EP), evolution strategy (ES), and genetic programming
(GP). The first three algorithms were developed independently of each other, and the last
is a branch developed on the basis of the genetic algorithm. Although these branches have
some subtle differences in the implementation of the algorithm, they have a common
feature, that is, they all rely on the ideas and principles of biological evolution to solve
practical problems.
Evolutionary computation is the product of multidisciplinary integration and infiltration. It
has developed into a comprehensive technology of self-organizing and self-adaption,
which has been widely used in computer science, engineering technology, management
science, and social science. At present, the research into evolutionary computation mainly
focuses on basic theory, function optimization, combinatorial optimization, classification
system, parallel evolutionary algorithm, image processing, evolutionary neural network,
and artificial life.
Introduction 15

1.2.3 Artificial immune system (AIS)

The artificial immune system (AIS) inspired by immunology is an adaptive system to solve
complex problems by simulating immune functions, principles, and models [12]. As early as
the mid-1980s, Farmer et al. [13] took the lead in providing a dynamic model of the
immune system based on the immune network theory and discussed the relationship
between the immune system and artificial intelligence methods which took up research on
artificial immune systems. However, the research findings after this are rare. Until
December 1996, on an international symposium that was held in Japan based on the
immune system, the concept of “artificial immune system” was firstly proposed.
Subsequently, the relevant research on the artificial immune system began rapidly and the
related papers and research results increased year by year. In 1997 and 1998, IEEE Systems,
Man and Cybernetics International Conference organized a related topic discussion and
established the “Artificial Immune System Memory Application Branch.” Subsequently, the
topic of artificial immune system also successively opened up on some famous international
conferences in the field of artificial intelligence, such as the International Joint Conference
on Artificial Intelligence (IJCAI), International Joint Conference on Neural Networks
(IJCNN), IEEE Congress on Evolutionary Computation (CEC), Genetic and Evolutionary
Computation Conference (GECCO), etc. Since 2002, six consecutive international
conferences on artificial immune systems have been held in the United Kingdom, Italy,
Canada, and Brazil. After more than a decade of development, the research into artificial
immune system algorithms has focused on the negative selection algorithm [14], clonal
selection algorithm [15], and immune network algorithm [16] and the research results
mainly relate to anomaly detection, computer security, data mining, and optimization, etc.
The organism is a complex large system whose information-processing function is
completed by three subsystems with different time and spatial dimensions, including the
brain nervous system, the immune system, and the endocrine system. The immune system,
consisting of immune-functioning organs, tissues, cells, immune effector molecules, and
related genes, is a necessary defense mechanism for organisms, especially vertebrates, and
can protect antibodies against the invasion of pathogens, harmful foreign bodies, cancer
cells, and pathogenic factors [13]. The immune function mainly includes immune defense,
immune stability, and immune surveillance. From the perspective of engineering
applications and information processing, biological immune systems provide many
information-processing mechanisms for artificial intelligence. It is the full recognition of
the rich information-processing mechanism in the biological immune system that enabled
Farmer et al. to take the lead in giving a dynamic model of the immune system based on
the immune network theory, discussing the relationship between the immune system and
other artificial intelligence methods, which began the research into artificial immune
system [13].
16 Chapter 1

The artificial immune system is a kind of intelligent method that imitates the natural
immune system. It realizes a learning technology inspired by the biological immune
system and the natural defense mechanism of external substances and provides the essence
of noise tolerance, non-teacher learning, self-organization, and memory. Combined with
some of the advantages of classifiers, neural networks, and machine inference, the artificial
immune system has the potential to provide novel solutions to problems. Its research
results involve many fields such as control, mathematical processing, optimization
learning, and fault diagnosis, etc. It has become another research hotspot of artificial
intelligence following neural networks, fuzzy logic, and evolutionary computation.
Although the artificial immune system has been gradually emphasized by researchers,
compared with the artificial neural networks that have been used in more mature methods
and models, whether it is the understanding of immune mechanisms, the construction of
immune algorithms, or the application of engineering, corresponding research on the
artificial immune system is at a relatively low level.
The research into the artificial immune system mainly focuses on three aspects, namely
research into the artificial immune system model, research into the artificial immune
system algorithm, and application of the artificial immune system. This book focuses on
the research and applications of immune optimization algorithms. Looking at the research
results of the artificial immune system, the immune calculation for the purpose of solving
optimization problems has attracted the attention of many researchers. Representative
research results include the clonal selection algorithm proposed by de Castro et al. [15],
the B-cell algorithm proposed by Timmis et al. [16], the immune network algorithm
proposed by de Castro et al. [17], the vaccine-based immune algorithm [18] proposed by
Jiao et al., and the immune optimization algorithm (opt-IA) [19] proposed by Cutello
et al., and a series of advanced clonal selection algorithms, etc. Many scholars have
generated great interest in these studies and proposed a series of improved algorithms in
succession; furthermore, they have conducted extensive research on the application of
these algorithms.

1.2.4 Other methods

In addition, research into NIC also includes quantum computation (QC) and the complex
adaptive system (CAS), etc.
The study of quantum computing began in 1982. Quantum computing was first seen as a
physical process by Richard Feynman, the Nobel Prize winner in physics and has now
become one of the foremost disciplines closely followed by countries around the world
today. The parallelism, exponential storage capacity, and exponential acceleration features
of quantum computing demonstrate its powerful computational capabilities [20,21]. In
1994, Peter Shor proposed a quantum algorithm for decomposing large prime factors
Introduction 17

which only takes a few minutes to complete the RSA-129 problem (a public key
cryptosystem) that requires 1600 classic computers to complete in 250 days. RSA is a
public key system known to be the safest and cannot be deciphered by classical
computers, but it can be easily deciphered by a quantum computer [22]; in 1996, Grover
proposed a quantum search algorithm that can replace approximately 3.5*1016 steps of a
classical computer with only 200 million steps for deciphering the widely used 56-bit data
encoding standard DES (a type used to protect
pffiffiffiffi interbank and other financial transactions)
to prove that quantum computers are O N faster than classical computers in exhaustive
search problems [23]. At present, quantum computing has been successfully applied in the
fields of secure communication, password systems, and database searches, etc. The United
States developed a prototype of a quantum computer computing as early as 1999.
Computational experts predict that this century will see the emergence and application of a
quantum computer which is 1000 times faster than electronic technology at the solution of
puzzles in the research into quantum computers.
Quantum algorithms are related to classical algorithms, whose most essential features are
the use of the superposition and coherence of quantum states, as well as the entanglement
between quantum bits. It is the product of quantum mechanics in the field of algorithms
and has quantum parallelism which is the most essential difference compared with other
classical algorithms [24, 25]. In the probabilistic algorithm, the system of the state
probability vector is no longer in a fixed state, but is a probability corresponding to each
possible state. If you know the initial state probability vector and the state transition
matrix, you should be able to get the probability vector at any time by multiplying the
state probability vector and the state transition matrix [26]. The quantum algorithm is
similar to this, except that the probability amplitudes of the quantum states need to be
considered
pffiffiffiffi because they are normalized, caused by the fact that the probability amplitude
is N larger than the classical probability. The state transition matrix is changed by
WalsheHadamard, the rotation phase operation, etc. [27].
The complex adaptive system found by Professor Holland, who is researching a complex
system named the Complex Adaptive System (CAS) at the Santa Fe Institute (SFI),
consists of networks of parallel, interacting agents [28, 29]. Such systems include the
human brain, immune system, ecosystems, cells, ant colonies, political parties, and
organizations in human society, etc. The basic idea of a complex adaptive system is that
individuals (elements) are called agents in the system [30] and have their own purpose and
initiative and are active and adaptive. Agents can “learn” and “accumulate experience” in
the ongoing interaction with the environment and other agents so that they can change
their structure and behavior based on learned “experiences.” It is this initiative and the role
among agents, the environment, and other agents, that means they constantly change
themselves, and the environment becomes the basic driving force for system development
and evolution. The evolution of the entire system, including the emergence and
Another random document with
no related content on Scribd:
“Rock?”
“Yes, a sort of narrow ledge across the face; a fault, as they call it.
It runs downward at your left almost to the bottom, I’d say. Listen,
Nod. Suppose I got a long pole and lowered one end to you and held
the other. Would that be easier for you to hold on to?”
Laurie considered a moment. “I reckon so,” he answered. “My right
arm’s just about dislocated. Try it, will you, Bob?”
Bob arose and disappeared into the woods.
“Wish I could stand on my heels for a while,” said Laurie. “My toes
are trying to dance. Where’s Ned gone for the rope?”
“To the quarry, he said,” Polly replied. “If Bob and I made a sort of
rope of our clothes, Laurie, wouldn’t it be better than a pole?”
“Don’t believe so. I wouldn’t feel awfully easy in my mind if I
trusted to that sort of rope. Anyway, I don’t intend to have you make
rags of your new dress!”
“Oh, Laurie, as if a new dress mattered!” exclaimed Polly. “I do
wish it wasn’t so thin, though. Here comes Bob.”
Bob brought the dead trunk of a young black birch about five
inches thick at the butt where, by hacking with his knife and twisting,
he had managed to sever it. Now he slashed the larger branches
away. “Good thing it’s dried out,” he said to Polly. “If it wasn’t it would
be too heavy to hold. Hope it’s long enough!”
“Oh, Bob, I don’t believe it is,” said Polly anxiously.
“If it isn’t I can find one that is.”
But it was. When Bob had lowered the smaller end down the cliff
at Laurie’s right and Laurie had very carefully and rather fearfully
unclasped his numb fingers from their rocky hold and clutched them
about the tree there remained a few inches of the butt end above the
level of the ground. Taking a firm hold with both hands at arm’s
length as he lay facedown, Bob smiled his satisfaction.
“She’ll hold you, Nod, even if the shelf you’re standing on gives
way! Polly can sit on my legs if she has to, and after that I’m good for
all day.”
“Gee, that’s a lot better,” said Laurie. “Wow, that arm was almost
out at the socket! Can you see this fault, as you call it from where
you are?”
“Yes.”
“Look it over, will you? Does it go right to the bottom?”
“N-no, not quite, I guess. I can’t just see the end of it. There’s a
three-cornered hunk of ledge sticking out down there. I guess it
stops about a dozen feet from the bottom, Nod.”
“All right. Tell you what I’m figuring on. You check me up, you two.
Suppose I have that rope that Ned’s gone for. It wouldn’t be any
good for me to try and climb it, for I’m aching all over and I just
wouldn’t have the strength. If I tied it around me you three couldn’t
pull me up over that edge. Of course if the rope’s long enough you
fellows can lower me down, or I could put a turn of the rope around
me and get down myself, I reckon. How about that?”
“You’d get awfully scraped up, I’m afraid,” said Bob. “I’m pretty
sure the three of us can pull you up, Nod.”
“I don’t believe you could. It would be risky, anyway. Maybe,
though, I can climb up somehow.”
“Perhaps,” offered Polly, “Ned will bring some one back with him to
help.”
“Let’s hope so,” said Laurie. “If he doesn’t, the next best thing is a
rope long enough to reach to the bottom. My idea was this, Bob.” He
paused long enough to shift one foot gingerly and relieve his jumping
nerves. “I thought I could tie the end of the rope under my shoulders
and work along this ledge that I’m standing on until I got where I
could jump or drop or something.”
“We could lower you the rest of the way if the rope lasted.”
“Yes, of course. Question is—” Laurie’s words were coming slower
now, with pauses between—“question is, can you folks follow along
the edge and hold your end of the rope?”
Bob turned his head and studied. After a minute he said: “Yes, I’m
sure we can. The trees are close to the edge in places, but we could
manage to pass the rope around them. We’ll see to that. Trouble is,
Nod, there’s a place about ten or twelve yards from where you are
where the blamed shelf sort of peters out for a ways, nearly five feet,
I’d say.”
“That so?” Laurie deliberated. “Well, if you fellows took a turn
around a tree with your end of the rope I reckon I could make it, eh?”
“Yes, I think you could,” Bob agreed. “Sure, you could!”
“All right. Guess that’s ... the best plan,” said Laurie tiredly. “How
long’s Ned ... been gone?”
“Oh, he must be back in a minute!” cried Polly. “He’s been gone a
long, long time.”
“Seen him down there ... yet?”
“He probably went to the office-building near the dock,” answered
Bob. “You can’t see that from here. Keep the old dander up, Nod.”
“I know,” agreed Laurie, “only ... I ain’t so well in my dander! Ought
to see ... a doctor—”
“He’s coming!” cried Polly. “I hear him!”
Even as she spoke joyfully, Ned came into sight, panting,
perspiring, flushed, a coil of rope over a shoulder. He fairly staggered
up the last of the ascent and across the small clearing, his eyes
questioning Polly’s anxiously.
“He’s all right,” cried Polly. Ned exhaled a deep breath of relief and
struggled to disencumber himself of the rope. The girl sprang to his
aid.
“I broke a window in the shed down there,” panted Ned. “This was
all I could find, but it’s good and strong.” He began with trembling
fingers to fashion a noose.
“Oh, Ned,” faltered Polly, “it’s so short!”
“How long?” called Bob.
“Forty feet,” replied Ned. “Maybe more. It’s more than long
enough!”
Polly explained hurriedly, and Ned’s face fell as he stared
despairingly at the cliff’s edge. Then his shoulders went back. “We’ll
get him up,” he said grimly. “We’ll get him up or I’ll go down with
him!” He went on bunglingly with the noose. Bob and Laurie were
talking beyond the edge.
“Rope’s too short for your scheme,” Bob said as cheerfully as he
could. “Only about forty or fifty feet, Nod.”
“Wouldn’t do, eh?” Laurie asked after a moment’s silence.
“No, too short by thirty feet, I guess. Twenty, anyway. We’ll have to
pull you up, old chap. We’ll manage it.”
Ned was peering down now. “I’ve made a slip-noose, Laurie. We’ll
lower it down, and you can get one arm through and then the other.”
“Wait a bit,” said Bob. “You’d better take hold of that ledge again
with your right hand first, Nod. These branches will be in the way.
Can you reach it? Higher yet. There you are! All right.” Bob pulled up
the birch-tree, edged his body back, rolled over, and took several
deep breaths. Then he rubbed his neck vigorously and got to his
knees. “Polly,” he directed, “you take hold of the end of the rope and,
for the love of Mike, don’t let go of it! Lower away now, Nid. Coming
down, old chap. Left arm first. Straighten it up. All right. Get your
hold again. Now the other. Hold the rope closer in, Nid. Right-o! Fine!
Tighten up easy, Nid. How’s that, down there?”
“All right, thanks. Ned, don’t start anything until you’ve rested a bit.
I can hear you puffing down here. I’m fine now and can spend the
day here.”
Ned sank down and relaxed, breathing heavily and mopping his
face. “Best way to do,” said Bob to him, “will be to take a turn of rope
around a tree and let Polly take up the slack as we haul. It’ll be a
hard tug, with the rope binding over the edge, but I guess we can do
it.” Ned nodded, took a deep breath, and stood up.
“Let’s go,” he said shortly.
CHAPTER XIII
THE PEQUOT QUEEN

T he first pull on the rope resulted only in sawing through the turf
and earth at the edge of the cliff until the rock was reached. The
next tug brought a few inches more at the cost of terrific effort, for
the rope must pass at almost right angles over the raw edge of the
rock. Laurie, his hands clasping the rope above his head to lessen
the strain across his chest, was showered with earth. Another heave,
and Ned and Bob went back a scant foot, Polly, her weight on the
rope, tightening the slack around the tree. Once more the two boys
dug their heels into the ground and strained. This time there was no
result. They tried again. It was as though they were pulling at the cliff
itself. The rope tautened under their efforts but yielded not an inch.
“Must be ... caught!” gasped Bob.
Ned, weak from that hurried climb up the hill, nodded, and closed
his eyes dizzily. The moment’s silence was broken by a hail from
Laurie.
“No good, you fellows! The rope’s worked into a crevice of the rock
and is jammed there. I’ll have to climb it myself. Make your end fast
around something and stand by to give me a hand—if I make it!”
Bob silently questioned Ned, and the latter nodded again. “Let him
try,” he said huskily. “If he can’t—”
“Oh, wait, wait!” cried Polly. “We’re—we’re perfect idiots! He
doesn’t have to do that, Ned! He can walk along that ledge, and we
can hold the rope—”
“But it isn’t long enough,” Bob expostulated.
“Not down,” said Polly impatiently; “up!”
“Up? By Jove, that’s so! See what she means, Ned? Here, let’s get
this tied to the tree!” A moment later Bob was at the edge, his eager
gaze following the narrow ledge as it ascended at Laurie’s right.
Scarcely twenty feet beyond, it ended at a perpendicular fissure
hardly four feet below the top. Gleefully he made known the
discovery to Laurie, and the latter, stretched like a trussed fowl
against the rock, his toes still just touching the shelf, grunted.
“Never thought of that,” he said disgustedly. He stretched his head
back until he could see the shelf. Then, “It’s a cinch,” he affirmed.
“You’ll have to get the rope free first, though, and ease up on it until I
can get my feet back on the ledge. Can you do it?”
“Have to,” answered the other cheerfully. Cautiously he and Ned
untied the rope from about the tree, gave it some three inches of
slack, retied it, and set to work at the edge of the cliff. Or, rather, Bob
worked, for Ned’s hands trembled so that he couldn’t. The rope was
fast in a jagged-edged notch of the rock, and Bob’s only implement,
his pocket-knife, was somewhat inadequate. But he made it do.
Using the handle like a tiny hammer, he chipped and chipped until
finally the rope began to slip downward and Laurie’s weight rested
again on the ledge. The end about the tree was unfastened; the rope
was lifted from the channel it had dug through the overlying soil and
carried a yard to the left. Then, with Ned and Bob and Polly holding
it, their heels dug firmly into the sod, Laurie began his journey.
It was slow work at first, for his nerves and muscles responded ill
to the demands of his brain, and delays came when those above
cautiously moved their position, taking new holds on the slowly
shortening rope. Had Laurie been fresh for the task he would have
swarmed up there in no time at all. As it was, it took a good ten
minutes to reach the end of his journey; and, even so, he did not
proceed to the limit of his narrow foot-path but, once his hands could
reach the edge, squirmed his way over, Bob and Ned pulling and
tugging.
Once there, he flopped over on his back in the tangle of brush and
stretched legs and arms relievedly. In the little silence that ensued
Bob removed the rope from Laurie and coiled it with unnecessary
exactitude. Then Laurie took a long, deep breath, sat up, and said
“Thanks!”
That relaxed the general tension. Bob laughed queerly, Ned
grinned in a twisted way, and Polly dabbed at her eyes with a
diminutive handkerchief.
“Welcome,” said Bob dryly. Then all four began to laugh and talk at
the same time. After a moment of that Bob laid a hand on Laurie’s
collar. “Let’s get out of this,” he said. Laurie got to his feet somewhat
shakily, and they fought their way back to the little clearing. “Now,”
said Bob, “we’ll just sit down and look at that view we came up here
to see and get rested for a quarter of an hour. I don’t know how
Laurie feels, but I’m all in!”
“I’ll bet you are,” responded Laurie. “Guess I had the easiest part
of it.”
“You look it,” answered Bob sarcastically. Laurie’s face was brown
with dirt, his knuckles were bleeding, there was a cut on his chin,
and his clothes were torn until they looked fit only for the ragman.
Ned, who had been scowling blackly for the last minute or two, broke
into sudden speech.
“Of all the crazy lunatics, Laurie,” he began fiercely.
“Oh, please, Ned!” cried Polly. “He didn’t mean to do it!”
“Let him say it,” said Laurie humbly. “I deserve it, and it’ll do him
good.”
But Ned’s eloquence had fled him. He said “Humph!” and turned
his head away and stared hard at the wide expanse of scenery
spread before him. The others pretended not to know that there were
tears in his eyes, and Bob said hastily: “Well, all’s swell that ends
swell! How did it happen, anyway, Nod?”
“Oh, I was—was thinking about something and didn’t realize I was
so close to the edge, I guess. Then Ned called to me and I turned
around quick and one foot began to go. I tried to catch hold of that
tree there and missed it. Next thing I knew I was sliding down the
rock. I guess that trying to catch hold of the tree saved me, because
it threw me forward and, instead of falling outward, I went sliding
down with my face scraping against the rock. Somehow, just by luck,
I got hold of a root for a second. It broke off, but it helped, I guess,
for I stopped with my feet on that ledge and my right hand holding on
to something above me. I suppose I made sort of a fuss about it
down there,” he concluded apologetically, “but you don’t know how
quivery your nerves get, Bob. Seemed like my legs wanted to dance
all the time!”
“Son, you certainly had a narrow squeak of it,” said Bob solemnly.
“Gee, when I saw you go over—”
“Oh, it was perfectly horrible,” shuddered Polly. “And then
afterward, while Ned was gone—”
“There’s a busted window down there that some one’s got to settle
for,” growled Ned.
“Believe me, old scout,” replied Laurie feelingly, “I’m willing to
settle for a hundred busted windows! Of course, I don’t mean that it
wouldn’t have been a heap more considerate of you to have slipped
the catch with your knife and saved me the expense.”
Ned faced them again then, glaring at his brother. “You poor fish!”
he said contemptuously.
“That’s me,” agreed Laurie smilingly. “Pulled up with a line!”
Polly and Bob laughed, the former a trifle hysterically. Then Ned’s
mouth twitched itself into a grin. “Laurie, you’re an awful fool,” he
said affectionately.
“Guess you’re right, Neddie.” He climbed to his feet, stamped
them experimentally, seemed to approve of the result, and added,
“Well, unless some one else is going to fall over, say we go home.”
“I’m ready,” agreed Bob. “How about the rope? Oughtn’t we—”
“In payment for my share in the recent—er—episode,” said Laurie,
“I’ll look after it. Where’d you get it, Ned?”
“Why don’t we all go?” asked Polly. “It isn’t much farther that way.”
“Right-o,” agreed Bob. “Besides, who knows what Laurie would do
next if we let him go alone?”
So they set off down the hill again, every one by now extremely
merry and light-hearted in the reaction. They dropped the rope
through the window in the shed adjoining the office of the quarry
company and retraced their steps to the village and up Walnut Street
and so, finally, just as dusk began to settle down, reached the little
shop. There it was Polly who voiced the thought that had been in the
minds of the rest for some time.
“Perhaps,” said Polly, “it would be better if we didn’t say anything
about what happened.”
“Polly,” declared Laurie relievedly—and slangily, “you spoke a
mouthful!”
“Yes,” agreed Ned. “No use worrying folks about a thing when it’s
all over.”
“Of course not,” chimed in Bob. “Guess it won’t happen again,
anyway.”
“Not with me in the rôle of happenee,” said Laurie with conviction.
“If it ever does,” said Ned, “you’ll hang over the cliff until you dry
up and blow away for all of me, you poor simp!”
But when they had said good night to Bob Ned’s tune was
different. “Old-timer,” he said after a silence, “you sure had me
scared.”
“I know,” said Laurie soberly. “Sorry, Ned.”
“Uh-huh. ’S all right.” Ned slipped his arm in Laurie’s. “Wish you’d
cut out that sort of thing, though. Always gives me heart-failure. It’s
risky business, anyway.”
“Right,” agreed Laurie. After a minute, as they passed through the
gate, he added, “No more I’ll risk my neck on dizzy height.”
“Well said, for if you do you’ve me to fight!”
That evening the twins were content to lounge in easy-chairs in
the recreation-room and read, refusing challenges to ping-pong,
chess, and various other engagements requiring exertion of mind or
body. They went early to bed and, although Laurie roused once to
hear Ned in the throes of nightmare and had to quiet him before
returning to his own dreamless slumber, awoke in the morning their
normal selves again.
After breakfast that morning Laurie announced to Ned that he was
going to walk down and explain the broken window, and settle for it if
settlement was demanded. Ned said, “All right, come along.” But
Laurie persuaded the other that his presence during the conference
with the quarry company officials was not only unnecessary but
inadvisable. “You see,” he elaborated, “it’s going to require tact, old
son, and Tact, as you know, is my middle name. Now, if I took you
along you’d be sure to say something to queer the whole show and
I’d have to fork over a dollar, maybe. No, better leave this to me,
Ned.”
“Must say you fancy yourself a bit this morning,” scoffed Ned. “All
right, though. Come over to Bob’s when you get back. I told him I’d
go around there and look at the court.”
Laurie saved his dollar by narrating a moving tale of his fall from
the cliff to the occupants of the small office down by the river. One
weazened little man who held a pen in his mouth and talked through
it or around it—Laurie couldn’t decide which—reminded the visitor
that if he had not trespassed on quarry company property he
wouldn’t have got in trouble. But it was plain that this view was not
popular with the other members of the force present, and Laurie was
permitted to depart with his last week’s allowance intact.
From the office he made his way across toward the stone-walled
dock where lay the Pequot Queen. Once he paused, turned, and
sent his gaze to the great mass of rock that arose precipitately from
beyond the littered floor of the quarry. He couldn’t see the tiny ledge
that had saved his life yesterday, but there, looking very small from
down here, was the leaning tree, and he measured the distance to
the rock-strewn ground beneath and shuddered. He was still gazing
when there was a dull concussion and a cloud of gray dust, and a
great pile of rock slid down the face. The little locomotive tooted and
came rocking toward the railway, dragging a flat-car loaded with two
great squares of rock. On the farther side of the small dock a lighter
was being loaded, a big boom swinging from cars to deck to the
music of a puffing engine and the shrill piping of a whistle. Laurie
continued his way to the Pequot Queen.
A few years before the boat had been used in the ferry service
between Orstead and Hamlin, across the river. Then the business
failed to show a profit, the company was dissolved, and the Pequot
Queen was pushed into the quarry company’s dock—without
permission, if rumor was to be credited—and left to rot. She was
about fifty feet long and very broad of beam. The stern was occupied
by a cabin with many windows, a few of which were still unbroken.
Amidships, if one may apply the term to a launch, was a small
engine-room in which a rusted upright engine still stood amid a litter
of coal-dust. A door led to a smaller compartment, the wheel-house.
Between that and the bow was a space for luggage and freight. The
Pequot Queen had not carried vehicles.
At one time the boat had doubtless shone resplendent in white
paint and gold-leaf. Now there were few traces of either remaining.
The name was still legible on each side of the bow, however, in
faded black. Through the roof a rusty smoke-stack pushed its way to
lean perilously to starboard. Atop the cabin, reached by a narrow
companion, benches inside a pipe-railing had afforded
accommodation for passengers in fine weather. The boat was
secured fore and aft with frayed hawsers, and her rail lay close to the
wall. Laurie viewed her speculatively from stem to stern and then
stepped aboard. Had there been any one about to observe him they
might have thought that here was a possible purchaser, for he went
over the boat completely and exhaustively, giving, however, most of
his time to the cabin. In the end he went ashore and once more
viewed the derelict in frowning speculation. There was no doubt that
the Pequot Queen had outlived her use as a water-craft. She still
floated and would probably continue to float for many years yet, but
old age had claimed her, as rotting timbers and yawning seams
showed. Yet Laurie, whether or not he was a prospective purchaser,
turned away at last with an expression of thoughtful satisfaction on
his countenance.
Back by the railroad, he stopped and viewed his surroundings
intently. On one side lay the bridge, with the Basin beyond and to the
left, and the big quarry to his right. On the other side was the
company office and shed, the dock and pier, the latter piled high with
roughly-squared blocks of stone. Toward town the river’s margin was
unoccupied for a space, and then came the coal-wharves and the
lumber company’s frontage. It was a noisy and dust-laden spot in
which the Pequot Queen had been left to pass her declining years,
and Laurie shook his head slowly as though the realization of the
fact displeased him. Finally he crossed the bridge again, hurrying a
little in order not to compete for passage with a slow-moving freight
from the north, and continued along the river-front until he had
passed the station and the warehouses across the track and was
again allowed a view of the stream unimpeded by buildings. Here
there was no wall along the river, but now and then the remains of an
ancient wooden bulkhead still stood between the dusty road and the
lapping water. Here and there, too, a rotted hulk lay careened or
showed naked ribs above the surface further out. Across the road
hardly more than a lane now, a few dejected but respectable
dwellings stood behind their tiny front yards. Behind them the hill
sloped upward less abruptly than farther back and was thickly
clustered with unpretentious houses wherein the industrious foreign-
born citizens of Orstead lived. Compared to the vicinity of the quarry,
however, this section of town was clean and quiet. There were trees
here, and later on there would be grass along the unfrequented road
and flowers in the little gardens. Westward lay the sunlit river and the
wooded shore beyond. Laurie nodded approvingly more than once
as he dawdled along, paying, as it appeared, special attention to the
margin of the stream. Finally, more than an hour after he had left
school, he retraced his steps as far as Ash Street and turned uphill.
Ash Street was two blocks north of Walnut and, having an easier
grade to climb, was less devious in its journey. It brought Laurie at
length to Summit Street a short block from the little white house from
which Miss Comfort had lately removed. As he passed it Laurie
observed that so far no vandal hand had been laid on it. The brown
shutters were closed at the down-stairs windows, and the buds on
the lilac-bushes were swelling fast. Somehow these two facts,
apparently unrelated, combined to bring a little pang of sadness to
the observer. He went on, with only a glance down Pine Street to the
blue shop, and entered the side gate of the Coventry place.
CHAPTER XIV
A PERFECTLY GORGEOUS IDEA

N ed and Bob were watching Thomas, the man-of-all-work, rolling


the cinder surface of the new tennis-court. Theirs was a
pleasant occupation for such a morning, and Laurie joined them
where they sat on a pile of posts and boards that had once been a
grape-arbor and that had been removed to make way for the court.
“What happened to you?” asked Ned. “Thought maybe they’d had
you arrested. Bob and I were just talking of pooling our resources
and bailing you out.”
“I found I had nearly ninety cents,” said Bob proudly.
“No, they were all right about it,” replied Laurie musingly. Then he
lapsed into silence, staring thoughtfully at Thomas as he paced to
and fro behind the stone roller.
“What do you think of it?” asked Bob, nodding at the court.
“Corking. Pretty nearly done, isn’t it?”
“Pretty nearly. It’ll take about two days to put the gravel on.
They’re going to bring the first load this afternoon. It has to have clay
mixed with it, you know, and that makes it slower. And then it’s got to
be rolled well—”
“Seems to me,” said Laurie, “a turf court would have been easier.”
“Yes, but they don’t last. You know that. And it’s the very dickens
to get a grass surface level.”
Laurie nodded. It was evident to Ned, who had been watching him
closely, that Laurie’s mind was not on the tennis-court. “What’s
eating you, partner?” he asked finally. Laurie started.
“Me? Nothing. That is, I’ve been thinking.”
“Don’t,” begged Ned. “You know what it did to you yesterday.”
“I want you and Bob to be at Polly’s this afternoon when she gets
home from school. I’ve got something to tell you.”
“Tell us now,” suggested Bob. Laurie shook his head.
“No use saying it twice.”
“What’s it about?” asked Ned.
“About—about Miss Comfort.”
“Gee,” said Bob, “I thought that was done with. What about her,
Nod?” But Laurie shook his head, and their pleas for enlightenment
were vain.
“You’ll know all about it this afternoon,” he said. “So shut up.” A
minute after he asked, “Say, Bob, does your father know the folks
who run that quarry?”
“Yes, I guess so. He buys stone from them. Why?”
“I want to meet the head guy, president or general manager or
whatever he calls himself. That’s all.”
“Want to meet him! What for? Going to get after him for not having
a railing around the top of the bluff?”
“Not exactly. Know any one here who has a launch?”
“Lunch? Say, what are you talking about?”
“I didn’t say lunch, you goop; I said launch, l-a-u—”
“Oh, launch! Why, no, I don’t believe so. I know a fellow who owns
a canoe—”
“Sure,” agreed Laurie with deep sarcasm, “and I know a fellow
who owns a bean-shooter, but it doesn’t interest me. There must be
some one who has a launch around here. There are half a dozen on
the river.”
“Why, there’s a man down there who rents boats, you idiot. I think
he has some sort of a launch. I thought you meant—”
“What’s his name? Where’s he live?”
“Name’s Wilkins or Watkins or something, and he lives—I don’t
know where he lives, but he keeps his boats up by the old chain-
works.”
“Thanks. You fellows going to spend the day here? Let’s do
something.”
“Want some tennis?” asked Bob eagerly. “I’ll take on you and Nid.”
Laurie looked inquiringly at his brother. “Would you?” he asked.
“Seems sort of too bad to take advantage of his ignorance.”
“It’ll teach him a lesson,” answered Ned, rising, stretching, and
looking commiseratingly down at the challenger. “Pride goeth before
a fall and a haughty spirit—”
“Before the Turners,” completed Laurie. “Come on to the slaughter,
Bob, before my heart softens and I let you off.”
Shortly after three that afternoon, Laurie, perched on a counter in
the Widow Deane’s shop, had the floor. That sounds peculiar, I
acknowledge, but you know what I mean. They were in the shop
because Mrs. Deane and Miss Comfort were occupying the back—
pardon me, the garden. “It’s like this,” Laurie was telling Polly, Mae,
Ned, and Bob. “We couldn’t find a place on land for Miss Comfort,
and so it occurred to me that a place on the water might do.” He
paused to enjoy the effect of this strange announcement.
“On the water!” echoed Polly. “Why, whatever do you mean?”
“Yes,” cried Mae, “whatever—”
“Don’t you get it?” asked Ned. “He wants Miss Comfort to join the
navy!”
Laurie grinned. “Shut up, you idiot! You know the Pequot Queen?”
They all agreed silently that they did. “Well, I’ve been all over the
boat this morning. It would take about two or three days—and a few
dollars, of course—to make her into just as nice a house as any one
would want. Take that cabin—”
“But, look here, you three-ply goop,” interrupted Ned, “Miss
Comfort wouldn’t want to live on a tumble-down old ferry-boat!”
“How do you know?” asked Laurie. “Have you asked her?”
“But—but she’d be afraid, Laurie,” protested Polly. “I’m sure I
should! Suppose it floated away or—or sank—”
“Suppose it spread its wings and flew on top of the court-house,”
answered Laurie sarcastically. “It couldn’t float away because it
would be moored to the bank, and it couldn’t sink because there
wouldn’t be enough water under it. Now, just listen a minute until I
get through. Of course I know that the scheme sounds funny to you
folks because you haven’t any imagination. As for saying that Miss
Comfort wouldn’t live in the Pequot Queen, you don’t know anything
of the sort. I’m blamed certain that if I was—were Miss Comfort I’d a
lot rather live in a nice clean boat tied to the bank than go to the
poor-farm!”
“Well,” said Polly dubiously, “you’re a man.”
“A man!” jeered Ned.
“Well, you know perfectly well what I mean,” said Polly. It was
evident that Polly wanted very much to be convinced of the
practicability of the plan, and her objection had been almost
apologetic. Mae, taking her cue from her friend, awaited further
enlightenment in pretty perplexity.
“Miss Comfort has enough to furnish it with,” continued Laurie. “At
least, Polly said she had taken a lot of stuff with her.” Polly nodded
vigorously. “All we’d have to do would be to board up about four
windows on each side of the cabin, put some shades or curtains at
the others, put a new lock on the door, run a stove-pipe through the
roof—”
“Perfectly simple and easy,” said Ned. “Go on, son.”
“That’s about all. That cabin’s big enough for her to live in
comfortably, big enough for a stove and bed and table and chairs—
and—and everything. Then, there’s the roof, too. Why, she could
have a roof-garden up there, and a place to dry her clothes—”
“After she’s fallen overboard?” asked Bob.
“That’s all right,” answered Laurie a trifle warmly. “Have your fun,
but the scheme’s all right, and if you’d quit spoofing and stop to think
seriously a minute—”
“Why, I think it’s a perfectly splendid idea!” asserted Polly with a
bewildering change of front.
“Gorgeous!” chimed in Mae.
“If only Miss Comfort can be persuaded to try a life on the ocean
wave,” added Ned dryly. “Seems to me the first thing to do is to ask
her what she thinks of it.”
“No, it isn’t,” said Laurie. “The first thing is for you to go down there
with me right now and see for yourselves. If you don’t agree with me
we’ll just let it drop.”
“Of course,” said Polly. “Come on, every one! Oh, I do hope that
Miss Comfort will like it!”
“How about the owners?” asked Bob as, a minute later, they were
all on the way to the river. “Well, not the owners, for I suppose there
aren’t any. But what about the quarry people, Nod? Think they’ll let
us have it?”

They all accompanied Laurie to the Pequot Queen


“Don’t see why not. It’s no good to them, and it’s in their way.
That’s where your father comes in, Bob. I want him to introduce us to
the head guy and say a good word. Think he’d mind?”
“No, but even if Miss Comfort lived in the boat, Nod, it would be
just as much in the way, wouldn’t it?” Bob looked puzzled.
“No, because it wouldn’t be there any longer. We’d have it hauled
out of their dock and taken to a place I found the other side of town,
up-river. Know where Ash Street comes out down there? Well, about
two blocks beyond that. We’d draw the boat up close to the bank,
make her fast, and build a sort of bridge to the deck. Some of that
stuff in your yard will come in very handy.”
“Why, that would be perfect!” declared Polly. “I didn’t want to
mention it, Laurie, but I was dreadfully afraid that Miss Comfort
wouldn’t want to live down there by the quarry, with the dynamite
shooting off and all those rough-looking men about!”
“Sounds as if the young fellow’s scheme might have something in
it after all,” allowed Ned. “Just the same, I’ll bet the quarry folks won’t
give up the boat unless some one pays them for storage or whatever
it’s called.”
“I’m not so sure,” said Bob. “Dad’s company is a pretty good
customer just now, and if dad will talk with the head of the firm—”
“He might tell them that he wouldn’t buy any more of their old
stone,” said Mae. “I guess that would—would bring them around!”
“Not a doubt of it,” laughed Ned. “Well, let’s have a good look at
the old ship first. Maybe she’s fallen to pieces since morning!”
But she hadn’t. They spent a full twenty minutes aboard her, while
Laurie explained and Polly’s enthusiasm grew by leaps and bounds.
Bob, too, came over to Laurie’s side, and even Ned, although he still
pretended to doubt, was secretly favorable. As for Mae—well, as
Polly went so went Mae! After they had viewed and discussed the
Pequot Queen to their satisfaction, Laurie led them back along the
river and showed the place he had selected for the Pequot Queen’s
future moorings. It was a quiet spot, disturbed by scant traffic along
the lane, now that the chain-works was no longer in operation.
Passing steamers and tugs might infrequently break the silence with
their whistles, and when, further down, a coal-barge tied up at the
wharf, the whir of the unloading machinery would come softened by
distance. Between the well-nigh unused road and the water lay a
strip of grass and weeds, a ribbon of rushes, a narrow pebbled
beach. Some sixty feet out a sunken canal-boat exposed her deck-
house above the surface. Six yards or so from the tiny beach the
remains of a wooden bulkhead stretched. In places the piles alone
remained, but opposite where Laurie had halted his companions
there was a twelve-foot stretch of planking still spiked to the piles.
“We could bring her up to that bulkhead and make her fast to the
piles at bow and stern. I figure that there’s just about enough water
there to float her. Then we’d built a sort of bridge or gangway from
the bulkhead to the shore. She couldn’t get away, and she couldn’t
sink. That old hulk out beyond would act as a sort of breakwater if
there was a storm, too.”
“I think it’s a perfectly gorgeous idea,” said Polly ecstatically. “And
just see, Mae, how very, very quiet and respectable it is here!”
Ned, though, seemed bent on enacting the rôle of Mr. Spoilsport.
“That’s all right,” he said, “but how are you going to get permission to
tie her up here? This property belongs to some one, doesn’t it?”
Laurie looked taken aback. “Why, I don’t believe so, Ned. Here’s
the road and here’s the river. There’s only a few feet—”
“Just the same,” Ned persisted, “some one’s bound to own as far
as high tide.”
“Maybe the folks in the house across the road,” suggested Mae.
“Mean to tell me,” demanded Laurie, “that the fellow who left that
canal-boat out there had to ask permission?”
“That’s in deep water,” answered Ned.
“So would the Pequot Queen be in deep water!”
“Maybe, but your bridge or gangplank wouldn’t be.”

You might also like