Cerebellar Model Articulation Controller



The cerebellar model arithmetic computer (CMAC) is a type of neural network based on a model of the mammalian cerebellum. It is also known as the cerebellar model articulation controller. It is a type of associative memory.[2]

The CMAC was first proposed as a function modeler for robotic controllers by James Albus in 1975[1] (hence the name), but it has been extensively used in reinforcement learning and also for automated classification in the machine learning community. The CMAC is an extension of the perceptron model. It computes a function of n input dimensions. The input space is divided up into hyper-rectangles, each of which is associated with a memory cell. The contents of the memory cells are the weights, which are adjusted during training. Usually, more than one quantisation of the input space is used, so that any point in input space is associated with a number of hyper-rectangles, and therefore with a number of memory cells. The output of a CMAC is the algebraic sum of the weights in all the memory cells activated by the input point.

[Figure: A block diagram of the CMAC system for a single joint. The vector S is presented as input to all joints. Each joint separately computes an S -> A* mapping and a joint actuator signal pi. The adjustable weights for all joints may reside in the same physical memory.[1]]

A change of value of the input point results in a change in the set of activated hyper-rectangles, and
therefore a change in the set of memory cells participating in the CMAC output. The CMAC output is
therefore stored in a distributed fashion, such that the output corresponding to any point in input space is
derived from the value stored in a number of memory cells (hence the name associative memory). This
provides generalisation.
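The tiling-and-summation scheme described above can be sketched in code. This is a minimal illustration only; the class name, grid sizes, and offset scheme are assumptions, not Albus's original formulation:

```python
import numpy as np

# Minimal CMAC sketch (illustrative; names and parameters are assumptions).
# Several offset grids ("tilings") quantise the input space; each input point
# activates exactly one cell per tiling, and the CMAC output is the algebraic
# sum of the weights stored in those activated cells.
class CMAC:
    def __init__(self, n_tilings=4, cells_per_dim=8, n_dims=2, lo=0.0, hi=1.0):
        self.n_tilings = n_tilings
        self.cells = cells_per_dim
        self.width = (hi - lo) / cells_per_dim   # width of one hyper-rectangle
        self.lo = lo
        # one weight table per tiling, one cell per hyper-rectangle
        self.w = np.zeros((n_tilings,) + (cells_per_dim,) * n_dims)
        # each tiling is shifted by a fraction of a cell width
        self.offsets = [t / n_tilings * self.width for t in range(n_tilings)]

    def active_cells(self, x):
        """Indices of the one cell activated in each tiling."""
        cells = []
        for t in range(self.n_tilings):
            idx = tuple(
                min(int((xi - self.lo + self.offsets[t]) / self.width),
                    self.cells - 1)
                for xi in x
            )
            cells.append((t,) + idx)
        return cells

    def output(self, x):
        # algebraic sum of the weights in all activated memory cells
        return sum(self.w[c] for c in self.active_cells(x))
```

Two nearby input points activate overlapping sets of cells, which is exactly the distributed storage and generalisation described above.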

Building blocks
In the adjacent image, there are two inputs to the CMAC,
represented as a 2D space. Two quantising functions have been
used to divide this space with two overlapping grids (one shown in
heavier lines). A single input is shown near the middle, and this has
activated two memory cells, corresponding to the shaded area. If
another point occurs close to the one shown, it will share some of
the same memory cells, providing generalisation.

[Figure: CMAC, represented as a 2D space]

The CMAC is trained by presenting pairs of input points and output values, and adjusting the weights in the activated cells by a proportion of the error observed at the output. This simple training algorithm has a proof of convergence.[3]

It is normal to add a kernel function to the hyper-rectangle, so that points falling towards the edge of a hyper-rectangle have a smaller activation than those falling near the centre.[4]

One of the major problems cited in practical use of CMAC is the memory size required, which is directly
related to the number of cells used. This is usually ameliorated by using a hash function, and only providing
memory storage for the actual cells that are activated by inputs.
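Error-proportional training and hash-based storage can be illustrated together in a short sketch. All names and constants here are assumptions; a Python dict stands in for the hash table, allocating a weight only when a cell is first touched:

```python
import math
from collections import defaultdict

# Sketch of CMAC training with hashed storage (illustrative; all names and
# constants are assumptions). Weights live in a dict, so memory is allocated
# only for cells actually activated by inputs. Each presentation spreads a
# proportion of the output error equally over the activated cells.
N_TILINGS = 8
CELL = 0.2           # hyper-rectangle width
BETA = 0.5           # proportion of the error corrected per presentation

weights = defaultdict(float)   # cell key -> weight, allocated on first touch

def active_cells(x):
    # one cell per (offset) tiling; the key acts as the hash-table address
    return [(t, math.floor((x + t * CELL / N_TILINGS) / CELL))
            for t in range(N_TILINGS)]

def predict(x):
    return sum(weights[c] for c in active_cells(x))

def train(x, target):
    err = target - predict(x)
    for c in active_cells(x):
        weights[c] += BETA * err / N_TILINGS   # share the correction equally

# example: fit sin(x) on a coarse grid of training points
for _ in range(50):
    for i in range(21):
        xi = i * 0.1
        train(xi, math.sin(xi))
```

Only the cells touched by the training inputs ever receive storage, which is the memory saving the hashing approach provides.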

One-step convergent algorithm

Initially, the least mean square (LMS) method was employed to update the weights of a CMAC. The convergence of LMS training for CMAC is sensitive to the learning rate and can lead to divergence. In 2004,[5] a recursive least squares (RLS) algorithm was introduced to train CMAC online. It does not need a tuned learning rate, its convergence has been proved theoretically, and it is guaranteed to converge in one step. The computational complexity of this RLS algorithm is O(N³).

Hardware implementation structure

[Figure: Parallel pipeline structure of CMAC neural network[6]]

Based on QR decomposition, the RLS algorithm has been further simplified (QRLS) to have O(N) complexity. Consequently, this reduces memory usage and time cost significantly. A parallel pipeline array structure for implementing this algorithm has been introduced.[6]

Overall, by utilizing the QRLS algorithm, the convergence of the CMAC neural network can be guaranteed, and the weights of the nodes can be updated in one step of training. Its parallel pipeline array structure offers great potential for hardware implementation in large-scale industrial use.

[Figure: Left panel: real functions; right panel: CMAC approximation with derivatives]

Continuous CMAC
Since the rectangular shape of the CMAC receptive field functions produces a discontinuous (staircase) function approximation, the continuous CMAC integrates CMAC with B-spline functions and thereby offers the capability of obtaining derivatives of any order of the approximated function.
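As an illustration of the idea (the specific quadratic B-spline and its placement here are a simplified, assumed choice, not the exact construction of the continuous CMAC literature), a smooth kernel can replace the rectangular 0/1 cell activation:

```python
def quadratic_bspline(u):
    """Quadratic B-spline on [0, 3): a smooth (C^1) replacement for the
    rectangular 0/1 receptive field of a standard CMAC cell."""
    if 0.0 <= u < 1.0:
        return 0.5 * u * u
    if 1.0 <= u < 2.0:
        return -u * u + 3.0 * u - 1.5
    if 2.0 <= u < 3.0:
        return 0.5 * (3.0 - u) ** 2
    return 0.0

def quadratic_bspline_deriv(u):
    """Analytic derivative of the kernel, so the derivative of the CMAC
    output is available in closed form rather than as a staircase."""
    if 0.0 <= u < 1.0:
        return u
    if 1.0 <= u < 2.0:
        return -2.0 * u + 3.0
    if 2.0 <= u < 3.0:
        return u - 3.0
    return 0.0
```

Weighting each activated cell by the kernel value instead of 1 makes the summed output piecewise quadratic and continuously differentiable; the integer translates of this kernel sum to 1, so constant functions are still represented exactly.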

Deep CMAC
In recent years, numerous studies have confirmed that stacking several shallow structures into a single deep structure lets the overall system achieve better data representation and thus deal more effectively with nonlinear, highly complex tasks. In 2018,[7] a deep CMAC (DCMAC) framework was proposed, and a backpropagation algorithm was derived to estimate the DCMAC parameters. Experimental results on an adaptive noise cancellation task showed that the proposed DCMAC achieves better noise cancellation performance than the conventional single-layer CMAC.

Summary
Scalability: Straightforward to extend to millions of neurons or beyond
Convergence: Training can always converge in one step
Function derivatives: Straightforward to obtain by employing B-spline interpolation
Hardware structure: Parallel pipeline structure
Memory usage: Linear with respect to the number of neurons
Computational complexity: O(N)

See also
Artificial neural network
Recursive least squares filter
Deep learning

References
1. J.S. Albus (1975). "A New Approach to Manipulator Control: the Cerebellar Model
Articulation Controller (CMAC)". In: Trans. ASME, Series G. Journal of Dynamic Systems,
Measurement and Control, Vol. 97, pp. 220–233, 1975.
2. J.S. Albus (1979). "Mechanisms of Planning and Problem Solving in the Brain (https://citese
erx.ist.psu.edu/viewdoc/download?doi=10.1.1.141.3795&rep=rep1&type=pdf)". In:
Mathematical Biosciences. Vol. 45, pp. 247–293, 1979.
3. Y. Wong, CMAC Learning is Governed by a Single Parameter, IEEE International
Conference on Neural Networks, San Francisco, Vol. 1, pp. 1439–43, 1993.
4. P.C.E. An, W.T. Miller, and P.C. Parks, Design Improvements in Associative Memories for
Cerebellar Model Articulation Controllers, Proc. ICANN, pp. 1207–10, 1991.
5. Ting Qin, et al. "A learning algorithm of CMAC based on RLS." Neural Processing Letters
19.1 (2004): 49-61.
6. Ting Qin, et al. "Continuous CMAC-QRLS and its systolic array." Neural Processing Letters
22.1 (2005): 1-16.
7. Yu Tsao, et al. "Adaptive Noise Cancellation Using Deep Cerebellar Model Articulation
Controller." IEEE Access Vol. 6, pp. 37395–37402, 2018.

Further reading
Albus, J.S. (1971). "Theory of Cerebellar Function (https://web.archive.org/web/2004102910
5057/http://www.isd.mel.nist.gov/documents/albus/Loc_01.pdf)". In: Mathematical
Biosciences, Volume 10, Numbers 1/2, February 1971, pgs. 25–61
Albus, J.S. (1975). "New Approach to Manipulator Control: The Cerebellar Model
Articulation Controller (CMAC) (https://web.archive.org/web/20041029165921/http://www.is
d.mel.nist.gov/documents/albus/Loc_04.pdf)". In: Transactions of the ASME Journal of
Dynamic Systems, Measurement, and Control, September 1975, pgs. 220 – 227
Albus, J.S. (1979). "Mechanisms of Planning and Problem Solving in the Brain (https://web.a
rchive.org/web/20100527160838/http://www.isd.mel.nist.gov/documents/albus/Loc_5.pdf)".
In: Mathematical Biosciences 45, pgs 247–293, 1979.
Iwan, L., and Stengel, R., "The Application of Neural Networks to Fuel Processors for Fuel
Cells (https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=917898)" In IEEE
Transactions on Vehicular Technology, Vol. 50 (1), pp. 125-143, 2001.
Tsao, Y. (2018). "Adaptive Noise Cancellation Using Deep Cerebellar Model Articulation
Controller (https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8341851)". In: IEEE
Access 6, April 2018, pgs 37395-37402.

External links
Blog on Cerebellar Model Articulation Controller (CMAC) (https://widecmac.wordpress.com/)
by Ting Qin. More details on the one-step convergent algorithm, code development, etc.
