
Scientific Programming 12 (2004) 81–90

IOS Press

Parallel computing applications and financial modelling
Heather M. Liddell, D. Parkinson, G.S. Hodgson and P. Dzwig
Department of Computer Science, Queen Mary, University of London, Mile End Road, London E1 4NS, UK
E-mail: heather@dcs.qmul.ac.uk

Abstract. At Queen Mary, University of London, we have over twenty years of experience in Parallel Computing Applications,
mostly on “massively parallel systems”, such as the Distributed Array Processors (DAPs).
The applications in which we were involved included design of numerical subroutine libraries, Finite Element software, graphics
tools, the physics of organic materials, medical imaging, computer vision and, more recently, financial modelling. Two of the projects related to the latter are described in this paper, namely Portfolio Optimisation and Financial Risk Assessment.

Keywords: Parallel computing, massively parallel systems, distributed array processor, financial modelling, portfolio optimisation,
risk assessment

1. Introduction

The UK has a long history of leading the world in innovative computing – this is particularly true in High Performance Computing and Communications (HPCC). Within this area there has been growing emphasis on the use of "massively parallel systems", comprising many thousands of processors working together. Two systems capable of massive parallelism are the Distributed Array Processor (DAP) and the transputer; both were invented in the UK. The DAP activity at QMW formed a nucleus for parallel processing, from which evolved a number of research groups and specialist centres in the UK; more importantly, a large community of users was stimulated to use parallel computers for a wide variety of problems in Science, Engineering and other disciplines. High Performance Computing (and hence massively parallel processing) is of strategic national importance, both as an industry in its own right and also because of the requirements of other industries. Senator Gore, the former US Vice President, stated as long ago as 1989 that "the nation which most completely assimilates high-performance computing into its economy will emerge as the dominant intellectual, economic and technological force in the next (i.e. this!) century".

The use of massively parallel systems required the development of new algorithms, software tools and application techniques; a reliable migration path must also be provided from current systems – this is particularly important in the commercial world. These are the challenging areas we have been addressing in our research.

2. Parallel computing applications on the distributed array processor

The work we are going to describe is the result of over twenty years of research activity at Queen Mary. First the DAP Support Unit (DAPSU) was formed, then the Centre for Parallel Computing, a unit established by Queen Mary in 1987. We were later joined by Prof. Yakup Paker and his group. Finally the London Parallel Application Centre (LPAC), a consortium comprising Queen Mary, Imperial College, University College London and City University, was established. Our work has always been interdisciplinary – we worked with engineers, scientists and social scientists. We were also involved in many collaborative projects with other institutions in the UK, in Europe and on a wider international scene. An important aspect of our research concerns "technology transfer" to industry and commerce.

ISSN 1058-9244/04/$17.00 © 2004 – IOS Press and the authors. All rights reserved

Our first "massively" parallel machine was the ICL 4096-processor DAP [1], funded by the UK Computer Board, which arrived in 1980. This marked the beginning of 12 continuous years of a national DAP service. The original ICL DAP was replaced by another 4096-processor machine, the AMT (Active Memory Technology) DAP 610, in 1988. A second-generation ICL 1024-processor "mini-DAP" formed part of a grant from the SERC (Science and Engineering Research Council), the forerunner of EPSRC, the Engineering and Physical Sciences Research Council. One essential requirement for the growth of use and acceptance of parallel computing technology is the provision of efficient system software, portable software tools and applications software. Much of our application software took the form of specialist program libraries and tools. Our approach has been to attempt to provide libraries that have a similar user interface and documentation style to that available in the NAg library, but which are specifically designed for parallel machines. A major product was the DAP Subroutine Library [2], marketed by AMT. The DAP Finite Element Library [3] was produced for engineering applications, and demonstration programs were also developed. Various members of our Centre were involved in the design, implementation and validation of data-parallel languages – Fortran Plus, C++, AI/Logic languages such as Lisp and Prolog, and the functional language HASKELL.

The focus of our activity has always been towards users of parallel systems. We have been actively engaged in many different application areas – engineering, physical sciences, computer vision, medical imaging and econometrics. One example was an Esprit Basic Research project, OLDS [4], in which we were working with colleagues in Physics and Chemistry at QMW and with European partners, exploring the potential use of organic materials in novel micro-electronics devices. Another was a collaborative medical imaging project, MIRIAD, in which we investigated methods for the display and segmentation of features in 3D images [5,6].

LPAC's major objective was technology transfer, primarily via collaborative projects carried out with industrial/commercial partners and with applications experts in our own establishments, to produce usable applications software, portable to many different parallel computing systems, and to provide migration paths to these systems. We considered the European dimension to be particularly important. We continued to address the more traditional industrial and scientific computational problem areas, including industrial modelling, use of neural networks and visualisation, but considerable emphasis was placed on the commercial and financial applications, partly because of our geographical proximity to the City of London. These were new fields for academic computing people, but we were fortunate in being able to collaborate with colleagues having expertise in these areas. The rest of this paper will concentrate on that aspect of our work.

3. Application of parallel and distributed computing to portfolio optimisation

The financial markets form one of the most important engines of industry. This project was particularly interested in understanding and improving the methodology and science behind the creation and maintenance of large portfolios of financial instruments (bonds, shares etc.) through the deployment of large-scale numerical computing techniques.

Portfolios are collections of financial instruments whose overall intention is to, for example, minimise the risk caused by exposure to market fluctuations. From our perspective they are a collection of instruments governed by non-linear dynamics that reflect market dynamics. The requirement for the financial engineer is to select a portfolio in an optimal way. In reality such solutions are, of course, sub-optimal.

The modelling of classical portfolios and understanding their dynamics has a long history, the landmark of which was Markowitz's paper [7]. This involves creating a model in the form of a stochastic differential equation, whose stochastic terms are reflected in volatility. This volatility is conventionally taken as a constant. In prosaic terms, an investor makes an estimate of what the growth of a particular component may be, and this estimate becomes a parameter of the calculation. The solution to the optimisation problem can then be found by deploying conventional optimisation techniques. Naturally the growth is neither constant nor easily determined, although there may be good reason for believing that it will fall within certain bounds.

Much work has been done internationally on the modelling of individual instruments under varying conditions. In comparison, little has been done on the optimisation of medium to large-scale portfolios with the use of high-density historical data sets to drive the inputs to the models. In all there may be several tens of such instruments in a portfolio (which is itself an instrument). The aim is then to model the behaviour of this problem, which has perhaps fifty dimensions.

In our research we set out to apply the techniques of high performance computing, in which the investigators have a great deal of experience, to the problem of modelling such portfolios.

Much has also been written in recent years concerning large systems with deterministic constraints. This has proved to be a fruitful field for the application of large-scale, and in particular parallel, computing. Conventional optimisation with non-stochastic inputs can normally be completed within reasonable run-times on PCs or similar, giving flexibility when building individual models to respond to client requests. With that approach, typical PC systems permit handling of moderate-sized portfolios in the region of, say, two hundred instruments. The introduction of the stochastic inputs and the associated Monte Carlo simulations makes the problem one in which issues such as parallel computation come to play a conspicuous role. Hence the problem demands the application of high performance computing [8].

In our research we investigated the numerical determination of portfolios with non-stochastic constraints combined with non-deterministic inputs, and also considered the stability of the resulting portfolios in a model which has been developed by the author and her co-workers. This required the application of parallel computing [9,10].

Normally it is assumed that the portfolio elements will behave as "Wiener processes" and have a predicted performance which is completely deterministic; that is, a particular share or bond will have a return of x%. In our model we allowed not only for multivariate distributions but also for returns which have a range of values, whose size and characteristics are determined purely by empirical historical data.

3.1. Outline of the problem

The relative behaviour of securities is defined by their correlation information. Traditionally, a single set of inputs for growth expectations of individual portfolio components is used in a single optimisation solution. We generalised this to allow stochastic inputs to represent the distribution of possible outcomes for the returns generated by individual portfolio components. These 'scenarios' are described in terms of specified rates of return and standard deviations, and the statistical behaviour of these inputs can be selected from a set of standard statistical distributions.

The results of the many ensuing Monte-Carlo simulations contain much useful information that can be extracted by Principal Component Analysis. This information can be fed back into the model to improve the results. In this paper, some discussion on the stability of the results is included.

Rather than input a constant value for the expected rate of return of each input scenario, we allow a range of values, with a generalised Wiener process being used to simulate each input scenario. Generating these Wiener processes forms the first stage of a four-stage process for each market, followed by a fifth, cross-market stage. The process may be summarised as follows (a sketch of the per-market loop appears at the end of Section 3.3):

Stage 1: For each simulation, generate a Wiener process for each market scenario.
Stage 2: Solve the resulting optimisation problem at points on the "efficient frontier" [9] for each simulation, and apply Principal Component Analysis to the resulting simulation portfolios.
Stage 3: For those points on the efficient frontier, calculate an averaged portfolio over all the simulations (we call this the 'mean' portfolio).
Stage 4: Calculate the mean and volatility of the rates of return for the simulated scenarios and solve again to obtain an optimum market portfolio (we call this the 'efficient' portfolio). In this way a mean portfolio and an efficient portfolio are obtained for each market. We permit instruments to be grouped into a number of separate markets, for which historical correlation information between the securities of each such market is available. An overall solution can then be obtained.
Stage 5: Perform a further optimisation (across markets), constraining the proportions of each market in the overall portfolio and using correlation information generated by the Monte-Carlo simulations across the markets.

Securities are considered as forming a single market if correlation information is available across that set of securities (see Section 3.2). For each market, scenarios defining the anticipated performance of the market are specified by the user; each scenario directly influences the performance of one (and only one) sector of the market, although historical correlation information is required for the whole market (so sectors need not be independent).

During our research we investigated the numerical determination of portfolios with non-stochastic constraints combined with non-deterministic inputs, and also the stability of the resulting portfolios.

3.2. The Application of HPC

We have seen that the basic inputs to the model are securities which are grouped into markets – for each market there are a number of scenarios which define the anticipated performance of that particular market. Each market is conceived as consisting of a number of sectors; each sector may be influenced by different constraints. Hence the user can exert fine control over how each market is expected to perform. Individual markets are processed separately. An overall solution is obtained by applying the optimisation to the markets themselves.

Optimal portfolios calculated for each market are influenced by sets of constraints, which may be simple bounds or linear or non-linear constraints. The various types of constraint permit different optimum portfolios to be chosen on the efficient frontier; for example, a portfolio with minimum risk, or one with maximum return, or one with some intermediate choice of risk or return. Because we are performing many Monte Carlo simulations, we generate sets of optimal portfolios and sets of efficient frontiers. This technique has allowed us to predict portfolio behaviour that performs extremely well in most cases.

We have deployed a variant of PCA to enable us to identify the behaviour of the portfolios under all the scenarios generated. In particular, we have been able to use the technique to examine the components of the portfolios and their variation as the market behaviour changes. This has enabled us to identify various categories of portfolio component and to observe the role that they play in determining the behaviour of the portfolio as a whole.

The models under investigation can include differing types of securities, including shares, bonds and options in (potentially geographically) diverse markets. These are defined by their correlation information. Traditionally, a single set of inputs for growth expectations of individual portfolio components is used in a single optimisation solution. We generalise this to allow stochastic inputs to represent the distribution of possible outcomes for the returns generated by individual portfolio components.

3.3. Monte Carlo simulations

A choice of statistical distribution for the simulated scenarios is provided. Each scenario in a simulation run is considered as a generalised Wiener process; the number of discrete time steps of each scenario is specified by the user. The stochastic values calculated for each index scenario are compared with the historic values calculated from the expected rates of return and standard deviations supplied for individual securities (in the relevant sector if related scenarios are provided for a market). The ratios so obtained are used to scale the historic rates of return and standard deviations of individual securities, and optimum portfolios are then determined using the scaled values.

For example, consider a market where the given scenarios are the securities themselves. The (historic) correlation matrix of the market securities is also the correlation matrix for the scenarios. The covariance matrix is formed from the correlation matrix by scaling each row and column by the standard deviation of each security. We calculate a set of scaling factors for the first simulation as follows:

(i) Each Wiener process for a scenario/security price S over time t is of the form

ΔS/S = μΔt + σε√Δt,

where μ is the expected growth rate of the security per unit time, σ is the standard deviation of the security price and ε is a random number from a multivariate normal distribution (since the scenarios are related).
(ii) Having generated all the Wiener processes for the simulation, we can calculate the perturbed values (growth rate and standard deviation) which result for each scenario.
(iii) We then scale the historic correlation matrix by these standard deviations to produce the perturbed covariance matrix for the simulation and solve the appropriate optimisation problems.
(iv) This process is then repeated for each simulation.

A general-purpose optimisation routine is used to determine an optimum portfolio, based upon the SQP (Sequential Quadratic Programming) method [12]. For each market, constraints may be supplied in three forms: simple bounds, linear constraints of the form Bl ≤ Ax ≤ Bu, and smooth non-linear constraints (defined by a vector of constraints or, optionally, its Jacobian). Here A is an m × n matrix of coefficients, x is the vector of proportions of each security, and Bu and Bl represent the upper and lower bounds of the inequalities, respectively.
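To make steps (i)–(iv) concrete, the following minimal sketch generates one simulation's correlated Wiener processes, measures the perturbed growth rates and standard deviations, and rescales the historic correlation matrix into a perturbed covariance matrix. It is written in Python with NumPy purely for illustration – this is not the authors' implementation, and all names are our own assumptions:

```python
import numpy as np

def perturbed_inputs(mu, sigma, corr, rng, n_steps=250, dt=1.0 / 250):
    """One Monte-Carlo simulation of steps (i)-(iii): mu and sigma are
    length-n arrays of expected growth rates and standard deviations,
    corr is the n x n historic correlation matrix."""
    L = np.linalg.cholesky(corr)                      # correlated draws: the scenarios are related
    eps = rng.standard_normal((n_steps, len(mu))) @ L.T
    # (i) dS/S = mu*dt + sigma*eps*sqrt(dt) at each discrete time step
    rel_returns = mu * dt + sigma * eps * np.sqrt(dt)
    # (ii) perturbed growth rate and standard deviation per unit time
    mu_hat = rel_returns.mean(axis=0) / dt
    sigma_hat = rel_returns.std(axis=0, ddof=1) / np.sqrt(dt)
    # (iii) scale correlation rows/columns to get the perturbed covariance
    cov_hat = corr * np.outer(sigma_hat, sigma_hat)
    return mu_hat, cov_hat
```

Step (iv) is then simply a loop over simulations, each call drawing fresh random numbers from rng.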
H.M. Liddell et al. / Parallel computing applications and financial modelling 85

Constraints will often need to be included to produce a balanced portfolio across the market. Without some constraints on the proportions of certain securities, the portfolio can become dominated by a very few securities (especially in the case of maximum return). Constraints of this form can result in discontinuities of the efficient frontier. It is also usual to constrain the sum of the proportions of each security in the portfolio to one, i.e. Σj Xj = 1.

For each simulation run, five types of optimisation can be solved: the optimisations for minimum risk and for maximum return (which represent the end points of the efficient frontier); optimisations of intermediate risk and intermediate return; or other intermediate points on the efficient frontier defined using the parameterised objective function Σi Σj Xi Xj Cij − λ Σj Xj Ej.
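As an illustration of this parameterised problem, the sketch below solves it with SciPy's SLSQP routine – an SQP method of the same family as the routine the paper describes, but a stand-in rather than the authors' code. Here cov is the covariance matrix C, exp_ret the expected returns E, and λ parameterises the frontier:

```python
import numpy as np
from scipy.optimize import minimize

def optimise_portfolio(cov, exp_ret, lam, bounds=None, extra_constraints=()):
    """Minimise x'Cx - lam * E'x subject to sum(x) = 1, plus any bounds
    or additional linear/non-linear constraints supplied by the caller."""
    n = len(exp_ret)
    objective = lambda x: x @ cov @ x - lam * (exp_ret @ x)
    cons = [{"type": "eq", "fun": lambda x: x.sum() - 1.0}, *extra_constraints]
    x0 = np.full(n, 1.0 / n)                      # equal weights: a feasible start
    res = minimize(objective, x0, method="SLSQP",
                   bounds=bounds or [(0.0, 1.0)] * n, constraints=cons)
    return res.x
```

Setting λ = 0 gives the minimum-risk end of the frontier; large λ weights return; intermediate values trace out the other frontier points.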
This Monte Carlo simulation is repeated a specified number of times. Sufficient simulation runs should be performed to ensure the stochastic results have converged; variance reduction techniques are used to improve convergence. A further set of optimisations is then solved using the observed means and standard deviations; analogous results are also calculated using the user-supplied values. Principal Component Analysis is then applied to the recorded portfolios. More details are given in [10].
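Combining the two sketches above gives the per-market loop of Stages 1–4 promised in Section 3.1 (again an illustrative assumption, not the published code): simulate perturbed inputs, optimise each simulation portfolio, average to obtain the 'mean' portfolio, and re-solve with the observed means for the 'efficient' portfolio.

```python
import numpy as np

def run_market(mu, sigma, corr, lam, n_sims=2000, seed=0):
    """Stages 1-4 for one market, reusing perturbed_inputs() and
    optimise_portfolio() from the earlier sketches."""
    rng = np.random.default_rng(seed)
    sims, mu_hats, cov_hats = [], [], []
    for _ in range(n_sims):
        mu_hat, cov_hat = perturbed_inputs(mu, sigma, corr, rng)   # Stage 1
        sims.append(optimise_portfolio(cov_hat, mu_hat, lam))      # Stage 2
        mu_hats.append(mu_hat)
        cov_hats.append(cov_hat)
    mean_portfolio = np.mean(sims, axis=0)                         # Stage 3
    efficient_portfolio = optimise_portfolio(                      # Stage 4
        np.mean(cov_hats, axis=0), np.mean(mu_hats, axis=0), lam)
    return np.array(sims), mean_portfolio, efficient_portfolio
```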
3.4. Parallelisation of the system

There are two ways in which we can take advantage of the parallel aspects of this system.

First, the numerical algorithms used have substantial vector and matrix operations inherent in the linear algebra of the calculations. By utilising computationally efficient BLAS (Basic Linear Algebra Subroutines) in the numerical algorithms, we can ensure advantage is taken of the architecture of the hardware platform used (for example, matrix operations use optimum cache sizes and vector operations use pipelining efficiently). The effect of using such machine-specific BLAS on the system is an approximate three-fold increase in computational speed on Intel PCs.

Second, the Monte-Carlo simulations are independent of one another and can therefore be executed in parallel without synchronisation. Currently available library software is not thread-safe for some of the numerical algorithms we require. A solution is to implement the system using PVM (Parallel Virtual Machine, see Geist et al. [11]), which divides the computation into separate system processes that communicate by message passing. Separate library routine calls cannot then interfere with one another, since they each have their own address space. The danger of the PVM approach is that the communication costs can become significant if message passing is too frequent.

Our application is ideally suited to the PVM approach. A master process controls the generation of the Wiener processes and the perturbed scenario inputs; it then spawns the appropriate number of slave processes. Each slave process is responsible for performing a number of simulations. The master process passes the perturbed scenario inputs and other historic data necessary for the optimisations to each slave process (only one message per slave process is needed for this). It then waits for the slaves to complete their computations and pass back the simulation portfolios (again, only one message per slave is needed for this). The master process can then continue and analyse the results. Thus frequent message passing is avoided and synchronisation is kept to a minimum.
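The same master/slave pattern can be expressed with Python's multiprocessing module as a modern stand-in for PVM (our assumption for illustration; the original system used PVM processes): one message-like task out per slave, one result back.

```python
import numpy as np
from multiprocessing import Pool

def slave(task):
    """Slave side: run one batch of simulations from a single 'message'."""
    seed, mu, sigma, corr, lam, n_sims = task
    sims, _, _ = run_market(mu, sigma, corr, lam, n_sims=n_sims, seed=seed)
    return sims

def master(mu, sigma, corr, lam, n_sims=2000, n_slaves=4):
    """Master side: one task message per slave, one result message back."""
    tasks = [(seed, mu, sigma, corr, lam, n_sims // n_slaves)
             for seed in range(n_slaves)]
    with Pool(n_slaves) as pool:           # spawn the slave processes
        batches = pool.map(slave, tasks)   # blocks until all slaves finish
    return np.vstack(batches)              # simulation portfolios, analysed on the master
```

On platforms that spawn rather than fork processes, the call to master() should sit under an `if __name__ == "__main__":` guard.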
Whilst originally we believed that parallel computing would provide the only way forward for these calculations, recent developments in PC-level architectures and performance have meant that large-scale, high-power PCs can be coupled to offer an effective solution for some classes of problems.

Average computation times for each time-point of the set of tests described below (2000 simulations over 59 securities for one 'point' on the efficient frontier, but excluding the PCA analysis) were reduced from 13.2 minutes for the single-process solution to 9.1 minutes on a twin-processor Intel PC. This is a significant saving, bearing in mind that the Wiener process initialisation time does not depend on the number of points on the efficient frontier; hence multiple solutions on the efficient frontier will show more favourable savings in computational cost.

3.5. Principal component analysis

We want to develop tools for understanding the composition and stability of portfolios under various assumptions about how errors in risk and return are measured.

Principal components are defined as linear combinations of the individual variables appearing in the portfolio. Thus for the securities Xk, the first principal component is of the form Y1 = Σk a1k Xk, with the a1k chosen to maximise the sample variance of Y1 over all of the simulation portfolios, subject to a1ᵀa1 = 1. Further principal components Yj are defined similarly, but are also required to be orthogonal to each of the previous principal components, thus ajᵀai = 0 for i < j [13].

Principal component analysis is essentially used to reduce the dimensionality of a given problem. The principal components are chosen so that the first component has the greatest impact on the analysis and each subsequent component has a decreasing impact. We are aiming to project the original data space onto a lower-dimensional space, whilst not losing too much information. To do this we need to be able to measure the errors we are introducing by selecting the first r (≪ n) principal components.

Applying principal component analysis to the covariance matrix (of the portfolio simulations), we see that the total variance of the original system is the sum of the diagonal elements ω1² + ω2² + ... + ωn², or the trace of the matrix, which we denote by Sn². The variance of the system that is 'explained' by the first r principal components is given by Sr². A traditional measure of the 'goodness-of-fit' of the subspace projection defined by the first r principal components is the proportion Sr²/Sn². This measure turns out not to be well suited to our application, hence we have designed three alternative measures based on Euclidean distance, the error in risk, and the error in return. These measures are defined either as means (ERR1, ERR2 and ERR3) or in least-squares form (ERR4, ERR5 and ERR6) over the simulations [10].
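A generic version of this analysis – a sketch of standard PCA via an eigen-decomposition, not the paper's ERR1–ERR6 code – over the matrix of simulation portfolios:

```python
import numpy as np

def principal_components(sim_portfolios, r):
    """PCA of the simulation portfolios: rows are simulations, columns are
    securities. Returns the first r components and the classical
    goodness-of-fit S_r^2 / S_n^2."""
    centred = sim_portfolios - sim_portfolios.mean(axis=0)
    cov = np.cov(centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]              # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    goodness = eigvals[:r].sum() / eigvals.sum()   # S_r^2 / S_n^2
    return eigvecs[:, :r], goodness
```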
 
4. Financial risk assessment for portfolio management

This project aimed to build an understanding of the stability of portfolios, using non-linear optimisation methods with stochastic inputs to develop appropriate stability criteria in conjunction with a large international bank as "end users". Real market data (March 1997 – March 2001) was used to address the stability of the solutions of these problems to variations in the holdings. This is important where the holder wishes to understand the sensitivity of the portfolio to minor variations in the holdings; these may correspond to proportions of holdings or to variations in possible scenarios. The project also develops further the understanding of real issues in the solution of this class of non-linear optimisation problems on HPC systems in the context of real economic systems.

Portfolio managers would like to choose portfolios containing a mix of securities so that risk is controlled. The future values of securities are uncertain, and these values will change relative to each other depending on future events. A UK portfolio manager might, for example, want to choose a portfolio which will follow the future market independent of the choice of when, if ever, the UK joins the EURO. There are thus a number of scenarios which a manager might postulate, each giving a different value, and variance of that value, for any given security. The task of the manager is to suggest a portfolio which, independent of the actual scenario, will meet some given criterion, e.g. minimum risk, maximum return or some intermediate state.

The above process was tested with equity data based on the FTSE 100 Index on the London Stock Exchange. Even this relatively small data set has brought to light a number of detailed issues relating to the sensitivity which have to be investigated if the above process is to be applied more widely and to much larger markets. The use of Principal Component Analysis can highlight which predictions are important (and which are largely irrelevant) to the portfolio optimisation. We have thus created a numerical approach for the inclusion of predictions of expectations of return in portfolios, along with linear and non-linear constraints.

4.1. Alternative definitions of risk

The traditional definition of risk is Σi Σj Xi Xj Cij, where Xj is the proportion of security j in the portfolio and Cij = ρij σi σj is the covariance between securities i and j. A portfolio manager may well compare performance with some benchmark portfolio (with components Bj), hence it is natural to modify the above definition, so risk is defined as Σi Σj Cij (Xi − Bi)(Xj − Bj). Another use for this type of definition is where the benchmark represents an existing portfolio; the use of this form of definition then gives a bias towards the existing portfolio.
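Both definitions are one-liners; a small illustrative sketch of the benchmark-relative form, which reduces to the traditional definition when the benchmark is zero:

```python
import numpy as np

def benchmark_relative_risk(x, benchmark, cov):
    """sum_ij C_ij (x_i - b_i)(x_j - b_j); with benchmark = 0 this is the
    traditional x' C x risk."""
    d = x - benchmark
    return d @ cov @ d
```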
The need to determine various measures of risk and return arises because of the non-deterministic inputs that our approach allows. The tool that we use to achieve this is again Principal Component Analysis.

5. The analysis of bond portfolios

In order to verify our approach we constructed a series of portfolios. The remaining sections of this paper briefly discuss the results obtained.

Fig. 1. Bond performance with risk relative to benchmark.

Fig. 2. Typical sets of simulation portfolios.

5.1. The portfolio data

Tests are based upon bond index data collected over a period of approximately five years. The data comprises close-of-day prices for bond indexes with five different sets of maturities (from short to long dated) across internationally diverse markets (e.g. USA, UK, Japan, Australasia and Europe), together with deposit rates and currency exchange rates. The period covered by the data runs from 1993 to March 1998. The data was initially pre-processed to remove spurious records from the time series and to interpolate any missing values. Apart from this, little else was done in the way of preparation of the data, and we have no reason to believe that it is other than a representative set of data.

Figure 1 shows the absolute performance of the selected portfolios of bond instruments, but with risk defined relative to a typical benchmark (with maturities weighted towards medium-dated bonds).

The types of optimisation solved are maximum return and three intermediate risks (1/16, 1/8 and 1/4 of maximum risk). Also, an upper limit of 5% is imposed on the proportion of an individual instrument in the selected portfolio. The quarters used are 1 December 1996 to 1 December 1997, with predictions based on historical observation of each preceding quarter or on actual growth over the quarter. Both sets of results show positive performance, but the value of good predictions is clear.

Table 1
Risk and return of the efficient and mean portfolios over three repeated sets of simulation runs

                                 Set 1       Set 2       Set 3
Risk of efficient portfolio      7.90E-04    7.43E-04    6.87E-04
Risk of mean portfolio           7.04E-04    6.97E-04    6.78E-04
Return of efficient portfolio    0.43641    0.42894    0.42299
Return of mean portfolio         0.42797    0.42388    0.42108

Fig. 3. Efficient portfolio (top) and mean portfolio (bottom) over 3 iterations: portfolio values for each of the 37 instruments, with three series (one per iteration) in each plot.

5.2. Sensitivity of the portfolio to the time-period of the historic data

The periods from which the data used in this process is taken are very long. This has necessitated the application of a model to assign significance to the data points used in the calculation of the portfolio: the longer the time series used, the less relevant to more recent portfolios are the data points obtained from the beginning of the series. In order to deal with this problem we have looked at various methods of assigning significance to data points. The one we have chosen is the scheme due to Joubert and Rogers [14], in which a geometric weighting is applied.

It is clear that the choice of weighting scheme must not be such as to substantially reduce the significance of recent major events, or to completely obviate those which may have occurred during the period covered. We have tested this and other models and are satisfied that the model chosen, and the detailed weighting scheme applied, reflect this need.
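A minimal sketch of geometric weighting of the historic series follows; the decay constant here is an assumed illustrative value, whereas the paper follows the Joubert–Rogers scheme [14]:

```python
import numpy as np

def geometric_weights(n_obs, decay=0.98):
    """Geometric significance weights, newest observation last:
    w_t proportional to decay**(T - t), normalised to sum to one."""
    w = decay ** np.arange(n_obs - 1, -1, -1)
    return w / w.sum()

def weighted_moments(returns, decay=0.98):
    """Geometrically weighted mean returns and covariance matrix,
    computed from a (T x n) array of historic returns."""
    w = geometric_weights(len(returns), decay)
    mu = w @ returns
    dev = returns - mu
    cov = (dev * w[:, None]).T @ dev / (1.0 - (w ** 2).sum())
    return mu, cov
```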
Constraints have also been applied to our modelling system. A less risky portfolio can be achieved through diversity; we have therefore chosen to limit bond holdings to a minimum of 0% and a maximum of 20% of the portfolio. Deposit holdings are bounded by ±1 (so "hedging" is allowed), but additional (linear) constraints require that (positive) deposit holdings must be matched by bond holdings, and that the sum of the deposit holdings is zero, to limit currency speculation and to prevent gearing.
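These holding rules can be encoded in the bounds/constraints form accepted by the optimiser sketched in Section 3.3. The reading of "positive deposits matched by bonds" below – total bond holdings at least covering the total positive deposit holdings – is our assumption for illustration:

```python
import numpy as np

def holding_constraints(n_bonds, n_deposits):
    """Bounds and linear constraints for the bond/deposit portfolio:
    bonds in [0, 0.20], deposits in [-1, 1], deposits summing to zero."""
    bounds = [(0.0, 0.20)] * n_bonds + [(-1.0, 1.0)] * n_deposits
    cons = [
        # equality: deposit holdings sum to zero (limits currency speculation)
        {"type": "eq", "fun": lambda x: x[n_bonds:].sum()},
        # inequality (>= 0): bond holdings cover the positive deposit holdings
        {"type": "ineq",
         "fun": lambda x: x[:n_bonds].sum() - np.clip(x[n_bonds:], 0.0, None).sum()},
    ]
    return bounds, cons
```

These can be passed straight to optimise_portfolio via its bounds and extra_constraints arguments.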
5.3. Principal component analysis with predictions from historic data

The dataset used for the principal component analysis is that specified above. There are 59 instruments selected, covering 10 countries (Australia, Germany, Canada, Denmark, Japan, New Zealand, Sweden, Switzerland, UK, USA), with 10 deposit instruments and 49 bond index instruments. Points in time have been chosen at monthly intervals from March 1997 to March 1998. For each time-point, geometric weighting has been applied to the dataset to obtain the historic returns, standard deviations and covariance matrix for the 59 instruments. Significant variation in the behaviour of the instruments can be observed over this period.

The stochastic inputs for these tests have been set to the historic returns and standard deviations (with geometric weighting) calculated at each of the time-points. Figure 2 shows two typical sets of simulation portfolios. Essentially there are three types of instrument present in these portfolios: the 'popular' securities, which are present in most portfolios; the 'unpopular' securities, which are usually absent; and other 'volatile' securities, which are present or absent because of the volatility built into the model by the standard deviations of the stochastic inputs. The first plot shows a limited number of popular and unpopular securities, with a large proportion of mid-height volatile securities; the second plot shows the portfolios concentrated into the more popular securities. We will see that these two plots represent the most volatile and least volatile, respectively, of the time-points we have considered; this will be confirmed by the principal component analysis.

Each simulation produces an efficient portfolio determined by the perturbed input values. These efficient portfolios have a limited number of instruments in each portfolio, but not necessarily the same volatile securities present in each such portfolio. On the other hand, the mean portfolio, constructed by taking means of each instrument over all the simulations, has many more instruments present – the popular securities as well as all of the volatile securities. The definition of the mean portfolio ensures it is feasible (with respect to the bounds and linear constraints), but it is not necessarily efficient. Figure 3 shows three typical simulation portfolios; different portfolios clearly show different securities present, and volatility in the proportions present. The second plot shows the corresponding mean portfolio, which has many more securities present than the corresponding efficient portfolio (the conventional Markowitz optimal portfolio generated by the means of the scenario inputs).

5.4. Refinement of the mean portfolio

Having performed the simulations, the mean rates of return and the mean of the standard deviations observed for the simulated scenarios (the Wiener processes) are calculated. A further optimisation problem is then solved using the observed means; the resulting portfolio is, of course, 'efficient'. We also calculate the mean portfolio from the simulation portfolios; the mean portfolio is not, in general, efficient. The efficient portfolio will be mainly composed of the 'popular' securities, while the mean portfolio will also contain many of the 'volatile' securities.

Principal Component Analysis will have suggested that a number of the original instruments can be safely ignored. The 'unpopular' securities will have zero or very small holdings in the mean portfolio. Clearly, a first step in removing the unwanted securities from the mean portfolio is to remove the unpopular securities with values below some small threshold (uneconomic holdings).
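A sketch of this refinement step follows; the 0.5% threshold is an assumed illustrative value, and the routine reuses the optimiser sketched earlier:

```python
import numpy as np

def refine_mean_portfolio(mean_port, cov, exp_ret, lam, threshold=0.005):
    """Drop uneconomic holdings from the mean portfolio and re-optimise
    over the surviving instruments only."""
    keep = mean_port > threshold                   # prune 'unpopular' securities
    refined = optimise_portfolio(cov[np.ix_(keep, keep)], exp_ret[keep], lam)
    full = np.zeros_like(mean_port)
    full[keep] = refined                           # re-embed in the original universe
    return full
```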
The effect of removing such holdings and adding constraints as above appears to be to bring the iterated mean portfolio closer to the iterated efficient portfolio in risk/return space. Table 1 shows a typical example of this behaviour, obtained by repeating the simulation runs so that three sets of portfolios are available.

Subtle variations in the efficient portfolios occur; analogous subtle variations can also be observed in risk/return space, where the risks and returns of the efficient and mean portfolios can be seen to converge rapidly.

6. Conclusions

We have created a numerical approach for the inclusion of predictions of expectation of return in portfolios, along with linear and non-linear constraints. We believe that we have been able to demonstrate that the method as described is capable of producing reliable portfolios which are stable against perturbation, indicating that the selection is at least close to optimal.

The timing of our tests indicates that, in order to obtain sufficient convergence, it may be necessary to employ networked high-end PC technology, or other more powerful parallel architectures, for the solution of large problems. The technology has demonstrated its capability to include scenarios as inputs, dependent upon expected outcomes, and thus enhances the armoury of the portfolio manager.

Acknowledgements

We would like to acknowledge the invaluable input from Paul Conyers and Dominic Kini of ABN Amro and from Bernard Fromson, formerly of ABN Amro. We also thank our former colleagues at the London Parallel Applications Centre, Queen Mary College and Kings College. We would also like to thank NAg Ltd. for their assistance, in particular David Sayers. This work has been supported by the UK Engineering and Physical Sciences Research Council under grant number GR/M84282.

References

[1] H.M. Liddell and D. Parkinson, The measurement of performance on a highly parallel processor, Trans. on Computers C-32 (1982).
[2] H.M. Liddell and G.S.J. Bowgen, The DAP Subroutine Library, Computer Physics Communications 26 (1982), 311.
[3] H.M. Liddell and C.H. Lai, Finite elements using long vectors of the DAP, Parallel Computing 8 (1988), 351–361.
[4] Organic Low Dimensional Structures (OLDS), ESPRIT BR 3200, 30 months from 1/7/8 (collaborative with the Universities of Edinburgh and Genoa, the Max-Planck-Institute, Tubingen, and Ernst-Leitz-Wetzlar GmbH).
[5] Multidimensional and Multimodal medical imaging and display, collaborative IED project SERC GR/F34909, 1989–1991.
[6] H.M. Liddell, D. Parkinson, P. Undrill et al., Integrated presentation of 3D imagery and anatomical atlases, using a parallel processing system, SPIE Vol. 1653, Image Capture, Formatting and Display, 1992.
[7] H.M. Markowitz, Portfolio Selection, Journal of Finance 7(1) (1952), 77–91.
[8] S.A. Zenios, Financial Optimization, Cambridge University Press, 1993.
[9] G.S. Hodgson, P. Dzwig, H.M. Liddell and D. Parkinson, Application of HPC to Medium-size Stochastic Systems with Non-linear Constraints in Finance, Lecture Notes in Computer Science 1401 (1998), 411–418.
[10] G.S. Hodgson, H.M. Liddell, P. Dzwig and D. Parkinson, Medium-size Stochastic Systems and Principal Component Analysis in Portfolio Optimisation, paper presented to the European Working Group on Financial Modelling Meeting, New York, November 2000.
[11] A. Geist et al., PVM: Parallel Virtual Machine – A Users' Guide and Tutorial for Networked Parallel Computing, MIT Press, 1994.
[12] P. Gill, W. Murray and M. Wright, Practical Optimization, Academic Press, 1981.
[13] W.J. Krzanowski, Principles of Multivariate Analysis, Oxford University Press, 1990.
[14] A.W. Joubert and L.C.G. Rogers, Ticking over in real time, in: Quantitative and Computational Finance, UNICOM, 1994, p. 144.