Stock Trend Prediction With Neural Network Techniques
Seminar Presentation
Mohd Haris Lye Abdullah
haris.lye@mmu.edu.my
Supervisor
Professor Dr Y. P. Singh
y.p.singh@mmu.edu.my
Outline
1. Introduction / Research Objectives
2. Stock Trend Prediction
3. Neural Networks
4. Support Vector Machine
5. Feature Selection
6. Experiments and Results
7. Conclusion
Objectives
a) Evaluate the performance of neural network techniques on the task of stock trend prediction. The Multilayer Perceptron (MLP), the Radial Basis Function (RBF) network and the Support Vector Machine (SVM) are evaluated.
[Diagram: daily historical data are converted into technical analysis indicators and fed to a classifier, which outputs whether the target increment is achievable (Yes / No).]
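The indicator-construction step can be sketched as follows; the particular indicators (a simple moving average and a momentum value) and the window lengths are illustrative assumptions, not the paper's actual feature set.

```python
# Sketch: converting daily closing prices into simple technical-analysis
# indicators. Indicator choices and window sizes are illustrative.

def sma(prices, window):
    """Simple moving average; None where the window is incomplete."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def momentum(prices, lag):
    """Price change over `lag` days; None where no lagged price exists."""
    return [None if i < lag else prices[i] - prices[i - lag]
            for i in range(len(prices))]

closes = [10.0, 10.2, 10.1, 10.5, 10.4, 10.8]
# Each day's feature vector pairs the two indicators.
features = list(zip(sma(closes, 3), momentum(closes, 3)))
```

Each row of `features` would then form one input pattern for the classifier.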
Classification vs Forecasting
Forecasting: predict the actual future value.
Classification: assign a pattern to one of several class categories.
Input to Classifier
Classification
The prediction of stock trend is formulated as a two-class classification problem with target labels:
yi = +1 if the target price increment is achievable within the prediction period
yi = -1 otherwise
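The ±1 labelling can be sketched as below; the prediction horizon and the return threshold are illustrative assumptions, not the values used in the paper.

```python
# Sketch: assigning two-class trend labels yi in {+1, -1}. The horizon
# and threshold are illustrative; the slides only state the +/-1 encoding.

def label(prices, i, horizon, threshold):
    """+1 if the best return within `horizon` days meets `threshold`,
    -1 otherwise, None if no future data is available."""
    future = prices[i + 1:i + 1 + horizon]
    if not future:
        return None  # cannot label the last day(s)
    best = max((p - prices[i]) / prices[i] for p in future)
    return 1 if best >= threshold else -1

prices = [100.0, 101.0, 105.0, 99.0, 98.0]
y0 = label(prices, 0, horizon=3, threshold=0.03)  # 5% best return -> +1
y2 = label(prices, 2, horizon=3, threshold=0.03)  # only losses -> -1
```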
Prediction Formulation
Regression
In the regression approach, the target output is a scalar value yr representing the predicted maximum excess return within the prediction period (a fixed number of days ahead).
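The regression target described above can be sketched as follows; excess return is computed relative to the current price here, as the slide does not show the exact benchmark used in the paper.

```python
# Sketch: regression target yr = maximum excess return within the next
# `horizon` days. The benchmark (the current price) is an assumption.

def max_excess_return(prices, i, horizon):
    """Largest relative gain over prices[i] within the horizon."""
    future = prices[i + 1:i + 1 + horizon]
    return max((p - prices[i]) / prices[i] for p in future)

prices = [100.0, 102.0, 107.0, 103.0]
yr = max_excess_return(prices, 0, horizon=3)  # (107 - 100) / 100 = 0.07
```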
Neural Network
According to Haykin, S. (1994), Neural Networks: A
Comprehensive Foundation, NY: Macmillan, p. 2:
A neural network is a massively parallel distributed
processor that has a natural propensity for storing
experiential knowledge and making it available for use.
Knowledge is acquired by the network through a learning process, either supervised or unsupervised. This work uses supervised learning, where each training pattern and its target pattern are presented to the neural network during the learning process.
Neural Network
Advantages of Neural Networks
The advantages of neural networks are due to their adaptive and generalization abilities.
Multilayer Perceptron (MLP)
MLP Structure
[Diagram: MLP network structure with input layer, hidden layer and output y.]
Training MLP Network
The multilayer perceptron (MLP) network uses the backpropagation learning algorithm to obtain the weights of the network.
The simple backpropagation algorithm uses the steepest gradient descent method to update the weights.
The objective of training is to minimise the training mean square error E_mse over all the training patterns.
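A minimal numpy sketch of backpropagation with steepest gradient descent minimising the mean square error; the layer sizes, data, and learning rate are illustrative, not the network configuration used in the paper.

```python
# Sketch: one-hidden-layer MLP trained by backpropagation (steepest
# gradient descent on the mean square error). All sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))              # 8 training patterns, 4 features
y = rng.normal(size=(8, 1))              # target patterns
W1 = rng.normal(scale=0.1, size=(4, 5))  # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(5, 1))  # hidden -> output weights
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1)                  # hidden layer activations
    return h, h @ W2                     # linear output unit

_, out = forward(X)
mse_start = float(np.mean((out - y) ** 2))

for _ in range(200):
    h, out = forward(X)
    err = out - y
    # Backpropagate the error and take a steepest-descent step.
    grad_W2 = h.T @ err / len(X)
    grad_h = (err @ W2.T) * (1 - h ** 2)  # tanh'(a) = 1 - tanh(a)^2
    grad_W1 = X.T @ grad_h / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

_, out = forward(X)
mse_end = float(np.mean((out - y) ** 2))  # should be below mse_start
```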
Support Vector Machine
[Diagram: two classes (Class 1, Class 2) separated by a maximum-margin hyperplane with slack variables.]
The soft-margin SVM classifier is obtained by minimising

  (1/2) ||w||2 + C Σ(i=1..m) ξi

subject to yi (w·xi + b) ≥ 1 − ξi and ξi ≥ 0. Penalising slack on the two classes separately gives the asymmetric objective

  (1/2) ||w||2 + C+ Σ(i: yi = +1) ξi + C− Σ(i: yi = −1) ξi
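The asymmetric penalties C+ and C− can be realised in practice via per-class weights; the sketch below uses scikit-learn's `SVC` with `class_weight`, which multiplies C per class. The library choice and the toy data are illustrative assumptions, not the paper's implementation.

```python
# Sketch: soft-margin SVM with asymmetric class penalties C+ and C-.
# class_weight scales C per class: here C- = C * 1.0 and C+ = C * 2.0.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Two well-separated 2-D clusters, labelled -1 and +1.
X = np.vstack([rng.normal(-1.0, 0.5, size=(20, 2)),
               rng.normal(+1.0, 0.5, size=(20, 2))])
y = np.array([-1] * 20 + [+1] * 20)

clf = SVC(kernel="rbf", C=1.0, class_weight={-1: 1.0, +1: 2.0})
clf.fit(X, y)
train_acc = clf.score(X, y)  # near-perfect on separable toy data
```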
b) Regressor
Parameter ε for the ε-insensitive loss function
Regularisation constant C
Kernel parameter
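The three regressor hyperparameters listed above map directly onto scikit-learn's `SVR` (`epsilon`, `C`, and the RBF kernel parameter `gamma`); the values and the toy regression task below are illustrative, not the settings tuned in the paper.

```python
# Sketch: support vector regression with the epsilon-insensitive loss.
# C, epsilon, and gamma are illustrative hyperparameter values.
import numpy as np
from sklearn.svm import SVR

X = np.linspace(0.0, 1.0, 40).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()        # smooth toy target

model = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=10.0)
model.fit(X, y)
mae = float(np.mean(np.abs(model.predict(X) - y)))  # small fit error
```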
Feature Selection
Feature selection is a process whereby a subset of the potential predictor variables is selected, based on a relevance criterion, in order to reduce the input dimension.
Steps 1, 2 and 3 are repeated until a stopping criterion is met, such as when the required number of features has been included or the minimum accepted prediction accuracy is achieved.
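The repeated select-and-evaluate loop can be sketched as a generic greedy forward selection; the scoring function below stands in for the induction algorithm's prediction accuracy, and the stopping criteria mirror the ones named above (feature-count limit, no further improvement).

```python
# Sketch: greedy forward feature selection with two stopping criteria.
# `score` is a stand-in for the induction algorithm's accuracy.

def forward_select(features, score, max_features):
    selected, best_score = [], float("-inf")
    while len(selected) < max_features:       # criterion 1: count limit
        candidates = [f for f in features if f not in selected]
        if not candidates:
            break
        # Try each remaining feature and keep the best-scoring one.
        trial = max(candidates, key=lambda f: score(selected + [f]))
        new_score = score(selected + [trial])
        if new_score <= best_score:           # criterion 2: no improvement
            break
        selected.append(trial)
        best_score = new_score
    return selected

# Toy relevance scores (illustrative): "c" adds nothing, so it is dropped.
relevance = {"a": 0.6, "b": 0.3, "c": 0.0}
score = lambda subset: sum(relevance[f] for f in subset)
chosen = forward_select(["a", "b", "c"], score, max_features=3)
```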
Feature Selection
General Approach for Feature Selection
a) Wrapper approach
The wrapper approach makes use of the induction algorithm
to evaluate the relevance of the features.
The relevance measure is based on solving the related problem, usually the prediction accuracy of the induction algorithm when the features are used.
b) Filter approach
The filter method selects the feature subset independently of the induction algorithm. Feature correlation is usually used.
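A filter based on feature correlation can be sketched as ranking features by their absolute Pearson correlation with the target; the data and the cutoff below are illustrative assumptions.

```python
# Sketch: filter-style feature selection by Pearson correlation with
# the target, independent of any induction algorithm.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

target = [1.0, 2.0, 3.0, 4.0]
features = {
    "momentum": [1.1, 2.0, 2.9, 4.2],   # strongly correlated with target
    "noise":    [3.0, -1.0, 2.0, 0.5],  # weakly correlated
}
# Keep features whose |correlation| exceeds an illustrative 0.5 cutoff.
kept = [name for name, vals in features.items()
        if abs(pearson(vals, target)) > 0.5]
```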
Feature Selection
Feature Subset Selection
[Diagram: train / test split used to evaluate the selected feature subset.]