Multidimensional Poverty Analysis For Odisha by ANN
Arup May 27, 2023 1 / 29
Contents
1 Introduction
2 Preliminaries of ANN
4 Case study
5 Concluding Remarks
6 References
Poverty is the worst form of
violence.
Mahatma Gandhi
Introduction
One in every ten persons in this world lives below the international poverty line. That is around 689 million individuals surviving on less than $1.90 per day, while about 2 billion (26.2 per cent of the total population) live on less than $3.20 per day.
From 2015 to 2018, global poverty maintained its historic drop, with the global
poverty rate declining from 10.1 percent in 2015 to 8.6 percent in 2018.
But, due to COVID-19, the global poverty rate rose abruptly from 8.3 percent in 2019 to 9.2 percent in 2020, marking the first increase in extreme poverty since 1998 and the largest since 1990, setting back poverty reduction by approximately three years.
It is also worth mentioning that various machine learning methods exist for handling multidimensional data; among them, the Artificial Neural Network (ANN) is a versatile method for building models that can predict complex poverty patterns from given data.
Introduction (cont…)
[Erenstein et al., 2010] used natural, physical, human, social, and financial livelihood assets as poverty indicators in the Indo-Gangetic Plains of India, where they applied PCA and linear regression to predict poverty at the district level.
[Pareek and Prema, 2012] proposed a methodology using MLNN to classify
poverty into BPL and non-BPL categories.
[Azcarraga and Setiono, 2018] used MLNN to classify households as poor or non-poor and identified characteristics that cause poverty.
[Alsharkawi et al., 2021] did a case study in Jordan to classify poor households
using sixteen different machine learning techniques.
[Liu et al., 2021] introduced the novel DBSCAN clustering algorithm via the
edge computing-based deep neural network model for targeted poverty allevi-
ation.
Artificial Neural Network (ANN)
Multi-layer Neural Network (MLNN)
Multi-layer Neural Network (MLNN) (cont…)
where x is the input data with n features and m entries, i.e., an n × m matrix; W(l) is the weight matrix of the l-th layer; b(l) is the bias vector of the l-th layer; z(l) is the weighted sum of the l-th layer; a(l) is the activation of the l-th layer; and g is the activation function.
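As an illustration, the forward pass defined above can be sketched in NumPy; the layer sizes, the SELU choice for g, and the random data below are assumptions made for the sketch, not values taken from this study:

```python
import numpy as np

def selu(z):
    # SELU activation (the activation used later in the case study);
    # constants from Klambauer et al. (2017)
    alpha, scale = 1.6732632423543772, 1.0507009873554805
    return scale * np.where(z > 0, z, alpha * (np.exp(z) - 1))

def forward(x, weights, biases, g=selu):
    """Forward propagation: a(0) = x; z(l) = W(l) a(l-1) + b(l); a(l) = g(z(l))."""
    a = x  # n x m input: n features, m entries (samples)
    for W, b in zip(weights, biases):
        z = W @ a + b   # weighted sum of layer l
        a = g(z)        # activation of layer l
    return a

# Hypothetical shapes: 12 inputs, one hidden layer of 8 units, 1 output
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 12)), rng.standard_normal((1, 8))]
biases = [np.zeros((8, 1)), np.zeros((1, 1))]
y = forward(rng.standard_normal((12, 5)), weights, biases)
print(y.shape)  # one prediction per sample
```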
Backward propagation: [Rumelhart et al., 1986]
Multi-layer Neural Network (MLNN) (cont…)
where J(θ) is the loss function measuring the error between the predicted and actual output, θ represents the parameters of the neural network (i.e., the weight matrices and bias vectors), ⊙ denotes element-wise multiplication, and α is the learning rate.
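A minimal sketch of one gradient-descent step θ ← θ − α ∇θ J(θ), shown here for a single linear layer with MSE loss (the shapes, data, and learning rate are illustrative assumptions):

```python
import numpy as np

# One gradient-descent step for a single linear layer with MSE loss J(theta).
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 10))      # 3 features, 10 samples
y_true = rng.standard_normal((1, 10))
W = rng.standard_normal((1, 3))       # parameters theta = (W, b)
b = np.zeros((1, 1))
alpha = 0.1                           # learning rate

y_pred = W @ X + b
J = np.mean((y_pred - y_true) ** 2)   # loss before the step

# Gradients of J w.r.t. W and b (chain rule, as in backpropagation)
dy = 2 * (y_pred - y_true) / X.shape[1]
dW = dy @ X.T
db = dy.sum(axis=1, keepdims=True)

W -= alpha * dW                       # theta <- theta - alpha * grad J
b -= alpha * db
J_new = np.mean((W @ X + b - y_true) ** 2)
print(J_new < J)  # the step reduces the loss
```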
Functional Link Neural Network (FLNN)
The input layer has n nodes for the input variables x1, x2, ..., xn; the hidden layer has m nodes zj, j = 1, 2, ..., m; and y is the output of the FLNN.
wj,i is the weight connecting the i-th input node to the j-th hidden node, bj is the bias term of the j-th node, f(i) is the non-linear function applied to the i-th input, and g is the activation function.
Functional Link Neural Network (FLNN) (cont…)
Backward propagation:
E = (1/2)(d − y)²    (8)

Here, d is the desired output, and E is the MSE loss between the predicted output y and the desired output d.

∂E/∂w_{1,j} = (∂E/∂y)(∂y/∂w_{1,j}) = (y − d) g′(∑_{j=1}^{m} w_{1,j} z_j) z_j

∂E/∂b_j = (∂E/∂y)(∂y/∂z_j)(∂z_j/∂b_j) = (y − d) g′(∑_{j=1}^{m} w_{1,j} z_j) w_{1,j} f′(i)    (9)

∂E/∂w_{j,i} = (∂E/∂y)(∂y/∂z_j)(∂z_j/∂w_{j,i}) = (y − d) g′(∑_{j=1}^{m} w_{1,j} z_j) w_{1,j} f′(i) x_i

w(t + 1) = w(t) − η ∂E/∂w(t)    (10)
b(t + 1) = b(t) − η ∂E/∂b(t)
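These update rules can be sketched in NumPy. The exact FLNN wiring, z_j = f(∑_i w_{j,i} x_i + b_j) and y = g(∑_j w_{1,j} z_j), and the tanh choice for both f and g are assumptions made to match the gradients above, not details confirmed by this study:

```python
import numpy as np

# Gradient-descent training per Eqs. (8)-(10). Assumed wiring:
# z_j = f(sum_i w_ji x_i + b_j), y = g(sum_j w_1j z_j), with f = g = tanh.
f = g = np.tanh
fp = gp = lambda u: 1.0 - np.tanh(u) ** 2   # derivatives f', g'

rng = np.random.default_rng(2)
n, m = 4, 6                       # n input nodes, m hidden nodes (hypothetical)
x = rng.standard_normal(n)
W = rng.standard_normal((m, n))   # w_ji: i-th input -> j-th hidden node
b = np.zeros(m)                   # b_j
w1 = rng.standard_normal(m)       # w_1j: j-th hidden node -> output
d, eta = 0.7, 0.1                 # desired output, learning rate

for _ in range(1000):
    s = W @ x + b                 # hidden pre-activations
    z = f(s)                      # z_j
    u = w1 @ z
    y = g(u)                      # network output, E = 0.5*(d - y)^2  (Eq. 8)
    # Eq. (9): chain-rule gradients
    dw1 = (y - d) * gp(u) * z               # dE/dw_1j
    db = (y - d) * gp(u) * w1 * fp(s)       # dE/db_j
    dW = np.outer(db, x)                    # dE/dw_ji = (dE/db_j) * x_i
    # Eq. (10): update rules
    w1 -= eta * dw1
    b -= eta * db
    W -= eta * dW

err = abs(d - g(w1 @ f(W @ x + b)))
print(round(err, 4))  # error after training
```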
Functional Link Neural Network (FLNN) (cont…)
Challenges in Applying the Model
Data quality: One challenge in using ML for poverty analysis is ensuring that the data used to train and evaluate the models are of high quality.
Interpretability: Another challenge in the use of ML for poverty analysis is that some machine learning algorithms are difficult to interpret.
Privacy and ethics: ML models for poverty analysis may also raise ethical and privacy concerns.
Uncertainty: The data themselves may carry uncertainty, so that the values or measurements in a dataset may deviate from their true values.
Case study
Comparative study of Chebyshev and Legendre Polynomial-based NN Models for
Approximating Multidimensional Poverty for Odisha
Method
We take an FLNN model with 12 input nodes and one functional-expansion hidden layer of 24 neurons. For the functional expansion of the FLNNs, the second-order Chebyshev polynomials (Eq. 11) for ChNN and the second-order Legendre polynomials (Eq. 12) for LeNN are taken, so that the twelve input parameters expand to twenty-four parameters.
C0(x) = 1,  C1(x) = x,  C2(x) = 2x² − 1    (11)

L0(x) = 1,  L1(x) = x,  L2(x) = (1/2)(3x² − 1)    (12)
The activation function used in the nodes between layers is SELU.
MSE with an L2 regularizer as a penalty to avoid over-fitting is used for error computation in the MLNN; no regularizer is used for the FLNNs.
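The 12 → 24 functional expansion can be sketched as follows; dropping the constant C0 = L0 = 1 term is an assumption made here to match the stated 12-to-24 parameter count:

```python
import numpy as np

# Expand each of the 12 inputs through the second-order Chebyshev (Eq. 11)
# or Legendre (Eq. 12) polynomials, yielding 24 expanded features.
def chebyshev_expand(x):
    return np.concatenate([x, 2 * x**2 - 1])          # C1(x), C2(x)

def legendre_expand(x):
    return np.concatenate([x, 0.5 * (3 * x**2 - 1)])  # L1(x), L2(x)

# One record of 12 normalised indicator values (hypothetical data)
x = np.linspace(-1, 1, 12)
print(chebyshev_expand(x).shape)  # 24 expanded parameters
print(legendre_expand(x).shape)   # 24 expanded parameters
```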
Case study (cont…)
Indicators
Nutrition
Child-Adolescent Mortality
Maternal Health
Years of Schooling
School Attendance
Cooking Fuel
Sanitation
Drinking Water
Electricity
Housing
Assets
Bank Account
Case study (cont…)
Results
It is found that ChNN and LeNN are computationally faster and better approximators than MLNN.
ChNN took 47.49 seconds, followed by LeNN at 48.13 seconds and MLNN at 48.41 seconds on a CPU (Intel(R) Core(TM) i5-10505 @ 3.20 GHz, 8.00 GB RAM).
Case study (cont…)
ChNN obtained a training accuracy of 98.29% and a validation accuracy of 83.25%, followed by LeNN with a training accuracy of 97.01% and a validation accuracy of 80.79%, and MLNN with a training accuracy of 96.75% and a validation accuracy of 78.47%.
The MSE of ChNN is found to be 0.3145, followed by 0.3667 for MLNN and 0.3743 for LeNN. The outputs of all three models, together with the original output, are given in Table 2.
Concluding Remarks
Unlike the MPI [Aayog, 2021], the development of the CMPI does not depend on any pre-assigned weights: it is purely data-driven and uses an unsupervised technique to determine the multidimensional poverty status of the districts of the state.
This work can be extended to the all-India level and can be adopted by any region in the world, since DHS survey data are available for almost all developing countries.
The above methodology can also be applied to various other types of multidimensional data, for example, rainfall data, farming data, etc.
References
[Alsharkawi et al., 2021] Alsharkawi, A., Al-Fetyani, M., Dawas, M., Saadeh, H., and Alyaman, M. (2021). Poverty classification using machine learning: The case of Jordan. Sustainability, 13(3):1412.
References (cont…)
[Erenstein et al., 2010] Erenstein, O., Hellin, J., and Chandna, P. (2010). Poverty mapping based on livelihood assets: A meso-level application in the Indo-Gangetic Plains, India. Applied Geography, 30(1):112–125.
[Glorot et al., 2011] Glorot, X., Bordes, A., and Bengio, Y. (2011). Deep sparse rectifier neural networks. In Gordon, G., Dunson, D., and Dudík, M., editors, Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, volume 15 of Proceedings of Machine Learning Research, pages 315–323, Fort Lauderdale, FL, USA. PMLR.
[Liu et al., 2021] Liu, H., Liu, Y., Qin, Z., Zhang, R., Zhang, Z., and Mu, L. (2021). A novel DBSCAN clustering algorithm via edge computing-based deep neural network model for targeted poverty alleviation big data. Wireless Communications and Mobile Computing, 2021:1–10.
[Pao and Takefuji, 1992] Pao, Y.-H. and Takefuji, Y. (1992). Functional-link net computing: theory, system architecture, and functionalities. Computer, 25(5):76–79.
References (cont…)
[Rumelhart et al., 1986] Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088):533–536.
Publication from this study
[1] Kumar, S., Sahoo, A. K., and Chakraverty, S. (2022). Comparative study of Chebyshev and Legendre polynomial-based neural models for approximating multidimensional poverty for an Indian State.
Acknowledgement
Books Related to ANN
Thank You