
Research Proposal

1. Area(s) of Interest:
Bayesian Machine Learning:
Exploring Bayesian optimization methods in areas such as hyperparameter tuning and neural
network architecture search. Alternatively, applying Bayesian optimization to practical deep
learning tasks, investigating novel neural network structures that fuse deep learning and
Bayesian methods, and exploring applications in transfer learning and reinforcement learning.
References:
Snoek, J., Rippel, O., Swersky, K., et al. (2015). "Scalable Bayesian Optimization Using Deep Neural Networks." [arXiv:1502.05700](https://arxiv.org/abs/1502.05700)
Snoek, J., Larochelle, H., Adams, R. P. (2012). "Practical Bayesian Optimization of Machine Learning Algorithms." In Advances in Neural Information Processing Systems 25 (NeurIPS 2012). [arXiv:1206.2944](https://arxiv.org/abs/1206.2944)
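As a toy illustration of the hyperparameter-tuning setting described above, the sketch below runs a minimal Bayesian-optimization loop in pure Python. It is a sketch only: a kernel-regression surrogate with a crude uncertainty proxy stands in for a full Gaussian process, and the objective function, bandwidth, and UCB coefficient are illustrative assumptions, not taken from the cited papers.

```python
import math
import random

def toy_objective(x):
    # Hypothetical 1-D validation-score surface to maximize (illustrative only).
    return math.exp(-(x - 0.7) ** 2 / 0.05)

def surrogate(x, xs, ys, bandwidth=0.2):
    """Kernel-regression stand-in for a GP: a predictive mean plus a crude
    uncertainty proxy that shrinks near already-observed points."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * bandwidth ** 2)) for xi in xs]
    total = sum(weights)
    mean = sum(w * y for w, y in zip(weights, ys)) / total
    uncertainty = 1.0 / (1.0 + total)
    return mean, uncertainty

def bayes_opt(n_init=3, n_iter=15, kappa=2.0, seed=0):
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n_init)]   # random initial design
    ys = [toy_objective(x) for x in xs]
    candidates = [i / 200.0 for i in range(201)]  # candidate grid on [0, 1]
    for _ in range(n_iter):
        # Upper-confidence-bound acquisition: trade off predicted mean
        # against surrogate uncertainty.
        def ucb(x):
            mean, unc = surrogate(x, xs, ys)
            return mean + kappa * unc
        x_next = max(candidates, key=ucb)
        xs.append(x_next)
        ys.append(toy_objective(x_next))
    best_y, best_x = max(zip(ys, xs))
    return best_x, best_y
```

The same loop structure carries over to real hyperparameter tuning: replace `toy_objective` with a model-training run and the surrogate with a proper Gaussian process.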

Markov Chain Monte Carlo (MCMC):
Applying MCMC to machine learning, or designing MCMC algorithms suitable for large-scale Bayesian models to address the challenges posed by large datasets and increasing model complexity.
References:
Andrieu, C., Doucet, A., Holenstein, R. (2010). "Particle Markov Chain Monte Carlo Methods." *Journal of the Royal Statistical Society: Series B (Statistical Methodology)*. [DOI:10.1111/j.1467-9868.2009.00765.x](https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-9868.2009.00765.x)
Green, J. H. W., Osborne, M. A. (2015). "Scaling Metropolis-Hastings for Bayesian Neural Networks with Data Subsampling." [arXiv:1411.0518](https://arxiv.org/abs/1411.0518)
Gershman, S. J., Blei, D. M. (2012). "Scalable Bayesian Inference for the Inverse Ising Model." [arXiv:1203.3492](https://arxiv.org/abs/1203.3492)
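To make the MCMC direction concrete, here is a minimal Metropolis-Hastings sampler in pure Python for a toy 1-D posterior (a standard-normal prior combined with a single Gaussian observation). The target density, proposal step size, and burn-in are illustrative choices, not drawn from the cited papers; the conjugate posterior here is N(0.8, 0.2), which provides a sanity check on the samples.

```python
import math
import random

def log_posterior(theta):
    # Unnormalized log-density: N(0, 1) prior times a Gaussian likelihood
    # for one hypothetical observation y = 1.0 with variance 0.25.
    return -0.5 * theta ** 2 - 0.5 * (theta - 1.0) ** 2 / 0.25

def metropolis_hastings(n_samples=20000, step=0.5, seed=0):
    rng = random.Random(seed)
    theta = 0.0
    samples = []
    for _ in range(n_samples):
        # Symmetric random-walk proposal.
        proposal = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(theta)),
        # done in log space (max(...) guards against log(0)).
        if math.log(max(rng.random(), 1e-300)) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples
```

Because only the unnormalized log-posterior is needed, the same sampler applies whenever the normalizing constant is intractable; scaling it to large datasets (e.g., by subsampling the likelihood) is exactly the research direction sketched above.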

2. Reasons for Interest:


During my master's studies in machine learning, I realized that I am more interested in the theoretical aspects and improvement of models than in coding itself. Because Bayesian methods are employed throughout machine learning, my interest grew toward optimizing learning models using Bayesian approaches. While reading the relevant literature, I encountered MCMC methods, which are widely applicable in probabilistic modeling, statistics, machine learning, and other domains. MCMC is a fascinating and valuable tool for working with probability distributions, conducting Bayesian inference, and training complex models.

3. Applications in Mind:
I am particularly interested in applying machine learning to environmental studies and finance. However, I am open to exploring applications in other domains as well.
