Dynamic Bayesian Networks



Introduction

In time-series modelling, we observe certain variables at different points in time.

Consider the problem of treating a diabetic patient. We have evidence of
recent insulin doses, food intake, blood-sugar measurements, and other
physical signs. The task is to assess the current state of the patient given
this information. The dynamic aspects of the problem are essential for
predicting treatments or states from the history of evidence.

Introduction

The process of change can be viewed as a series of snapshots, each
describing the state of the world at a particular time. Each time slice, or
snapshot, contains a set of random variables, some observable and
some not.

Introduction

Sequential data comes in two main forms: temporal (time-series) data,
which is generated sequentially by some causal process, and sequence
data (e.g., bio-sequences or natural language), where we are more
agnostic about the generating mechanism. (Kevin Murphy)

Introduction

For modelling time-series data, it is natural to use directed graphical
models, which can capture the fact that time flows forward. Arcs within a
time-slice can be directed or undirected, since they model instantaneous
correlation. If all arcs are directed, both within and between slices, the
model is called a dynamic Bayesian network (DBN). (The term dynamic
means we are modelling a dynamic system, and does not mean the graph
structure changes over time.) DBNs are quite popular because they are
easy to interpret and learn: because the graph is directed, the conditional
probability distribution (CPD) of each node can be estimated
independently. (Kevin Murphy)

DBN
Dynamic Bayesian Networks (DBNs) are directed graphical models of
stochastic processes. They generalise hidden Markov models (HMMs) and
linear dynamical systems (LDSs) by representing the hidden (and observed)
state in terms of state variables (which can be discrete or continuous) that
can have complex interdependencies. The graphical structure provides an easy
way to specify these conditional independencies, and hence to provide a
compact parameterization of the model. (Kevin Murphy)
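This node-by-node parameterization can be sketched concretely. Below is a minimal two-slice specification for the umbrella example from Russell and Norvig (hidden variable Rain, evidence variable Umbrella); the probabilities are the book's illustrative values, and the array layout is an assumption of this sketch, assuming NumPy:

```python
import numpy as np

# Minimal 2-slice DBN (umbrella world): hidden state Rain_t,
# evidence Umbrella_t. Each node's CPD is specified independently.

# Prior: p(Rain_0) = [p(rain), p(no rain)]
prior = np.array([0.5, 0.5])

# Transition CPD: p(Rain_t | Rain_{t-1}); rows index Rain_{t-1}
T = np.array([[0.7, 0.3],
              [0.3, 0.7]])

# Sensor CPD: p(Umbrella_t | Rain_t); rows index Rain_t
O = np.array([[0.9, 0.1],
              [0.2, 0.8]])
```

Each CPD is a small table whose rows sum to one, which is what makes the overall joint distribution compact: the full model over T slices is defined by just these three arrays.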

DBN Representation
Changes are caused by a stationary process. First-order Markov assumption: p(Xt|X0:t-1) = p(Xt|Xt-1).

Inteligencia artificial: Un enfoque moderno, Stuart Russell and Peter Norvig


Inference in temporal models

Filtering or monitoring: p(Xt|e1:t)

Prediction: p(Xt+k|e1:t), for some k > 0

Smoothing or hindsight: p(Xk|e1:t), for 0 <= k < t

Most likely explanation: arg max over x1:t of p(x1:t|e1:t)
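As a concrete sketch of the first of these tasks, the filtering recursion for a discrete-state model alternates a prediction step (push the belief through the transition model) with an update step (weight by the evidence likelihood and normalize). The matrices below are the illustrative umbrella-world values, assuming NumPy:

```python
import numpy as np

# Forward (filtering) recursion: p(X_t | e_1:t) via predict-then-update.
prior = np.array([0.5, 0.5])            # p(X_0)
T = np.array([[0.7, 0.3], [0.3, 0.7]])  # p(X_t | X_{t-1}); rows = X_{t-1}
O = np.array([[0.9, 0.1], [0.2, 0.8]])  # p(e_t | X_t); rows = X_t

def filter_step(belief, evidence):
    """One filtering step: predict with T, then reweight by the evidence."""
    predicted = T.T @ belief            # sum_x p(X_t | x) p(x | e_1:t-1)
    updated = O[:, evidence] * predicted
    return updated / updated.sum()      # normalize

belief = prior
for e in [0, 0, 1]:                     # observed evidence indices
    belief = filter_step(belief, e)
```

The same loop structure works for any discrete DBN slice once the transition and sensor CPDs are flattened into matrices.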

Filtering or Monitoring

Filtering or monitoring: p(Xt|e1:t)

Prediction

Prediction: p(Xt+k|e1:t)
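Prediction can be sketched by pushing the current filtered belief through the transition model k times with no new evidence. The numbers below are again illustrative (a symmetric two-state transition matrix), assuming NumPy:

```python
import numpy as np

# k-step prediction: p(X_{t+k} | e_1:t) from the filtered belief,
# by repeated application of the transition model (no new evidence).
T = np.array([[0.7, 0.3], [0.3, 0.7]])  # p(X_t | X_{t-1}), illustrative
belief = np.array([0.818, 0.182])       # p(X_t | e_1:t), illustrative

def predict(belief, k):
    for _ in range(k):
        belief = T.T @ belief
    return belief

# As k grows, the prediction converges to the stationary distribution
# of the Markov chain, so long-range prediction becomes uninformative.
far_future = predict(belief, 50)
```

For this symmetric transition matrix the stationary distribution is uniform, illustrating why predictions degrade as the horizon k grows.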

Smoothing or hindsight

Smoothing or hindsight: p(Xk|e1:t)

Backward
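The backward recursion combines with the forward pass to give smoothed estimates: p(Xk|e1:t) is proportional to the forward message at k times the backward message at k. A minimal sketch with the illustrative umbrella-world numbers, assuming NumPy:

```python
import numpy as np

# Forward-backward smoothing for a discrete HMM (illustrative numbers).
prior = np.array([0.5, 0.5])
T = np.array([[0.7, 0.3], [0.3, 0.7]])  # p(X_t | X_{t-1})
O = np.array([[0.9, 0.1], [0.2, 0.8]])  # p(e_t | X_t)
evidence = [0, 0]

# Forward pass: normalized alpha[k] proportional to p(X_k, e_1:k)
alphas = []
b = prior
for e in evidence:
    b = O[:, e] * (T.T @ b)
    b = b / b.sum()
    alphas.append(b)

# Backward pass: beta[k] = p(e_{k+1:t} | X_k), run from the end
betas = [np.ones(2)]
for e in reversed(evidence[1:]):
    beta = T @ (O[:, e] * betas[0])
    betas.insert(0, beta)

# Smoothed estimates: normalize the elementwise product alpha * beta
smoothed = [a * bt / (a * bt).sum() for a, bt in zip(alphas, betas)]
```

The smoothed estimate at k = 1 is sharper than the filtered one at the same step, because it also exploits the later evidence.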

Hidden Markov Models (HMMs)

A temporal probabilistic model in which the state of the process is
described by a single discrete random variable.

The possible values of the variable are the possible states of the world.

Irene Tischer, Slides Razonamiento probabilístico


HMM with mixture-of-Gaussians output

Auto-regressive HMMs

The observation at time t regresses on the previous observation, where Wi is the regression matrix given that Qt is in state i.
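A minimal generative sketch of this auto-regressive emission model (all parameter values here are illustrative, assuming NumPy): the observation at time t is a linear function of the previous observation, with the regression matrix selected by the current discrete state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Auto-regressive HMM emission: y_t = W[q_t] @ y_{t-1} + noise,
# where q_t is the discrete hidden state (illustrative parameters).
W = np.array([[[0.9, 0.0], [0.0, 0.9]],    # regression matrix, state 0
              [[0.5, 0.2], [-0.2, 0.5]]])  # regression matrix, state 1
A = np.array([[0.95, 0.05], [0.10, 0.90]]) # state transition matrix

q, y = 0, np.zeros(2)
trajectory = []
for _ in range(100):
    q = rng.choice(2, p=A[q])                     # sample next discrete state
    y = W[q] @ y + 0.1 * rng.standard_normal(2)   # AR emission
    trajectory.append(y)
```

Each discrete regime produces a different linear dynamic, so the sampled trajectory switches behaviour when the hidden state switches.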

Mixed-memory Markov models


Factorial HMMs
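In a factorial HMM the hidden state factors into several independent Markov chains, and the observation depends on the joint configuration of all chains. A minimal generative sketch (the chain count, transition matrix, and additive observation model are illustrative assumptions, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)

# Factorial HMM: M independent binary chains; the observation depends
# on all chains jointly (here, a noisy sum of per-chain contributions).
M = 3
A = np.array([[0.9, 0.1], [0.1, 0.9]])  # same transition for each chain
contrib = np.array([1.0, 2.0, 4.0])     # per-chain contribution when "on"

states = np.zeros(M, dtype=int)
obs = []
for _ in range(50):
    # each chain transitions independently of the others
    states = np.array([rng.choice(2, p=A[s]) for s in states])
    obs.append(states @ contrib + 0.1 * rng.standard_normal())
```

The factored representation needs only M small transition tables instead of one table over the 2^M joint state, which is the compactness argument made for DBNs above.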
