
Gaussian process dynamical model

Published on 2024/01/28

Let X = (x_1, x_2, \dots, x_N) denote the latent variables and Y = (y_1, y_2, \dots, y_N) denote the observations.

The graphical model is defined as follows:

... x_{n-1} --->  x_n --->  x_{n+1} ...
    |             |         |
    V             V         V
... y_{n-1}       y_n       y_{n+1} ...

The joint probability factorizes as p(Y, X) = p(Y|X) \, p(X).

p(Y|X) is defined in the same way as in GPLVM, where K_X is the N \times N Gram matrix evaluated on X and D is the dimensionality of each observation y_n:

\begin{align*} p(Y|X) = \frac{1}{(2 \pi)^{\frac{ND}{2}} |K_X|^{\frac{D}{2}}} \exp \left( -\frac{1}{2} \mathrm{tr}(K_X^{-1} Y Y^\top) \right) \end{align*}
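As a concrete check of this density, here is a minimal NumPy sketch that evaluates \ln p(Y|X). The RBF kernel, its fixed hyperparameters, the jitter term, and the names `rbf_gram` / `log_p_Y_given_X` are illustrative assumptions, not fixed by the model:

```python
import numpy as np

def rbf_gram(X, variance=1.0, lengthscale=1.0, jitter=1e-6):
    # Squared-exponential Gram matrix K_X, with a small jitter for stability.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2) + jitter * np.eye(len(X))

def log_p_Y_given_X(Y, X):
    # ln p(Y|X) = -(ND/2) ln 2π - (D/2) ln|K_X| - (1/2) tr(K_X^{-1} Y Yᵀ)
    N, D = Y.shape
    K = rbf_gram(X)
    _, logdet = np.linalg.slogdet(K)
    quad = np.trace(np.linalg.solve(K, Y @ Y.T))
    return -0.5 * N * D * np.log(2.0 * np.pi) - 0.5 * D * logdet - 0.5 * quad
```

Since \mathrm{tr}(K_X^{-1} Y Y^\top) = \sum_d y^{(d)\top} K_X^{-1} y^{(d)}, this is exactly the sum of D independent Gaussian log-densities, one per observation dimension, sharing the covariance K_X.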

p(X) is assumed to follow a first-order Markov process,

\begin{align*} p(X) = p(x_1) \prod_{n=2}^N p(x_n | x_{n-1}) . \end{align*}

Let X_{1:N-1} = (x_1, x_2, \dots, x_{N-1}), X_{2:N} = (x_2, x_3, \dots, x_N), and let X^{(q)} denote the q-th column vector of X, for q = 1, \dots, Q.

\begin{align*} p(X) &= p(X_{2:N} | X_{1:N-1}) \, p(x_1) \\ &= \prod_{q=1}^Q p(X_{2:N}^{(q)} | X_{1:N-1}) \, p(x_1) \\ &= \prod_{q=1}^Q \mathcal{N}(X_{2:N}^{(q)} | 0, K_{X_{1:N-1}}) \, \mathcal{N}(x_1 | 0, I) \\ &= \frac{1}{(2 \pi)^{\frac{(N-1) Q}{2}} |K_{X_{1:N-1}}|^{\frac{Q}{2}}} \exp \left( -\frac{1}{2} \mathrm{tr} (K_{X_{1:N-1}}^{-1} X_{2:N} X_{2:N}^\top) \right) \frac{1}{(2 \pi)^{\frac{Q}{2}}} \exp \left( - \frac{1}{2} x_1^\top x_1 \right) . \end{align*}
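The prior can be evaluated with the same pattern as the likelihood. The sketch below assumes the same illustrative RBF kernel for the dynamics (GPDM in general allows a different kernel here) and evaluates both factors: the dynamics term over X_{2:N} and the standard-normal term for x_1:

```python
import numpy as np

def rbf_gram(X, variance=1.0, lengthscale=1.0, jitter=1e-6):
    # Squared-exponential Gram matrix with a small jitter for stability.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2) + jitter * np.eye(len(X))

def log_p_X(X):
    # ln p(X) = Σ_q ln N(X_{2:N}^{(q)} | 0, K_{X_{1:N-1}}) + ln N(x_1 | 0, I)
    N, Q = X.shape
    X_in, X_out = X[:-1], X[1:]        # X_{1:N-1} (GP inputs), X_{2:N} (GP outputs)
    K = rbf_gram(X_in)
    _, logdet = np.linalg.slogdet(K)
    quad = np.trace(np.linalg.solve(K, X_out @ X_out.T))
    log_dyn = -0.5 * (N - 1) * Q * np.log(2.0 * np.pi) - 0.5 * Q * logdet - 0.5 * quad
    log_x1 = -0.5 * Q * np.log(2.0 * np.pi) - 0.5 * X[0] @ X[0]
    return log_dyn + log_x1
```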

The log probability \mathcal{L}(X) = \ln p(Y|X) p(X) is given, dropping constant terms, by

\begin{align*} \mathcal{L}(X) &= - \frac{D}{2} \ln |K_X| - \frac{1}{2} \mathrm{tr}(K_X^{-1} Y Y^\top) - \frac{Q}{2} \ln |K_{X_{1:N-1}}| - \frac{1}{2} \mathrm{tr} (K_{X_{1:N-1}}^{-1} X_{2:N} X_{2:N}^\top) - \frac{1}{2} x_1^\top x_1 . \end{align*}
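The latent variables can then be estimated by maximizing \mathcal{L}(X). The sketch below is a deliberately naive MAP estimation loop: finite-difference gradients plus a backtracking step size, with an assumed RBF kernel and fixed hyperparameters. It is only for illustration; in practice GPDM is fitted with gradient-based optimizers, typically optimizing X jointly with the kernel hyperparameters:

```python
import numpy as np

def rbf_gram(X, variance=1.0, lengthscale=1.0, jitter=1e-6):
    # Squared-exponential Gram matrix with a small jitter for stability.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2) + jitter * np.eye(len(X))

def objective(X, Y):
    # L(X) up to additive constants, matching the equation above.
    D = Y.shape[1]
    Q = X.shape[1]
    K_X = rbf_gram(X)
    K_dyn = rbf_gram(X[:-1])               # K_{X_{1:N-1}}
    _, ld_obs = np.linalg.slogdet(K_X)
    _, ld_dyn = np.linalg.slogdet(K_dyn)
    t_obs = np.trace(np.linalg.solve(K_X, Y @ Y.T))
    t_dyn = np.trace(np.linalg.solve(K_dyn, X[1:] @ X[1:].T))
    return (-0.5 * D * ld_obs - 0.5 * t_obs
            - 0.5 * Q * ld_dyn - 0.5 * t_dyn
            - 0.5 * X[0] @ X[0])

def map_estimate(Y, Q=2, iters=20, eps=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    N = Y.shape[0]
    X = rng.normal(size=(N, Q))
    for _ in range(iters):
        # Central finite-difference gradient of L(X).
        g = np.zeros_like(X)
        for i in range(N):
            for q in range(Q):
                Xp = X.copy(); Xp[i, q] += eps
                Xm = X.copy(); Xm[i, q] -= eps
                g[i, q] = (objective(Xp, Y) - objective(Xm, Y)) / (2.0 * eps)
        # Backtracking: shrink the step until the objective improves.
        step = 1e-2
        f0 = objective(X, Y)
        while step > 1e-12 and objective(X + step * g, Y) <= f0:
            step *= 0.5
        if objective(X + step * g, Y) > f0:
            X = X + step * g
    return X
```

For real data one would replace the finite-difference loop with analytic or automatic gradients; the point here is only that \mathcal{L}(X) is a plain deterministic function of X that can be ascended directly.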
