Input → continuous latent variables
condition - intractable posterior distributions + large datasets
Goal - efficient inference + learning in directed probabilistic models
Approach - introduce a stochastic variational inference and learning algorithm
→ reframes the variational lower bound (ELBO) so it can be estimated and optimized with standard stochastic gradient methods
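A sketch of that reframing in standard notation (assumed here, not in the notes: x data, z latents, θ generative/decoder parameters, φ recognition/encoder parameters):

```latex
% Standard decomposition of the marginal log-likelihood
\log p_\theta(x)
  = \mathrm{KL}\!\left(q_\phi(z \mid x) \,\|\, p_\theta(z \mid x)\right)
  + \mathcal{L}(\theta, \phi; x)

% The variational lower bound (ELBO): reconstruction term minus KL regularizer
\mathcal{L}(\theta, \phi; x)
  = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  - \mathrm{KL}\!\left(q_\phi(z \mid x) \,\|\, p_\theta(z)\right)
```

Since KL ≥ 0, maximizing L(θ, φ; x) pushes up a lower bound on log p_θ(x) while fitting q to the true posterior.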
learned approximate posterior distribution can be used for tasks like
→ recognition, denoising, representation, and visualization
→ when a neural network (NN) is used for the recognition model, we get the Variational AutoEncoder (VAE)
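What makes an NN recognition model trainable end-to-end is the reparameterization trick; a sketch with μ_φ, σ_φ denoting the encoder's outputs (notation assumed):

```latex
% Sample z as a deterministic, differentiable function of (x, \epsilon)
z = \mu_\phi(x) + \sigma_\phi(x) \odot \epsilon,
\qquad \epsilon \sim \mathcal{N}(0, I)
```

With the randomness moved into ε, gradients of the ELBO flow through μ_φ and σ_φ by ordinary backpropagation.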
q - encoder - recognition model / variational approximation
p - decoder - generative model
q(z|x) is an approximation of the intractable true posterior p(z|x)
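A minimal sketch of the q-encoder / p-decoder pair, assuming a PyTorch-style implementation (layer sizes, names, and the diagonal-Gaussian q are illustrative assumptions, not from the notes):

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=400, z_dim=20):  # sizes are illustrative
        super().__init__()
        # q(z|x): encoder / recognition model -> mean and log-variance of the approximate posterior
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        # p(x|z): decoder / generative model (Sigmoid output assumes inputs scaled to [0, 1])
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps, eps ~ N(0, I): sampling stays differentiable w.r.t. encoder params
        eps = torch.randn_like(mu)
        return mu + torch.exp(0.5 * logvar) * eps

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.dec(z), mu, logvar
```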
regularization - the KL term in the ELBO keeps q(z|x) close to the prior p(z) (see the loss sketch below)
autoencoder
probabilistic model
stochastic variational inference
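A sketch of training as stochastic variational inference: the negative ELBO (reconstruction + KL regularizer) minimized by minibatch gradient steps. Assumes the VAE sketch above, inputs in [0, 1] (Bernoulli likelihood), and Adam; all illustrative choices.

```python
import torch
import torch.nn.functional as F

def neg_elbo(x_recon, x, mu, logvar):
    # Reconstruction term: -E_q[log p(x|z)] under a Bernoulli decoder
    recon = F.binary_cross_entropy(x_recon, x, reduction='sum')
    # Regularization term: KL(q(z|x) || N(0, I)) in closed form for a diagonal-Gaussian q
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

model = VAE()                                   # VAE class from the sketch above
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(x_batch):
    # One stochastic gradient step on a minibatch estimate of the negative ELBO
    opt.zero_grad()
    x_recon, mu, logvar = model(x_batch)
    loss = neg_elbo(x_recon, x_batch, mu, logvar)
    loss.backward()
    opt.step()
    return loss.item()
```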