Amortized Inference in Latent Space Energy-Based Prior Model

Xufan Zhai
MS, 2022
Wu, Yingnian
This thesis studies amortized inference in the latent space energy-based model (EBM) prior, where the EBM serves as the prior of a generator network. Both the prior and the posterior can be sampled with short-run MCMC; however, MCMC sampling of the posterior can be time-consuming because of the complexity of the posterior distribution. We propose to amortize the posterior MCMC sampling with an inference network. Image experiments show that amortization produces results comparable to short-run MCMC sampling while being more time-efficient, and the generator is also more stable under amortization.
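To make the setup concrete, below is a minimal PyTorch sketch contrasting the two posterior samplers described above: short-run Langevin MCMC on the latent EBM prior model versus a single pass through an amortized inference network. The module names, network sizes, step counts, and noise level are illustrative assumptions, not the thesis's implementation.

```python
# Minimal sketch of posterior sampling in a latent-space EBM prior model.
# All shapes and hyperparameters below are assumed for illustration.
import torch
import torch.nn as nn

Z_DIM, X_DIM, SIGMA = 16, 784, 0.3  # assumed latent/data dims and likelihood noise

ebm_prior = nn.Sequential(nn.Linear(Z_DIM, 128), nn.GELU(), nn.Linear(128, 1))          # f(z)
generator = nn.Sequential(nn.Linear(Z_DIM, 256), nn.GELU(), nn.Linear(256, X_DIM))      # g(z)
encoder   = nn.Sequential(nn.Linear(X_DIM, 256), nn.GELU(), nn.Linear(256, 2 * Z_DIM))  # q(z|x)

def log_joint(z, x):
    """log p(x, z) up to a constant: EBM correction + Gaussian base prior + Gaussian likelihood."""
    log_prior = ebm_prior(z).squeeze(-1) - 0.5 * z.pow(2).sum(-1)
    log_lik = -0.5 * ((x - generator(z)) ** 2).sum(-1) / SIGMA ** 2
    return log_prior + log_lik

def short_run_posterior(x, steps=20, step_size=0.1):
    """Short-run Langevin MCMC sampling of p(z | x), initialized from the base prior."""
    z = torch.randn(x.size(0), Z_DIM)
    for _ in range(steps):
        z = z.detach().requires_grad_(True)
        grad = torch.autograd.grad(log_joint(z, x).sum(), z)[0]
        z = z + 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(z)
    return z.detach()

def amortized_posterior(x):
    """Inference network replaces posterior MCMC: one reparameterized forward pass."""
    mu, log_var = encoder(x).chunk(2, dim=-1)
    return mu + torch.randn_like(mu) * (0.5 * log_var).exp()

# Usage: compare the two samplers on a dummy batch.
x = torch.randn(8, X_DIM)
z_mcmc = short_run_posterior(x)       # many gradient steps per batch
z_amortized = amortized_posterior(x)  # a single encoder forward pass
```

The sketch illustrates why amortization saves time: each short-run MCMC sample costs a sequence of gradient evaluations through the generator and the EBM, whereas the inference network produces a posterior sample in one forward pass.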