Week 2 - VAE - Lesson
Visual Generative AI Application
Variational Autoencoder (VAE)
Week 2
AY 24/25
Specialist Diploma in Applied Generative AI (SDGAI)
Official Open
Objectives
• By the end of this module, learners will be able to:
• Describe the concepts of variational autoencoders
• Train a VAE model to generate output
Given a data distribution, we want to generate new samples that are
"similar" to the given data distribution.

[Figure: some initial conditions and the given data distribution feed
into a generative model, which produces a generated data distribution.]
Autoencoder

No labels are used. The encoder is trained such that its latent features
can be used by the decoder to reconstruct the original data.

[Figure: Input Data → Encoder → Latent Feature → Decoder → reconstruction]
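The encoder–latent–decoder pipeline above can be sketched in plain NumPy. The dimensions and the random, untrained weights below are illustrative assumptions, not values from the lesson:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a 784-pixel input (e.g. a flattened 28x28 image)
# compressed into a 32-dimensional latent feature.
input_dim, latent_dim = 784, 32

# Randomly initialised weights stand in for trained parameters.
W_enc = rng.normal(0, 0.01, (latent_dim, input_dim))
W_dec = rng.normal(0, 0.01, (input_dim, latent_dim))

def encoder(x):
    # Compress the input into a low-dimensional latent feature.
    return np.tanh(W_enc @ x)

def decoder(z):
    # Reconstruct the input from the latent feature.
    return W_dec @ z

x = rng.normal(size=input_dim)   # one unlabeled input sample
z = encoder(x)                   # latent feature
x_hat = decoder(z)               # reconstruction

# Training minimises the reconstruction error with no labels:
# the input itself is the target.
recon_error = np.mean((x - x_hat) ** 2)
```

Note that the loss compares `x_hat` against `x` directly, which is why no labels are needed.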
[Figure: Input Data → Encoder → Latent Feature → Decoder]

Diederik P. Kingma and Max Welling (2019), "An Introduction to Variational Autoencoders", Foundations and Trends in Machine Learning
• The encoder attempts to estimate the parameters of the (assumed)
latent distribution.
• A latent vector is sampled from this estimated distribution.
• The decoder generates an output based on this sampled latent vector.

[Figure: Encoder → estimated distribution → samples of the latent
space → Decoder]
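The sampling step above is usually written as the reparameterisation trick. A minimal NumPy sketch, where the particular mean and log-variance values are made-up encoder outputs for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder output for one input: parameters of a diagonal
# Gaussian over a 2-D latent space (the "assumed" distribution).
mu = np.array([0.5, -1.0])       # estimated mean
log_var = np.array([0.0, -0.5])  # estimated log-variance (for stability)

def sample_latent(mu, log_var, n=1):
    # Reparameterisation: z = mu + sigma * eps with eps ~ N(0, I),
    # so the sampling step stays differentiable w.r.t. mu and log_var.
    sigma = np.exp(0.5 * log_var)
    eps = rng.standard_normal((n, mu.shape[0]))
    return mu + sigma * eps

z = sample_latent(mu, log_var, n=10000)
# The empirical mean of many samples approaches the encoder's estimate.
empirical_mean = z.mean(axis=0)
```

Each sampled `z` would then be passed to the decoder to generate one output.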
• Forward pass: the encoder estimates the mean and the covariance of the
latent distribution; a latent vector is sampled from it; the decoder
constructs an output image based on the sampled parameters.
• Backpropagation: the weights are updated based on the loss.

[Figure: Encoder → Sampled Latent Space → Decoder → Output Image, with
forward-pass and backpropagation arrows]
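The loss that backpropagation minimises in a VAE combines a reconstruction term with a KL-divergence term. A sketch of this objective, assuming squared-error reconstruction and a diagonal Gaussian encoder against a standard normal prior (the closed-form KL for that case):

```python
import numpy as np

def vae_loss(x, x_hat, mu, log_var):
    # Reconstruction term: how well the decoder output matches the input.
    recon = np.sum((x - x_hat) ** 2)
    # KL term: pulls the encoder's Gaussian N(mu, diag(sigma^2)) towards
    # the standard normal prior N(0, I); closed form for diagonal Gaussians.
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)
    return recon + kl

# When the encoder already outputs the prior (mu = 0, log_var = 0) and the
# reconstruction is perfect, both terms vanish.
zero = np.zeros(4)
print(vae_loss(zero, zero, zero, zero))  # -> 0.0
```

In practice this loss is computed on mini-batches and its gradients flow through the reparameterised sampling step back to the encoder weights.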
Practical 2
Variational Autoencoders