Gaussian Mixture Model Intuition & Introduction | TensorFlow Probability
http://youtube.com/watch?v=atDp5bkzej4
GMMs are used for clustering data or as generative models. Let's build intuition by looking at a one-dimensional (1D) example. Here are the notes: https://raw.githubusercontent.com/Cey...

If your (univariate) distribution has more than one mode (peak), there is a good chance you can model it with a Gaussian Mixture Model (GMM), a mixture distribution of Gaussian/Normal components. That is helpful for a soft clustering of points in one dimension. For this, you select the number of modes you expect (= the number of peaks). This then corresponds to the number of (latent) classes as well as the number of Gaussians that have to be defined.

In this video, I provide an intuition for this by looking at the grade distribution after an exam, with a first peak at 2.5 and a second peak at the grade corresponding to a fail. We will implement this model in TensorFlow Probability; a sketch of the density and of the TFP code follows after the timestamps.

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-lea...

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning Simulation stuff: / felix-koehler and / felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patreon here: / mlsim

-------

Timestamps:
00:00 Introduction
00:38 A Multi-Modal Distribution
01:10 Clustering of Points
02:04 A Superposition of Gaussians?
03:59 Using Mixture Coefficients
05:05 A special case of Mixture Distributions
05:33 The Directed Graphical Model
07:52 Alternative Model with plates
08:45 The joint
10:28 TFP: Defining the Parameters
11:27 TFP: The Categorical
12:12 TFP: The batched Normal
13:13 TFP: GMM in Principle
14:13 TFP: Using the TFP Mixture Distribution
15:15 TFP: Plotting the probability density
17:05 Outro
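For reference, the density built up in the video (a superposition of Gaussians weighted by mixture coefficients) has the standard GMM form; the notation below is generic, not copied from the notes:

p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \sigma_k^2), \qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1

For the grade example, K = 2: one Gaussian for the passing peak around 2.5 and one for the failing peak.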
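How the TFP steps in the timestamps fit together, as a minimal sketch in TensorFlow Probability. The mixture weights, means, and standard deviations below are illustrative assumptions, not the values from the video, and I use tfd.MixtureSameFamily here; the exact mixture class used in the video may differ.

import numpy as np
import matplotlib.pyplot as plt
import tensorflow_probability as tfp

tfd = tfp.distributions

# Parameters of the two latent classes (assumed values for illustration):
# a "pass" peak around grade 2.5 and a "fail" peak around grade 5.0.
mixture_weights = [0.7, 0.3]   # mixture coefficients, must sum to 1
means = [2.5, 5.0]
stddevs = [0.6, 0.4]

# Categorical distribution over the latent class assignment.
categorical = tfd.Categorical(probs=mixture_weights)

# Batched Normal: one Gaussian per latent class.
normals = tfd.Normal(loc=means, scale=stddevs)

# Gaussian Mixture Model: the latent class is marginalized out.
gmm = tfd.MixtureSameFamily(
    mixture_distribution=categorical,
    components_distribution=normals,
)

# Plot the resulting bimodal probability density over the grade range.
grades = np.linspace(1.0, 6.0, 500).astype(np.float32)
plt.plot(grades, gmm.prob(grades))
plt.xlabel("grade")
plt.ylabel("probability density")
plt.show()

Calling gmm.sample(100) would draw synthetic grades from the same model, which is the generative-model view mentioned at the start of the description.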
