Optimization over the Space of Probability Measures
Reversible, Conductance and Rapid Mixing of Markov Chains

Introduction

Markov Chain

Let $(\Omega,\mathcal{F})$ be a measurable space and let $P:\Omega\times \mathcal{F}\to [0,1]$ be a transition kernel; that is,

  • for every $x\in \Omega$, $P(x,\cdot)$ is a probability measure on $\mathcal{F}$;
  • for every $A\in \mathcal{F}$, $x\mapsto P(x,A)$ is measurable.
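On a finite state space, both conditions are automatic: a kernel is just a row-stochastic matrix, where row $x$ is the probability measure $P(x,\cdot)$. A minimal sketch (the matrix entries below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Finite state space Omega = {0, 1, 2}. The kernel P is a row-stochastic
# matrix: row x is the probability measure P(x, .), and x -> P(x, A) is
# trivially measurable on a finite space.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Property 1: each P(x, .) is a probability measure (rows sum to 1).
assert np.allclose(P.sum(axis=1), 1.0)

def step(x, rng):
    """Draw the next state from the distribution P(x, .)."""
    return rng.choice(P.shape[1], p=P[x])

def simulate(x0, n, rng):
    """Run the chain for n steps starting from x0."""
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

rng = np.random.default_rng(0)
path = simulate(0, 10, rng)
```

Iterating `step` produces a trajectory of the Markov chain; the distribution of `path[k]` is the $k$-step kernel $P^k(x_0,\cdot)$.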
Read more
Introduction to Flow Matching

In generative modeling, we are given a collection of training samples $\{x_i\}_{i=1}^N$ and wish to generate new samples from the underlying target distribution $\pi$. There are already many established approaches to this problem, including likelihood-based methods, implicit generative models such as GANs, and score-based diffusion models. More recently, the flow matching framework has emerged as another powerful paradigm. In what follows, we introduce the basic ideas of flow matching and explain how it works.

Read more
Sampling and Diffusion Model
Analogy between Sampling and Optimization