One of the goals of this series of posts is to explain methods that many researchers are using without really knowing the underlying theory. The topic of today’s post is the Sinkhorn iterative algorithm for efficiently solving regularized optimal transport problems. This algorithm is one of the main driving forces behind the current wave of […]
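For readers who want a concrete picture before diving into the theory, here is a minimal sketch of the Sinkhorn iterations for entropy-regularized optimal transport between two discrete distributions. The cost matrix, regularization strength, and iteration count below are illustrative choices, not values taken from the post:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT between discrete distributions a and b.

    a, b : nonnegative weight vectors summing to 1
    C    : cost matrix, C[i, j] = cost of moving mass from i to j
    eps  : entropic regularization strength (smaller = closer to exact OT)
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)                  # rescale to match the row marginal a
        v = b / (K.T @ u)                # rescale to match the column marginal b
    P = u[:, None] * K * v[None, :]      # regularized transport plan
    return P

# Toy example: two uniform distributions on a grid with squared-distance cost.
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
a = np.full(5, 0.2)
b = np.full(5, 0.2)
P = sinkhorn(a, b, C)
```

Each iteration only involves matrix-vector products with the Gibbs kernel, which is what makes the algorithm efficient; at convergence, the rows of the plan `P` sum to `a` and its columns sum to `b`.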

## An Intuitive Guide to Optimal Transport, Part II: The Wasserstein GAN made easy

No guide to optimal transport for machine learning would be complete without an explanation of the Wasserstein GAN (wGAN). In the first post of this series I explained the optimal transport problem in its primal and dual form. I concluded the post by proving the Kantorovich-Rubinstein duality, which provides the theoretical foundation of the wGAN. In […]
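As a reminder (this is the standard statement, reproduced here rather than quoted from the post), the Kantorovich-Rubinstein duality expresses the 1-Wasserstein distance as a supremum over 1-Lipschitz functions:

```latex
W_1(\mu, \nu) \;=\; \sup_{\|f\|_{L} \le 1} \; \mathbb{E}_{x \sim \mu}[f(x)] \;-\; \mathbb{E}_{y \sim \nu}[f(y)]
```

It is this dual form, with the critic playing the role of the Lipschitz function $f$, that the wGAN optimizes.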

## An Intuitive Guide to Optimal Transport, Part I: Formulating the Problem

Following the success of Wasserstein GANs and Sinkhorn divergences, optimal transport theory is rapidly becoming an essential theoretical tool for machine learning research. Optimal transport has been used for generative modeling, probabilistic autoencoders, variational inference, reinforcement learning, and clustering, among many other things. In other words, if you are a machine learning […]

## Joint-Contrastive Inference and Cycle GANs

This is the second post of our series about joint-contrastive inference. I suggest reading our previous post and the seminal blog post by Ferenc Huszár for the required background. This post is partially based on the probabilistic reformulation of cycle-consistent GANs introduced in the paper Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference. In my […]

## The Fourier Transform through the Lens of Gaussian Process Regression

A limited number of mathematical concepts escaped their original narrow field of application and became an essential conceptual tool throughout mathematics, physical sciences and engineering. Some of these tools came from the Egyptians and the Babylonians and are now taught in primary school: integers, fractions, sums and so on. Others arose in the 17th century […]

## Joint-Contrastive Inference and Model-Based Deep Learning

In this post I will discuss joint-contrastive variational inference, a new form of stochastic variational inference that is gaining traction in the machine learning community. In this and the following posts, I will use the joint-contrastive inference framework to show that several commonly used deep learning methods are actually Bayesian inference methods in […]

## Information is in the Eye of the Beholder

The formalization of information theory is arguably one of the most important scientific advancements of the twentieth century. In some sense, the most important scientific breakthroughs of the last century, such as the discovery of DNA and quantum mechanics, can be interpreted in terms of information. Information theory also plays a major role in machine […]