2021
Gradients of GAN Objectives
This technical post will offer a new view of common training objectives for generative adversarial networks (GANs), including a justification for the widely-used non-saturating loss.
The gradient of the discriminator
First, let's look at the original GAN loss function and show that it's simpler than it looks. As defined in Goodfellow et al. (2014), it's

$$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))],$$

where $p_{\text{data}}$ is the data distribution, $p_z$ is the noise distribution, $D$ is the discriminator, and $G$ is the generator.
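Since the post's focus is the non-saturating loss, here is a minimal PyTorch sketch (my own illustration, not code from the post) of the saturation problem it fixes: when the discriminator confidently rejects fakes, $\log(1 - D(G(z)))$ is nearly flat, while $-\log D(G(z))$ still yields a strong gradient.

```python
import torch

# Illustration only, not code from the post: the two generator losses
# from Goodfellow et al. (2014). d_fake stands in for D(G(z)).
d_fake = torch.tensor([0.01], requires_grad=True)  # early training: D rejects fakes

saturating = torch.log(1 - d_fake).mean()       # original minimax generator loss
(grad_sat,) = torch.autograd.grad(saturating, d_fake)

non_saturating = -torch.log(d_fake).mean()      # the non-saturating alternative
(grad_ns,) = torch.autograd.grad(non_saturating, d_fake)

print(grad_sat.item())  # about -1.01: almost no training signal
print(grad_ns.item())   # about -100: strong signal when D(G(z)) is near 0
```

The two-orders-of-magnitude gap in gradient size is exactly the saturation the non-saturating loss avoids.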
2020
Searching for Style
It’s interesting to watch young children use home assistants. They treat them like people, asking questions like “Alexa, how are you feeling?” and “Alexa, where are you?”. I often wonder what their motivation is. Maybe they don’t know the responses are scripted. Maybe the designers’ choice of which response to script is just as interesting to them as a human’s response would be.
Adults are no less inclined to anthropomorphization — Alexa is marketed as a "her" and not an "it" — but they've grown used to Alexa's shortcomings as a conversationalist.
A Justification of the Cross Entropy Loss
A wise man once told me that inexperienced engineers tend to undervalue simplicity. Since I’m not wise myself, I don’t know whether this is true, but the ideas which show up in many different contexts do seem to be very simple. This post is about two of the most widely used ideas in deep learning, the mean squared error loss and the cross entropy loss, and how in a certain sense they’re the simplest possible approaches.
Let’s start with the mean squared error, which is used when you want to predict a continuous value.
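For reference (the standard definition, not quoted from the post), the mean squared error over $n$ targets $y_i$ and predictions $\hat{y}_i$ is

$$\text{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2.$$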
2018
TabNine's first month in review
Update (2020): Jacob here. This post was written back in 2018. Since then, I've added deep learning completions and sold the company. Now it's in the capable hands of Codota.
It’s been roughly a month since TabNine launched on Hacker News and Reddit. I want to talk about how TabNine has been doing and share my plans for its future.
Pre-release development
First, some background. I'm an undergraduate at the University of Waterloo. I started working on TabNine in February 2018, in my spare time while working as an intern at Jane Street.