20 Machine Learning engineers joined us for the second Machine Learning Tokyo workshop on GANs in Shibuya. We looked at theoretical aspects of Generative Adversarial Networks, walked through an illustrative PyTorch code example that creates fake data from real data (in this case, a Gaussian distribution), and finally implemented a GAN architecture ourselves, experimenting with different datasets and hyperparameters.

Competing Networks

What are GANs? In Deep Learning, we distinguish between discriminative models (e.g. classification of images, text, …) and generative models (generating images, text, …). Training GANs in an adversarial setting means training two models simultaneously and iteratively. The generator G, a generative model, aims to replicate a data distribution, starting from random noise. Its output is fed to the discriminator D, a discriminative model, which estimates the probability that a given sample comes from the training data rather than from G, by simple binary classification (fake or real, 0 or 1).
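To make this loop concrete, here is a minimal sketch in PyTorch of the kind of toy example described above: a generator learning a 1-D Gaussian from uniform noise. This is not the workshop's actual code; the architectures, hyperparameters, and target distribution (mean 4.0, std 1.25) are illustrative assumptions.

```python
# Hypothetical minimal GAN sketch (not the workshop code):
# G learns to mimic a 1-D Gaussian, D classifies real vs. fake.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=64):
    # "Real" data: samples from the target Gaussian (assumed mean 4.0, std 1.25).
    return torch.randn(n, 1) * 1.25 + 4.0

def noise_batch(n=64):
    # Generator input: uniform random noise.
    return torch.rand(n, 1)

# Generator G: maps noise to fake samples.
G = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
# Discriminator D: outputs the probability that a sample is real.
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

bce = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)

for step in range(2000):
    # Train D: push its output toward 1 on real data, 0 on G's (detached) fakes.
    real, fake = real_batch(), G(noise_batch()).detach()
    loss_D = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # Train G: try to fool D into outputting 1 on fresh fakes.
    fake = G(noise_batch())
    loss_G = bce(D(fake), torch.ones(64, 1))
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()

samples = G(noise_batch(1000)).detach()
print(f"fake mean={samples.mean().item():.2f}, std={samples.std().item():.2f}")
```

Note the `.detach()` in the discriminator step: D's loss should not backpropagate into G's parameters, since the two models are updated in alternation.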

Read the original paper by Ian Goodfellow et al.:
Generative Adversarial Networks

There are many variations of GANs, and the number of research papers is growing rapidly. Among the most interesting architectures are DCGANs: Deep Convolutional Generative Adversarial Networks [paper], CycleGANs: Cycle-Consistent Adversarial Networks [paper] and WGANs: Wasserstein GANs [paper], to name a few.
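As a taste of how these variants differ, here is a hedged sketch of the WGAN critic update with weight clipping (the recipe from the original WGAN paper). This is illustrative, not workshop code; the network size, learning rate, and stand-in data are assumptions.

```python
# Hypothetical sketch of one WGAN critic step (weight-clipping variant).
import torch
import torch.nn as nn

torch.manual_seed(0)

# The "critic" replaces the discriminator: an unbounded scalar score, no sigmoid.
critic = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(64, 1) * 1.25 + 4.0   # samples from the data distribution
fake = torch.rand(64, 1)                  # stand-in for generator output

# The critic maximizes E[critic(real)] - E[critic(fake)], an estimate of the
# Wasserstein-1 distance between the two distributions; we minimize its negative.
loss = -(critic(real).mean() - critic(fake).mean())
opt.zero_grad(); loss.backward(); opt.step()

# Enforce the Lipschitz constraint crudely by clipping weights to [-0.01, 0.01].
with torch.no_grad():
    for p in critic.parameters():
        p.clamp_(-0.01, 0.01)
```

The key differences from a vanilla GAN are the unbounded critic output, the Wasserstein loss instead of binary cross-entropy, and the Lipschitz constraint (here via clipping; later variants use a gradient penalty instead).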

NVIDIA’s Progressive GANs [paper], trained by progressively growing the resolution of the generated images (and with a lot of GPU power), have gained much attention with some impressive results.

Another stream of research is dedicated to finding new use cases and application areas for GANs. Our next workshop, on more complex GAN architectures, will be held in July, with a follow-up event for participants who want to present their work, aiming to find new and creative use cases and datasets.


We have published all resources, including the presentation slides and code (implementations in PyTorch and Keras), on our MLT GANs website. Explore, experiment and share your thoughts. [MLT GANs]


Join us for the next session

Machine Learning Tokyo, a Tokyo-based community of 1,437 ML engineers, meets regularly as a study group. Next meetup: Saturday, Dec 8, 2018, 2:00 PM.


Get in touch with the team

Gregorio Nuevo Castro

Dimitris Katsios

Mustafa Yagmur

Suzana Ilić
