Keith Stevens (Google Japan) gave an insightful talk on Neural Machine Translation at Google, covering sequence-to-sequence models, Transformers, and hybrids, along with an excellent overview of the latest research. He also presented a sample of unsolved problems: pushing the limits of model size, working with non-ideal data, translating on the fly, using models … Continue reading TALK: GOOGLE TRANSLATE


Machine Learning Tokyo is co-hosting an event on April 3 for those who are interested in Deep Learning and Natural Language Processing. We are excited to welcome two experts in the field, coming all the way from Tel Aviv, Israel, and Melbourne, Australia. Deep Learning for NLP First, Sam Davis (CEO kicks off the … Continue reading EVENT: DEEP LEARNING FOR NLP

RNNs or CNNs for NLP?

When we talk about Convolutional Neural Networks, which take a fixed-size input and produce a fixed-size output, we usually think of image processing. When it comes to Natural Language Processing, Recurrent Neural Networks come into play, since they allow us to operate over sequences of vectors. Now a new trend is emerging in research, experimenting … Continue reading RNNs or CNNs for NLP?
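The contrast between the two architectures can be sketched in a few lines of plain Python. This is a minimal, illustrative example with toy one-dimensional "embeddings" and made-up weights (none of these numbers come from the post): a recurrent step folds a variable-length sequence into a single hidden state, while a convolutional filter slides a fixed-width window over the sequence and max-pooling collapses the result to a fixed size regardless of length.

```python
# Toy setup: each token is a single number instead of an embedding vector,
# so the two mechanisms are visible without any ML library.

def rnn_encode(seq, w_in=0.5, w_rec=0.8):
    """Recurrent encoding: fold a variable-length sequence into one
    hidden value by mixing each input with the previous state."""
    h = 0.0
    for x in seq:
        h = max(0.0, w_in * x + w_rec * h)  # ReLU recurrence
    return h

def cnn_encode(seq, kernel=(0.2, 0.5, 0.3)):
    """Convolutional encoding: slide a fixed-width filter over the
    sequence, then max-pool so the output size is independent of
    the sequence length."""
    k = len(kernel)
    feats = [sum(w * x for w, x in zip(kernel, seq[i:i + k]))
             for i in range(len(seq) - k + 1)]
    return max(feats)

short = [1.0, 2.0, 3.0]
long = [1.0, 2.0, 3.0, 4.0, 5.0]
# Both encoders map sequences of any length to a single feature value.
print(rnn_encode(short), cnn_encode(long))
```

The point of the sketch: the RNN sees the whole sequence through its recurrence (order matters, state accumulates), while the CNN only ever looks at fixed-size windows and relies on pooling to handle variable length, which is what makes convolutional approaches to text attractive for parallel hardware.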