Keith Stevens (Google Japan) gave an insightful talk on Neural Machine Translation at Google, covering the progression from sequence-to-sequence models to Transformers and hybrid architectures, along with an excellent overview of the latest research.

He also presented a sample of unsolved problems:

  • Pushing the limits of model size
  • Working with non-ideal data
  • Translating on the fly
  • Using models to find data

Keith Stevens has worked on Google Translate for nearly seven years. His primary focus has been finding or creating the best possible data for improving Google Translate. This work has ranged from crowdsourcing systems to neural-network-based data cleaning techniques and automated parallel data mining.


This event was co-organized by Le Wagon Tokyo and Machine Learning Tokyo.
