We created a small repository linking to open Deep Learning and Reinforcement Learning lectures provided by MIT, Stanford University and UC Berkeley.

- MIT 6.S191: Introduction to Deep Learning | 2020
- CS231n: CNNs for Visual Recognition, Stanford | Spring 2019
- CS224n: NLP with Deep Learning, Stanford | Winter 2019
- CS285: Deep Reinforcement Learning, UC Berkeley | …

Continue reading AI CURRICULUM
MLT is presenting two workshop papers at the NeurIPS 2019 conference in Vancouver, Canada. Meet Asir Saeed at the poster session of the NeurIPS Workshop on Machine Learning for Creativity and Design.

Creative GANs for generating poems, lyrics, and metaphors
Asir Saeed, Suzana Ilić, Eva Zangerle

Generative models for text have substantially contributed to tasks like machine translation … Continue reading MLT AT NEURIPS 2019 IN VANCOUVER
Keith Stevens (Google Japan) gave an insightful talk about Neural Machine Translation at Google, from Sequence-to-Sequence Models to Transformers and Hybrids, along with an excellent overview of the latest research. He also presented a sample of unsolved problems:

- Pushing the limits of model size
- Working with non-ideal data
- Translating on the fly
- Using models …

Continue reading TALK: GOOGLE TRANSLATE
Machine Learning Tokyo is co-hosting an event on April 3 for those who are interested in Deep Learning and Natural Language Processing. We are excited to welcome two experts in the field, coming all the way from Tel Aviv, Israel and Melbourne, Australia. Deep Learning for NLP: First, Sam Davis (CEO, amplified.ai) kicks off the … Continue reading EVENT: DEEP LEARNING FOR NLP
When we talk about Convolutional Neural Networks, which take a fixed-size input and produce a fixed-size output, we usually think of image processing. When it comes to Natural Language Processing, Recurrent Neural Networks come into play, since they allow us to operate over sequences of vectors. Now a new trend in research is emerging, experimenting … Continue reading RNNs or CNNs for NLP?
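The contrast above can be sketched in a few lines of NumPy: a 1D convolution slides a fixed-width filter over a word sequence (its output size depends on the input length and filter width), while a recurrent network consumes one word vector per step and carries a hidden state, so any sequence length yields the same-sized summary. All dimensions and weights below are toy assumptions for illustration, not from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sentence": 6 words, each a 4-dimensional embedding (hypothetical data).
seq = rng.standard_normal((6, 4))

def conv1d(seq, filt):
    """Slide a filter spanning k consecutive words over the sequence.

    Returns one value per window, so the output length is n - k + 1
    and depends on the input length.
    """
    k = filt.shape[0]
    return np.array([np.sum(seq[i:i + k] * filt)
                     for i in range(len(seq) - k + 1)])

filt = rng.standard_normal((3, 4))   # one filter spanning 3 words
conv_out = conv1d(seq, filt)         # shape (4,): 6 - 3 + 1 windows

def rnn(seq, W_xh, W_hh):
    """Minimal Elman-style RNN: hidden state carried across time steps."""
    h = np.zeros(W_hh.shape[0])
    for x in seq:                    # one word vector per step
        h = np.tanh(x @ W_xh + h @ W_hh)
    return h                         # final hidden state summarizes the sequence

W_xh = rng.standard_normal((4, 5))   # input-to-hidden weights (toy sizes)
W_hh = rng.standard_normal((5, 5))   # hidden-to-hidden weights
h_final = rnn(seq, W_xh, W_hh)       # shape (5,) regardless of sequence length
```

Note that feeding the RNN a longer or shorter sentence still produces a hidden state of the same size, which is exactly the property that makes recurrent models a natural fit for variable-length text.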