This is part of the CNN Architectures series by Dimitris Katsios. Find all CNN Architectures online: Notebooks: MLT GitHub | Video tutorials: YouTube | Support MLT on Patreon. DenseNet: We will use the tensorflow.keras Functional API to build DenseNet from the original paper, “Densely Connected Convolutional Networks” by Gao Huang, Zhuang Liu, Laurens van der Maaten, and Kilian Q. Weinberger. In the paper we can read: [i] “Note that … Continue reading CNN ARCHITECTURES: DENSENET
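The defining idea of DenseNet is dense connectivity: each layer inside a dense block receives the concatenated feature maps of the block input and every preceding layer, so with growth rate k the channel count grows by k per layer. A minimal NumPy sketch of that concatenation pattern (NumPy and a dummy stand-in layer are used here purely for illustration; the notebook builds the real layers with tensorflow.keras):

```python
import numpy as np

def dense_block_channels(k0, growth_rate, num_layers):
    """Input channels seen at each step of a dense block.

    Layer i receives k0 + i * growth_rate channels, because every
    preceding layer's output (growth_rate channels each) is
    concatenated with the block input.
    """
    return [k0 + i * growth_rate for i in range(num_layers + 1)]

def dense_block(x, growth_rate, num_layers):
    """Simulate dense connectivity on dummy feature maps (N, H, W, C)."""
    def layer(t):
        # Stand-in for BN-ReLU-Conv: any op mapping C_in -> growth_rate channels.
        return np.ones(t.shape[:-1] + (growth_rate,))
    for _ in range(num_layers):
        x = np.concatenate([x, layer(x)], axis=-1)  # dense connectivity
    return x

x = np.zeros((1, 8, 8, 16))            # block input with k0 = 16 channels
out = dense_block(x, growth_rate=12, num_layers=4)
print(out.shape[-1])                   # 16 + 4 * 12 = 64
```

The same linear channel growth is why DenseNet interleaves transition layers between blocks: they compress the accumulated channels before the next block starts.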
SqueezeNet: We will use the tensorflow.keras Functional API to build SqueezeNet from the original paper, “SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size” by Forrest N. Iandola, Song Han, Matthew W. Moskewicz, Khalid Ashraf, … Continue reading CNN ARCHITECTURES: SQUEEZENET
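SqueezeNet's parameter savings come from its Fire module: a 1x1 "squeeze" convolution reduces the channel count before parallel 1x1 and 3x3 "expand" convolutions. A back-of-the-envelope parameter count in plain Python (sizes follow the fire2 module from the paper's architecture table; biases are ignored for simplicity):

```python
def conv_params(k, c_in, c_out):
    """Weight count of a k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def fire_module_params(c_in, s1x1, e1x1, e3x3):
    """Fire module: 1x1 squeeze, then parallel 1x1 and 3x3 expand layers."""
    squeeze = conv_params(1, c_in, s1x1)
    expand = conv_params(1, s1x1, e1x1) + conv_params(3, s1x1, e3x3)
    return squeeze + expand

# fire2 sizes: 96 input channels -> squeeze to 16 -> expand to 64 + 64.
fire = fire_module_params(96, 16, 64, 64)
# A plain 3x3 conv producing the same 128 output channels, for comparison.
plain = conv_params(3, 96, 128)
print(fire, plain)  # 11776 vs 110592: roughly 9x fewer weights
```

The squeeze layer is doing most of the work here: the expensive 3x3 filters only ever see the 16 squeezed channels, not the full 96.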
Extract the most important information from a CNN paper and learn how to code up the architecture. This time: "Xception: Deep Learning with Depthwise Separable Convolutions"
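The operation at the heart of Xception is the depthwise separable convolution: a per-channel k x k depthwise filter followed by a 1x1 pointwise convolution, replacing a full k x k convolution. A small sketch of the resulting parameter savings (plain Python, with illustrative channel sizes rather than any specific layer from the paper):

```python
def standard_conv_params(k, c_in, c_out):
    # Every output channel mixes all input channels over a k x k window.
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    depthwise = k * k * c_in   # one k x k filter per input channel
    pointwise = c_in * c_out   # 1x1 conv then mixes the channels
    return depthwise + pointwise

k, c_in, c_out = 3, 256, 256
print(standard_conv_params(k, c_in, c_out))   # 589824
print(separable_conv_params(k, c_in, c_out))  # 67840: roughly 8.7x fewer
```

Factoring spatial filtering and channel mixing into separate steps is exactly the "extreme Inception" hypothesis the paper's title refers to.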
What's the most important implementation information in a Deep Learning paper and how do we code it up? MLT Director Dimitris Katsios shows you exactly that with a series on CNN Architectures including notebooks, visualizations and videos. To kick off the series, Dimitris picked some of the earliest Convolutional Neural Network papers. This series will … Continue reading CNN ARCHITECTURES (1-5)
When we talk about Convolutional Neural Networks, which take a fixed-size input and produce a fixed-size output, we usually think of image processing. When it comes to Natural Language Processing, Recurrent Neural Networks come into play, since they allow us to operate over sequences of vectors. Now a new trend in research is emerging, experimenting … Continue reading RNNs or CNNs for NLP?