This short course introduces the fundamentals and applications of deep learning. It covers a wide range of topics, including the basics of machine learning, deep neural networks, pre-training and generative models, Convolutional Neural Networks (CNN), Capsule Networks, Recurrent Neural Networks (RNN), and deep-learning applications. The goal of the course is to provide a conceptual understanding of deep learning, a holistic view of the field, and its latest developments. There is no machine-learning prerequisite; only a basic math background in calculus and linear algebra is required.
1. Introduction to Machine Learning: (1) basic idea of machine learning; (2) concept of artificial neural network; (3) neural network training process.
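To give a flavor of topic 1, here is a minimal, illustrative sketch (not course material) of the neural network training process on the simplest possible model: a single linear neuron fit by gradient descent. The target function y = 2x and all hyperparameters are arbitrary choices for the example.

```python
import numpy as np

# Toy data: learn y = 2x with one linear neuron and plain gradient descent.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x

w, lr = 0.0, 0.1  # initial weight and learning rate (arbitrary)
for _ in range(50):
    y_hat = w * x                         # forward pass
    grad = 2 * np.mean((y_hat - y) * x)   # gradient of mean squared error w.r.t. w
    w -= lr * grad                        # gradient-descent update
# w converges close to the true weight 2.0
```

The same forward pass / gradient / update loop, scaled up to many weights and nonlinear layers, is the training procedure covered in the course.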
2. Deep Neural Networks (DNN): (1) motivations for deep architectures; (2) input representations; (3) softmax; (4) maxout/ReLU; (5) dropout; (6) batch normalization.
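Two of the building blocks listed in topic 2 are simple enough to sketch directly. The following is an illustrative NumPy implementation (not course material) of the ReLU activation and the softmax output function; the example logits are arbitrary.

```python
import numpy as np

def relu(x):
    # ReLU keeps positive values and zeroes out negatives
    return np.maximum(0.0, x)

def softmax(z):
    # subtract the max for numerical stability, then normalize
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
# probs sums to 1 and preserves the ordering of the logits
```

Softmax turns a vector of raw scores into a probability distribution over classes, which is why it is the standard output layer for classification networks.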
3. Pre-training and Generative Models: (1) unsupervised pre-training; (2) Restricted Boltzmann Machines (RBM); (3) auto-encoders; (4) Variational Auto-Encoder (VAE); (5) Generative Adversarial Network (GAN).
4. Convolutional Neural Network (CNN): (1) concept of CNN; (2) convolution operation; (3) pooling; (4) residual network; (5) inception network; (6) network design.
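The two core operations of topic 4, convolution and pooling, can be sketched in a few lines. This is an illustrative loop-based version (not course material); practical CNN libraries implement the same operations far more efficiently, and, as is conventional in deep learning, "convolution" here is computed without flipping the kernel.

```python
import numpy as np

def conv2d(image, kernel):
    # valid (no-padding) 2-D convolution via explicit loops
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    # non-overlapping max pooling with stride equal to the window size
    h, w = x.shape[0] // size, x.shape[1] // size
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = x[i * size:(i + 1) * size,
                          j * size:(j + 1) * size].max()
    return out
```

Sliding one small kernel over the whole image is what gives CNNs their weight sharing; pooling then shrinks the feature map and adds a degree of translation invariance.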
5. Capsule Network: (1) concept; (2) CapsNet architecture; (3) dynamic routing; (4) matrix capsules with EM routing.
6. Recurrent Neural Networks (RNN): (1) structure of RNN; (2) Long Short-Term Memory (LSTM); (3) attention mechanism; (4) Word2vec; (5) other deep-learning language models.
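The structure of an RNN from topic 6 reduces to a single recurrence: the hidden state is updated from the current input and the previous state. The sketch below (illustrative only, not course material; weight shapes and the random sequence are arbitrary) shows one vanilla RNN step applied over a short sequence.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # one vanilla RNN step: new hidden state from input and previous state
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
W_xh = 0.1 * rng.normal(size=(3, 4))  # input-to-hidden weights
W_hh = 0.1 * rng.normal(size=(4, 4))  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)

h = np.zeros(4)
for x_t in rng.normal(size=(5, 3)):   # a sequence of 5 three-dim inputs
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
# h now summarizes the whole sequence in a 4-dim state
```

Because the same weights are reused at every step, plain RNNs struggle with long-range dependencies; the LSTM's gating mechanism, covered in the same topic, is the standard remedy.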
7. Deep-learning Applications: (1) computer vision; (2) language processing; (3) games; (4) finance; (5) physics and chemistry; (6) biology and medicine; (7) limitations and ethics.