Chapter 1 A: Deep learning - Evolution of DL, Pre-read fun - Stories, Personalities, and Exciting Applications
http://www.hpc.lsu.edu/training/weekly-materials/2016-Fall/machine_learning_qb2_fall_2016.pdf ( Page 1 to Page 23 )
http://cs229.stanford.edu/materials/CS229-DeepLearning.pdf
AI to AGI (Artificial General Intelligence) - https://www.mckinsey.com/business-functions/operations/our-insights/an-executive-primer-on-artificial-general-intelligence
Python IDE: Anaconda and Python 3.7 - Download Python 3.7 from https://www.python.org. Download the individual version of Anaconda from https://www.anaconda.com/products/individual
The individual version of Anaconda IDE is free.
Follow the directions for installing Python 3.7 and Anaconda (IDE). This also provides the Jupyter notebook facility, which we may use in addition to the GPU setup within Ubuntu Linux. These setups are possible on Windows as well, but I will be using only Ubuntu for the entire course.
The key book we will be using in this "Reading and Doing Lab Work Series" is Deep Learning with Python, by Francois Chollet. Francois Chollet is our guru, providing extensive notes, and we are indebted to him; because copy and paste is so easy, we should not forget that most of the material I will cover is from this book, sometimes down to the dotting of the "i"s and crossing of the "t"s. Buy the book if you want to master this class; it is a great resource. I have this book in my library, along with his book 'Deep Learning with R'.
https://livebook.manning.com/book/deep-learning-with-python/chapter-1/
Python jupyter notebooks from the book are available here: https://github.com/fchollet/deep-learning-with-python-notebooks
- High-level definitions of fundamental concepts
- Timeline of the development of machine learning
- Key factors behind deep learning’s rising popularity and future potential
Chapter 2: Mathematical building blocks of deep learning - tensors, the derivative of a derivative of a derivative of … (the internal computed quantity at the essence of backpropagating the errors), minimizing the error (loss) function, gradient descent, and stochastic gradient descent
- A first example of a neural network
- Tensors and tensor operations
- How neural networks learn via backpropagation and gradient descent
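To make the gradient-descent idea concrete, here is a minimal sketch in plain NumPy (not code from the book): fitting a one-parameter-per-weight linear model y = w·x + b by computing the derivatives of a mean-squared-error loss via the chain rule and stepping against them. The data, learning rate, and step count are illustrative choices.

```python
import numpy as np

# Toy data generated from y = 2x + 1 (no noise, for clarity)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

w, b = 0.0, 0.0   # parameters to learn
lr = 0.1          # learning rate (step size)

for _ in range(500):
    y_pred = w * x + b                  # forward pass
    loss = np.mean((y_pred - y) ** 2)   # mean squared error
    # Backward pass: derivatives of the loss w.r.t. w and b (chain rule)
    grad_w = np.mean(2 * (y_pred - y) * x)
    grad_b = np.mean(2 * (y_pred - y))
    # Gradient descent: step against the gradient
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges toward w=2, b=1
```

A real network does exactly this, only with millions of parameters, mini-batches of data (stochastic gradient descent), and gradients computed automatically by the framework.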
The following PPT will be used for both Module 2 and Module 3
- Core components of neural networks
- An introduction to Keras
- Using neural networks to solve basic classification and regression problems
http://helper.ipam.ucla.edu/publications/eltut/eltut_14764.pdf
https://www.cs.tau.ac.il/~dcor/Graphics/pdf.slides/YY-Deep%20Learning.pdf
- Setting up a deep-learning workstation - https://youtu.be/tutZppdMU5E
- The commands text file - https://github.com/InstituteOfAnalyticsUSA/Essential-DeepLearning-With-Python/blob/main/aws%20jupyter%20configuration%20commands.txt
- Presentation deck - https://github.com/InstituteOfAnalyticsUSA/Essential-DeepLearning-With-Python/blob/main/Deep%20Learning%20Setup%20in%20AWS%20Environment.pdf
- Forms of machine learning beyond classification and regression
- Formal evaluation procedures for machine-learning models
- Preparing data for deep learning
- Feature engineering (it is not just variable transformation; it is deeper than that: feature definition depends on data structure, data type, and data format in addition to transformation, plus special context-sensitive measures, grounded in our mathematical, statistical, computational, and algorithmic principles, that help achieve the prediction)
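As a small illustration of context-sensitive feature engineering (my own toy example, not from the book): a raw timestamp string is just text, but its data type and format let us derive features a model can actually use. Hours 23 and 0 are numerically far apart yet one minute apart on the clock, so encoding the hour on a circle preserves that closeness. The helper name `clock_features` is hypothetical.

```python
from datetime import datetime
import math

def clock_features(timestamp: str):
    """Turn a raw ISO timestamp string into model-ready features.

    The hour is encoded on a circle (sin/cos) so that 23:00 and 00:00
    end up close together, unlike the raw integers 23 and 0.
    """
    t = datetime.fromisoformat(timestamp)
    angle = 2 * math.pi * t.hour / 24
    return {
        "hour_sin": math.sin(angle),
        "hour_cos": math.cos(angle),
        "is_weekend": 1 if t.weekday() >= 5 else 0,
    }

print(clock_features("2020-11-15T23:30:00"))  # a Sunday night
```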
- Tackling overfitting using the following concepts: train/validation/test splits, cross-validation, and the bias-variance trade-off
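The cross-validation idea can be sketched in a few lines of NumPy (an illustrative helper, `k_fold_indices`, of my own naming): shuffle the sample indices, cut them into k folds, and let each fold serve once as the validation set while the rest form the training set.

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Yield (train, validation) index arrays for k-fold cross-validation.

    Each of the k folds is used exactly once as the validation set;
    the remaining k-1 folds form the training set.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

for train, val in k_fold_indices(10, 5):
    pass  # train/evaluate a model on each split, then average the k scores
```

Averaging the k validation scores gives a less noisy estimate of generalization than a single train/validation split, which is the point of the technique.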
- The universal workflow for approaching machine-learning problems
The presentation deck presented in the following YouTube video - https://github.com/InstituteOfAnalyticsUSA/Essential-DeepLearning-With-Python/blob/main/Machine%20Learning%20Foundations_MeetUp_17NOV2020.pptx
- Understanding convolutional neural networks (convnets)
- Using data augmentation to mitigate overfitting
- Using a pre-trained convnet to do feature extraction
- Fine-tuning a pre-trained convnet
- Visualizing what convnets learn and how they make classification decisions
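Before diving into Keras convnets, it helps to see the core operation a convolutional layer performs. The sketch below (plain NumPy, my own illustrative helper `conv2d_valid`, not the book's code) slides a small kernel over an image and sums elementwise products; with a hand-made edge-detector kernel, the output lights up exactly where pixel intensity changes.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide a kernel over an image and sum elementwise products:
    the core operation of a convolutional layer on one channel
    (no padding, stride 1 - a 'valid' convolution)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            patch = image[r:r + kh, c:c + kw]
            out[r, c] = np.sum(patch * kernel)
    return out

# A vertical-edge detector: responds where intensity changes left-to-right
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
edge_kernel = np.array([[-1.0, 1.0]])
print(conv2d_valid(image, edge_kernel))  # nonzero only at the edge column
```

A convnet learns kernels like this automatically, stacking many of them per layer, which is why visualizing the learned filters (last bullet above) is so revealing.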
- Preprocessing text data into useful representations
- Working with recurrent neural networks
- Using 1D convnets for sequence processing
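The first of these topics, turning text into useful representations, can be sketched in pure Python. The helpers below are my own simplified illustration of the word-index approach (the style of tokenizer Keras provides): build a vocabulary mapping each word to an integer, then replace words with their indices, reserving 0 for out-of-vocabulary words.

```python
def build_vocab(texts):
    """Assign each distinct word an integer index, starting at 1;
    index 0 is reserved for padding / unknown words."""
    vocab = {}
    for text in texts:
        for word in text.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab) + 1
    return vocab

def texts_to_sequences(texts, vocab):
    """Replace each word with its index (0 if out of vocabulary)."""
    return [[vocab.get(w, 0) for w in t.lower().split()] for t in texts]

texts = ["The cat sat", "the dog sat"]
vocab = build_vocab(texts)
print(texts_to_sequences(texts, vocab))  # [[1, 2, 3], [1, 4, 3]]
```

These integer sequences are what an Embedding layer, an RNN, or a 1D convnet actually consumes.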
- The Keras functional API
- Using Keras callbacks
- Working with the TensorBoard visualization tool
- Important best practices for developing state-of-the-art models
- Text generation with LSTM
- Implementing DeepDream
- Performing neural style transfer
- Variational autoencoders
- Understanding generative adversarial networks
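One small, self-contained piece of the text-generation topic is temperature re-weighting of a probability distribution before sampling the next character or word. The NumPy sketch below is my own illustration of that idea: low temperature sharpens the distribution (conservative, repetitive text), high temperature flattens it (more surprising, riskier text).

```python
import numpy as np

def reweight_with_temperature(probs, temperature):
    """Re-weight a probability distribution before sampling.

    temperature < 1 sharpens the distribution; temperature > 1
    flattens it. The max is subtracted before exponentiating for
    numerical stability.
    """
    logits = np.log(np.asarray(probs, dtype=float))
    logits /= temperature
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

p = [0.7, 0.2, 0.1]
print(reweight_with_temperature(p, 0.5))  # sharper than p
print(reweight_with_temperature(p, 2.0))  # flatter than p
```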
http://www.hpc.lsu.edu/training/weekly-materials/2016-Fall/machine_learning_qb2_fall_2016.pdf
Boltzmann machine (BM) and Restricted BM (RBM) - Edwin Chen's blog (key, and it is the first reference)
http://blog.echen.me/2011/07/18/introduction-to-restricted-boltzmann-machines/
https://github.com/echen/restricted-boltzmann-machines
Further reading recommended in that post:
- "A Practical Guide to Training Restricted Boltzmann Machines," by Geoffrey Hinton
- A talk by Andrew Ng on Unsupervised Feature Learning and Deep Learning
- "Restricted Boltzmann Machines for Collaborative Filtering" - a hard read, but an interesting application to the Netflix Prize
- "Geometry of the Restricted Boltzmann Machine" - a very readable introduction to RBMs, "starting with the observation that its Zariski closure is a Hadamard power of the first secant variety of the Segre variety of projective lines". (I kid, I kid.)
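In the spirit of Edwin Chen's post, here is a minimal NumPy sketch of one contrastive-divergence (CD-1) training step for a binary RBM. This is my own simplified illustration (biases omitted for brevity, and the helper name `cd1_update` is hypothetical), not code from the post or from Hinton's guide.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v0, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.

    Gibbs chain: visible -> hidden -> reconstructed visible -> hidden.
    Weights are nudged toward the data statistics (positive phase)
    and away from the reconstruction statistics (negative phase).
    Biases are omitted to keep the sketch short.
    """
    h0_prob = sigmoid(v0 @ W)                   # hidden activations from data
    h0 = (rng.random(h0_prob.shape) < h0_prob)  # sample binary hidden states
    v1_prob = sigmoid(h0 @ W.T)                 # reconstruct visible units
    h1_prob = sigmoid(v1_prob @ W)              # hidden activations from reconstruction
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob)
    return W

W = rng.normal(scale=0.1, size=(6, 2))          # 6 visible units, 2 hidden units
v = np.array([[1.0, 1.0, 1.0, 0.0, 0.0, 0.0]])  # one training pattern
for _ in range(100):
    W = cd1_update(W, v)
```

After these updates, the RBM's reconstruction of the training pattern favors the units that were on, which is the whole point of the positive-minus-negative-phase rule.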