### Deep learning

- A class of models most often applied to supervised learning problems.
- Good at capturing interactions and non-linearities.

#### Useful Concepts

- Softmax – Useful for converting model scores into probabilities.
  - Multiplying the scores by a large constant `n` pushes the softmax outputs closer to 0 and 1.
  - Dividing the scores by `n` makes the softmax outputs resemble a uniform distribution.
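
This scaling effect can be sketched with NumPy (a minimal illustration; the score vector and the factor `n = 10` are arbitrary choices):

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability before exponentiating.
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])
print(softmax(scores))       # moderately peaked distribution
print(softmax(scores * 10))  # sharpened: outputs pushed toward 0 and 1
print(softmax(scores / 10))  # flattened: outputs close to uniform
```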

- Cross Entropy
  - Useful for measuring the distance between model output probabilities and one-hot encoded class labels.
  - D(scores, labels) = -sum_i (label_i * log(score_i))
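
A minimal sketch of this distance in NumPy (the probability vectors below are made-up examples):

```python
import numpy as np

def cross_entropy(scores, labels):
    # scores: predicted probabilities (e.g. softmax output); labels: one-hot vector.
    return -np.sum(labels * np.log(scores))

labels = np.array([0.0, 1.0, 0.0])     # true class is index 1
confident = np.array([0.1, 0.8, 0.1])  # mass on the correct class
wrong = np.array([0.8, 0.1, 0.1])      # mass on an incorrect class
print(cross_entropy(confident, labels))  # ~0.22 (small distance)
print(cross_entropy(wrong, labels))      # ~2.30 (large distance)
```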

- Normalize the inputs to zero mean and equal variance.
- Sample the initial parameters from a normal distribution with zero mean and a small sigma, so that the output scores are small, which in turn means the softmax outputs resemble a uniform distribution.
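
A rough illustration of why small initial weights yield near-uniform softmax outputs (the layer sizes and sigma = 0.01 here are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_classes = 100, 10
x = rng.normal(0.0, 1.0, size=n_inputs)                # zero-mean, unit-variance input
W = rng.normal(0.0, 0.01, size=(n_classes, n_inputs))  # small-sigma initial weights

def softmax(s):
    e = np.exp(s - s.max())
    return e / e.sum()

p = softmax(W @ x)
print(p)  # every class probability close to 1/10: near-uniform
```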

## Deep Learning Fundamentals

- Learning approaches: Backpropagation
  - Blog post explaining it by example.
  - Learning representations by back-propagating errors
    - This paper popularized back-propagation as a method to learn the weights of a neural network.
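
A toy illustration of the idea (a one-weight example of my own with squared loss, not taken from the paper):

```python
# Backpropagation on a single linear unit: compute dloss/dw via the chain
# rule, then update the weight by gradient descent.
x, y = 2.0, 1.0   # input and target (made-up values)
w = 0.0           # weight to learn
lr = 0.1          # learning rate
for _ in range(50):
    y_hat = w * x                  # forward pass
    loss = 0.5 * (y_hat - y) ** 2  # squared error
    grad = (y_hat - y) * x         # backward pass: dloss/dw
    w -= lr * grad                 # gradient descent step
print(w)  # approaches y / x = 0.5
```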

- Activation Function: ReLU
  - Rectified Linear Units Improve Restricted Boltzmann Machines
    - Proposed the ReLU activation function to mitigate the vanishing gradient problem.
- Dropout (approach to reduce overfitting)
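
The vanishing-gradient point can be illustrated by comparing derivatives (a sketch of my own, not from the paper):

```python
import numpy as np

def sigmoid_grad(x):
    # Derivative of the sigmoid: s(x) * (1 - s(x)); shrinks toward 0 for large |x|.
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise.
    return (x > 0).astype(float)

x = np.array([-5.0, 0.5, 5.0])
print(sigmoid_grad(x))  # gradients near 0 at the extremes
print(relu_grad(x))     # 0 for negative inputs, 1 for positive
```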

## Image Recognition

**2015** – Deep Residual Learning for Image Recognition [ResNet]
- Allowed for training "deep" NNs (~152 layers)
- State-of-the-art performance on the ImageNet dataset

**2014** – Going deeper with convolutions [GoogLeNet]
- Introduced the "Inception" module

**2012** – ImageNet Classification with Deep Convolutional Neural Networks [AlexNet]
- State-of-the-art performance on the ImageNet dataset using a CNN

**1998** – Gradient-Based Learning Applied to Document Recognition [LeNet]
- State-of-the-art performance on the MNIST digit recognition dataset using CNNs
