@samklr #devoxx #deepLearning
Multiple Definitions:
- A set of algorithms that model high-level abstractions in data using
architectures composed of multiple nonlinear transformations.
- A set of machine learning algorithms that automatically learn feature
hierarchies.
- Usually built using neural networks.
- Representation Learning: automatically learning good representations
of the data for your classifier, i.e. learning good features.
- Deep Learning: learning multiple levels of representation with
multi-layer architectures.
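The "multiple nonlinear transformations" idea can be sketched in a few lines: each layer is a weighted sum followed by a nonlinearity, and stacking layers composes these transformations into higher-level representations. The weights and input below are made-up toy values, not from the talk.

```python
import math

def layer(x, weights, bias):
    # One layer: weighted sum of inputs followed by a nonlinearity (tanh)
    return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, bias)]

# Toy 2-input network: two stacked nonlinear layers (arbitrary weights)
x = [0.5, -1.0]
h = layer(x, [[0.3, -0.2], [0.1, 0.4]], [0.0, 0.1])  # first-level representation
y = layer(h, [[0.7, -0.5]], [0.2])                   # higher-level representation
print(y)
```

Each successive layer re-represents the previous layer's output, which is exactly the "hierarchy of features" the definitions above describe.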
Stochastic Gradient Descent and
Backpropagation
- Stochastic gradient descent: iteratively
adjust each weight toward the value
where the error reaches a minimum.
- Backpropagation: compute the error at
the output, then propagate it back
through the network to update the
weights during the training phase.
Problems with Large Networks
- Vanishing gradient: more layers break backpropagation. As the
error is passed back, the gradient starts to vanish and
becomes small compared to the weights.
- Overfitting: the algorithm fits the training data too closely, but
fails miserably on unseen examples.
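The vanishing gradient can be seen numerically: backpropagation multiplies one activation-derivative factor per layer, and the sigmoid's derivative is at most 0.25, so the product shrinks geometrically with depth. This sketch assumes sigmoid activations and ignores the weight factors for simplicity.

```python
import math

def sigmoid_deriv(z):
    s = 1.0 / (1.0 + math.exp(-z))
    return s * (1.0 - s)  # at most 0.25, reached at z = 0

# Backprop through 20 sigmoid layers: multiply one derivative per layer
grad = 1.0
for _ in range(20):
    grad *= sigmoid_deriv(0.0)  # best case: 0.25 each time
print(grad)  # 0.25**20, about 9e-13: effectively zero for early layers
```

This is why early layers of a deep sigmoid network barely learn, and why later remedies (ReLU activations, careful initialization, layer-wise pretraining) mattered so much.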
This led to another AI winter.
Why Does It Suddenly Work?
- A lot more labeled data
- More compute power (CPUs and GPUs)
- Clever new ideas on how to train deep
architectures
Recurrent Neural Networks (RNN)
- Learn from arbitrary sequential inputs by using their
internal state
- Cannot look far back (backpropagation through time is limited)
- Long Short-Term Memory (LSTM) networks help solve this
- Good for NLP, handwriting recognition, etc.
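The "internal state" bullet can be sketched with a scalar vanilla RNN cell: each step mixes the current input with the previous hidden state, so earlier inputs influence later outputs. The sequence and weights below are made-up toy values.

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    # One recurrent step: new state combines current input and previous state
    return math.tanh(w_x * x + w_h * h + b)

# Process a toy sequence, carrying the hidden state forward
h = 0.0
for x in [1.0, 0.5, -1.0]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)  # arbitrary weights
print(h)
```

Because gradients flowing back through many such steps shrink just as in deep feed-forward nets, plain RNNs "cannot look far back"; LSTM cells add gated memory paths to carry information across longer spans.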
Other Techniques
- Autoencoders
- Restricted Boltzmann Machines (RBM)
- Deep Belief Networks
- Hierarchical Temporal Memory (HTM)
- LSTM
- And many more...