2. Neuron Model
1899 - Neuron discovered by Santiago Ramón y Cajal
[Diagram: a neuron with inputs 1-3, each multiplied by its weight (weights 1-3), summed and passed through an activation function to produce the output]
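The neuron in the diagram can be sketched in a few lines of Python (a minimal illustration; the input values, weights, bias, and the choice of tanh as the activation are made up for the example):

```python
import numpy as np

def neuron(inputs, weights, bias, activation=np.tanh):
    """Weighted sum of the inputs plus a bias, passed through an activation."""
    return activation(np.dot(inputs, weights) + bias)

# Three inputs and three weights, matching the diagram (illustrative values)
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.6, -0.1])
out = neuron(x, w, bias=0.1)   # tanh(0.2 - 0.6 - 0.2 + 0.1) = tanh(-0.5)
```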
1958 - Perceptron. Frank Rosenblatt
1986 - Backpropagation optimization
2010 - Recurrent and deep feed-forward nets
2012+ ….
3. ImageNet Challenge
150,000 test images across 1,000 object classes
Models submit their top-5 predictions; the metric is the top-5 error rate, %
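The top-5 metric can be computed as follows (a sketch with a toy 10-class example; the scores and labels are random placeholders, not real ImageNet data):

```python
import numpy as np

def top5_error(scores, labels):
    """Fraction of samples whose true label is NOT among the
    five highest-scoring class predictions."""
    top5 = np.argsort(scores, axis=1)[:, -5:]   # indices of the 5 best classes
    hits = [label in row for row, label in zip(top5, labels)]
    return 1.0 - np.mean(hits)

# Toy example: 2 samples, 10 classes
rng = np.random.default_rng(0)
scores = rng.random((2, 10))
labels = np.array([3, 7])
err = top5_error(scores, labels)
```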
Deep learning over the last 10 years:
- AlexNet (2012) - 5 convolutional layers; kick-started the industry boom
- ZF Net (2013) - 5 layers, an improved AlexNet
- VGG Net (2014) - Oxford, 19 layers
- GoogLeNet (2015) - 22 layers
In 2016:
- NVIDIA DGX-1 system: ~170 TFLOPS
- Intel® Xeon Phi™ 7210: ~3 TFLOPS
4. Top AI scientists
Geoffrey E. Hinton
- University of Toronto
- AlexNet co-author
- deep-learning researcher

Yann LeCun
- Facebook AI research group
- working on AI since 1998

Andrew Ng
- Chief Scientist of Baidu
- co-founder of Coursera
- professor at Stanford University
6. Layer-wise organization
Neurons are organized into fully-connected layers
The input layer is not counted (the pictured network is 3-layered)
The output layer has no activation function
4 + 4 + 1 = 9 neurons
[3 x 4] + [4 x 4] + [4 x 1] = 12 + 16 + 4 = 32 weights
4 + 4 + 1 = 9 biases
∑ = 41 learnable params
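The arithmetic for the pictured 3-4-4-1 network can be checked in a few lines (the layer sizes are taken from the slide):

```python
# Layer sizes from the slide: 3 inputs -> 4 -> 4 -> 1 output
sizes = [3, 4, 4, 1]

# A weight connects every neuron in one layer to every neuron in the next
weights = sum(a * b for a, b in zip(sizes, sizes[1:]))  # 12 + 16 + 4 = 32
biases = sum(sizes[1:])                                 # 4 + 4 + 1 = 9
total = weights + biases                                # 41 learnable params
```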
Modern NNs have ~100 million parameters and ~10-20 layers
Example: the Visual Geometry Group Network (Oxford) has 19 layers and 138 million parameters to learn
8. ReLU Rocks !
ReLUs (solid line) reach a 25% training error rate on CIFAR-10 six times faster
than an equivalent network with tanh neurons (dashed line)
- Alex Krizhevsky
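For reference, the two activations compared in the plot (a minimal numpy sketch; the sample points are arbitrary):

```python
import numpy as np

def relu(x):
    """max(0, x): does not saturate for positive inputs, so gradients
    stay large and training tends to converge faster than with tanh."""
    return np.maximum(0.0, x)

x = np.linspace(-2.0, 2.0, 5)   # [-2, -1, 0, 1, 2]
r = relu(x)                     # [0, 0, 0, 1, 2]
t = np.tanh(x)                  # saturates toward ±1 for large |x|
```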
22. @ Google
tensorflow.org launched in Nov 2015.
- the most popular ML library
- GitHub: 35,000 stars, 15,000 forks
- 350 contributors
23. TensorFlow
APIs:
- Python API
- C++ API (poorly documented)
- Java API (TBA in 201X)
Features:
- runs on one or many CPUs, one or many GPUs
- asynchronous computation with lazy evaluation of the execution graph
- many algorithms already implemented
- TensorBoard: execution-graph visualization and debugging
import tensorflow as tf

# Build a constant node, then evaluate it in a session (TF 1.x style)
hello = tf.constant('Hello, TensorWorld!')
sess = tf.Session()
print(sess.run(hello))