Deep Learning, an interactive introduction for NLP-ers

Roelof Pieters (cofounder creative.ai)
@graphific
Introduction to Deep Learning for NLP
22 January 2015
Stockholm Natural Language Processing Meetup
FEEDA
Slides at:

http://www.slideshare.net/roelofp/220115dlmeetup
1
Deep Learning ???
2
A couple of headlines… [all November ’14]
3
(source: Google Trends)
4
Machine Learning ??
- Audience Check -
5
• “Brain” inspired / simulations:
• vision: make learning algorithms better and easier to use
• goal: revolutions in (practical) advances for machine learning and AI
• Deep Learning = subfield of Machine Learning
Deep Learning ??
6
Biological Inspiration
7
Deep Learning ??
8
DL: Impact
9
Speech Recognition
DL: Impact
10
Deep Learning for the win!
a few examples:
• IJCNN 2011 Traffic Sign Recognition Competition
• ISBI 2012 Segmentation of neuronal structures in EM stacks challenge
• ICDAR 2011 Chinese handwriting recognition
• Deals with “construction and study of systems that can
learn from data”
Machine Learning ??
A computer program is said to learn from experience (E) with respect to some class of tasks (T) and performance measure (P), if its performance at tasks in T, as measured by P, improves with experience E.
— T. Mitchell 1997
11
Machine Learning ??
Traditional Programming: Data + Program → Output
Machine Learning: Data + Output → Program
12
Supervised (inductive) learning
• Training data includes desired outputs
Unsupervised learning
• Training data does not include desired outputs
Semi-supervised learning
• Training data includes a few desired outputs
Reinforcement learning
• Rewards from sequence of actions
Types of Learning
13
ML: Traditional Approach
For each new problem/question:
1. Gather as much LABELED data as you can get
2. Throw some algorithms at it (mainly put in an SVM and keep it at that)
3. If you actually have tried more algos: pick the best
4. Spend hours hand-engineering some features / feature selection / dimensionality reduction (PCA, SVD, etc.)
5. Repeat…
14
Machine Learning for NLP
Classic Approach: Data is fed into a learning algorithm:
Data → Learning Algorithm
15
Machine Learning for NLP
some of the (many) treebank datasets
source: http://www-nlp.stanford.edu/links/statnlp.html#Treebanks
16
Penn Treebank
That’s a lot of “manual” work:
17
• the students went to class
DT NN VB P NN
• plays well with others
VB ADV P NN
NN NN P DT
• fruit flies like a banana
NN NN VB DT NN
NN VB P DT NN
NN NN P DT NN
NN VB VB DT NN
With a lot of issues:
Penn Treebank
18
Machine Learning for NLP
train set: Data → “Features” → Learning Algorithm → Prediction/Classifier
test set: Data → “Features” → Classifier → Prediction
19
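To make the classic pipeline concrete, here is a minimal sketch in scikit-learn; the toy corpus, its labels, and the choice of CountVectorizer plus LogisticRegression are illustrative assumptions, not part of the original slides:

```python
# Data -> "Features" -> Learning Algorithm -> Classifier -> Prediction,
# with a train/test split, as in the pipeline diagram above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

docs = ["great movie", "terrible plot", "loved it",
        "awful acting", "wonderful film", "boring and bad"]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative (toy labels)

train_docs, test_docs, train_y, test_y = train_test_split(
    docs, labels, test_size=2, random_state=0)

clf = make_pipeline(CountVectorizer(),      # hand-chosen feature extraction
                    LogisticRegression())   # the learning algorithm
clf.fit(train_docs, train_y)                # train set -> classifier
print(clf.predict(test_docs), test_y)       # predictions on the test set
```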
Machine Learning for NLP
• Until the early 1990s, NLP systems were built manually
with hand-crafted dictionaries and rules.
• As large electronic text corpora became increasingly
available, researchers began using machine learning
techniques to automatically build NLP systems.
• Today, the vast majority of NLP systems use machine
learning.
21
2. Neural Networks

and a short history lesson
22
Perceptron (1957)
Frank Rosenblatt 

(1928-1971)
Original Perceptron
Simplified model:
(From Perceptrons by M. L. Minsky and S. Papert,
1969, Cambridge, MA: MIT Press. Copyright 1969
by MIT Press.)
23
Perceptron (1957)
Perceptron Research, youtube clip: 

https://www.youtube.com/watch?v=cNxadbrN_aI&feature=youtu.be&t=12
24
Perceptron (1957)
25
Multilayer Perceptron (1986)
[Diagram: a neuron’s inputs, weights, bias, and activation]
26
Neuron Model
All you need to know:
27
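As a minimal sketch of the model pictured here, assuming the logistic sigmoid as the activation (the slide’s actual choice is not recoverable from the extraction):

```python
import numpy as np

def neuron(x, w, b):
    """One artificial neuron: weighted sum of inputs plus bias,
    passed through a nonlinearity (here: logistic sigmoid)."""
    z = np.dot(w, x) + b              # pre-activation: w . x + b
    return 1.0 / (1.0 + np.exp(-z))   # activation

x = np.array([0.5, -1.0, 2.0])  # inputs (illustrative values)
w = np.array([0.1, 0.4, -0.2])  # weights
b = 0.05                        # bias
print(neuron(x, w, b))          # output in (0, 1)
```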
Activation functions
28
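The plots on this slide did not survive extraction; below is a small sketch of three activations commonly shown in such overviews (exactly which ones the slide plots is an assumption):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes to (0, 1)

def tanh(z):
    return np.tanh(z)                 # squashes to (-1, 1), zero-centered

def relu(z):
    return np.maximum(0.0, z)         # zero for negatives, identity otherwise

z = np.linspace(-3.0, 3.0, 7)
print(sigmoid(z), tanh(z), relu(z), sep="\n")
```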
Backpropagation (1974/1986)
1974: Paul Werbos invents the backpropagation algorithm for NNs
1986: Backprop popularized by Rumelhart, Hinton, Williams
1990: Renewed interest in NNs
29
Backprop Renaissance
Forward Propagation
• Sum inputs, produce activation, feed-forward
30
Backprop Renaissance
Back Propagation (of error)
• Calculate total error at the top
• Calculate contributions to error at each step going
backwards
31
• Compute gradient of example-wise loss wrt parameters
• Simply applying the derivative chain rule wisely
• If computing the loss (example, parameters) is O(n) computation, then so is computing the gradient
Backpropagation
32
Simple Chain Rule
33
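The formula on this slide is an image in the original; the standard statement it presumably shows, for a composition z = f(y) with y = g(x), is:

```latex
\frac{\partial z}{\partial x} = \frac{\partial z}{\partial y}\,\frac{\partial y}{\partial x}
```

Backpropagation is just this rule applied repeatedly: each layer’s gradient reuses the gradient already computed for the layer above it, which is why the backward pass costs no more than the forward one.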
Training procedure
To reiterate:
• Initialize randomly
• Sequentially give it data
• See what the difference is between network output and actual output
• Update the weights according to this error
• End result: give a model input, and it produces a proper output
Quest for the weights. The weights are the model!
34
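A hedged end-to-end sketch of this procedure in numpy: XOR serves as stand-in data, with one sigmoid hidden layer; the learning rate and step count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (XOR): input/output pairs to "sequentially give it data"
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Initialize randomly: the weights are the model!
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward: sum inputs, produce activations, feed forward
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Difference between network output and actual output
    err = out - y
    # Backward: contributions to the error at each step, going backwards
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Update the weights according to this error
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(3))  # end result: should approach [0, 1, 1, 0]
```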
So why only now?
• Inspired by the architectural depth of the brain, researchers had for decades wanted to train deep multi-layer neural networks.
• No successful attempts were reported before 2006… Exception: convolutional neural networks (LeCun, 1998)
• SVM: Vapnik and his co-workers developed the
Support Vector Machine (1993) (shallow
architecture).
• Breakthrough in 2006!
35
2006 Breakthrough
• More data
• Faster hardware: GPU’s, multi-core CPU’s
• Working ideas on how to train deep architectures
36
2006 Breakthrough
Stacked Restricted Boltzmann Machines* (RBM)
Hinton, G. E, Osindero, S., and Teh, Y. W. (2006).

A fast learning algorithm for deep belief nets.

Neural Computation, 18:1527-1554.
Stacked Autoencoders (AE)
Bengio, Y., Lamblin, P., Popovici, P., Larochelle, H. (2007).

Greedy Layer-Wise Training of Deep Networks,

Advances in Neural Information Processing Systems 19
* called Deep Belief Networks (DBN)
42
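As a rough illustration of the greedy layer-wise idea: train one ordinary autoencoder on the data, then a second one on the codes it produces. The numpy sketch below uses random stand-in data and makes no claim to match the cited papers’ exact training details (no contrastive divergence, no denoising):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((100, 20))  # stand-in data

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_autoencoder(X, n_hidden, lr=0.5, steps=2000):
    """One autoencoder layer: encode, decode, minimize reconstruction error."""
    n_in = X.shape[1]
    W  = rng.normal(0, 0.1, (n_in, n_hidden)); b  = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, n_in)); b2 = np.zeros(n_in)
    for _ in range(steps):
        h = sigmoid(X @ W + b)        # encode
        r = sigmoid(h @ W2 + b2)      # decode (reconstruct the input)
        d_r = (r - X) * r * (1 - r)   # gradient of squared reconstruction error
        d_h = (d_r @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_r / len(X); b2 -= lr * d_r.mean(axis=0)
        W  -= lr * X.T @ d_h / len(X); b  -= lr * d_h.mean(axis=0)
    return W, b

# Greedy stacking: layer 1 on the raw data, layer 2 on layer 1's codes
W1, b1 = train_autoencoder(X, 10)
H1 = sigmoid(X @ W1 + b1)
W2, b2 = train_autoencoder(H1, 5)
```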
3. Deep Learning

onwards we go…
43
44
Why go Deep ?
Hierarchies · Efficient · Generalization · Distributed · Sharing · Unsupervised* · Much Data · Black Box · Training Time · Major PWNAGE!
45
No More Handcrafted Features !
46
— Andrew Ng
“I’ve worked all my life in
Machine Learning, and I’ve
never seen one algorithm knock
over benchmarks like Deep
Learning”
Deep Learning: Why?
47
Biological Justification
Deep Learning = Brain “inspired”

Audio/Visual Cortex has multiple stages == Hierarchical
• Computational Biology • CVAP
• Jorge Dávila-Chacón
• “that guy”
“Brainiacs” vs “Pragmatists”
48
Different Levels of Abstraction
49
Hierarchical Learning
• Natural progression from low level to high level structure, as seen in natural complexity
• Easier to monitor what is being learnt and to guide the machine to better subspaces
• A good lower level representation can be used for many distinct tasks
Different Levels of Abstraction
Feature Representation
53
Generalizable Learning
• Shared Low Level Representations
• Multi-Task Learning
• Unsupervised Training
• Partial Feature Sharing
• Mixed Mode Learning
• Composition of Functions
55
Classic Deep Architecture
Input layer
Hidden layers
Output layer
56
Modern Deep Architecture
Input layer
Hidden layers
Output layer
57
Deep Learning: Why? (again)
Beat state of the art in many areas:
• Language Modeling (2012, Mikolov et al)
• Image Recognition (Krizhevsky won the 2012 ImageNet competition)
• Sentiment Classification (2011, Socher et al)
• Speech Recognition (2010, Dahl et al)
• MNIST hand-written digit recognition (Ciresan et al, 2010)
58
One Model rules them all ?
DL approaches have been successfully applied to:
Deep Learning: Why for NLP ?
Automatic summarization Coreference resolution Discourse analysis
Machine translation Morphological segmentation Named entity recognition (NER)
Natural language generation
Natural language understanding
Optical character recognition (OCR)
Part-of-speech tagging
Parsing
Question answering
Relationship extraction
Sentence boundary disambiguation
Sentiment analysis
Speech recognition
Speech segmentation
Topic segmentation and recognition
Word segmentation
Word sense disambiguation
Information retrieval (IR)
Information extraction (IE)
Speech processing
59
- COFFEE BREAK -
after the break we return with: CODE
Download the code samples already now from:
https://github.com/graphific/DL-Meetup-intro
shortened url: http://goo.gl/abX1E2
 60
1. MLP
• Deep Neural Network
• Multilayer Perceptron (MLP) or Artificial Neural Network (ANN)
Logistic regression
Training regime: Stochastic Gradient Descent (SGD) with minibatches
MNIST dataset
Simple hidden layer
61
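The runnable demo lives in the linked repo; as a self-contained stand-in with the same ingredients (simple hidden layer, softmax/logistic-regression output, SGD with minibatches), here is a numpy sketch where random MNIST-shaped arrays replace the real dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# MNIST-shaped stand-in: 784-dim inputs, 10 classes (real loading omitted)
X = rng.random((1000, 784))
y = rng.integers(0, 10, size=1000)

n_hidden, lr, batch = 64, 0.1, 50
W1 = rng.normal(0, 0.01, (784, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.01, (n_hidden, 10)); b2 = np.zeros(10)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(3):
    order = rng.permutation(len(X))
    for i in range(0, len(X), batch):            # SGD with minibatches
        idx = order[i:i + batch]
        xb, yb = X[idx], y[idx]
        h = np.tanh(xb @ W1 + b1)                # simple hidden layer
        p = softmax(h @ W2 + b2)                 # logistic regression on top
        d2 = p.copy()
        d2[np.arange(len(yb)), yb] -= 1          # gradient of cross-entropy
        d2 /= len(yb)
        d1 = (d2 @ W2.T) * (1 - h ** 2)
        W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
        W1 -= lr * xb.T @ d1; b1 -= lr * d1.sum(axis=0)
```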
2. Convolutional Neural Network
62
from: Krizhevsky, Sutskever, Hinton. (2012). ImageNet Classification with Deep Convolutional Neural Networks
[breakthrough in object recognition, Imagenet 2012]
Convolutional Neural Network
http://ufldl.stanford.edu/wiki/index.php/
Feature_extraction_using_convolution
movie time:
http://www.cs.toronto.edu/~hinton/adi/index.htm
63
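The feature-extraction step behind the UFLDL link can be sketched in a few lines of numpy; the toy image and edge kernel are illustrative (a conv net learns many such kernels from data):

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """'Valid' 2-D convolution: slide the flipped kernel over the image,
    taking a dot product at each position to build a feature map."""
    kh, kw = kernel.shape
    k = kernel[::-1, ::-1]  # flip for true convolution
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * k)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)  # toy 5x5 "image"
edge = np.array([[1.0, 0.0, -1.0]] * 3)           # vertical-edge detector
print(convolve2d_valid(image, edge))              # 3x3 feature map
```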
That’s it, no more code! (for now)
64
Deep Learning: Future Developments
Currently an explosion of developments
• Hessian-Free networks (2010)
• Long Short-Term Memory (2011)
• Large Convolutional nets, max-pooling (2011)
• Nesterov’s Gradient Descent (2013)
Currently state of the art but...
• No way of doing logical inference (extrapolation)
• No easy integration of abstract knowledge
• Hypothesis space bias might not conform with reality
65
Deep Learning: Future Challenges
66
Szegedy, C., Zaremba, W., Sutskever, I., Bruna, J., Erhan, D., Goodfellow, I., Fergus, R. (2013). Intriguing properties of neural networks
L: correctly identified, Center: added noise x10, R: “Ostrich”
• cuda-convnet2 (Alex Krizhevsky, Toronto) (C++/CUDA, optimized for GTX 580)
https://code.google.com/p/cuda-convnet2/
• Caffe (Berkeley) (C++/CUDA, with Python bindings)
http://caffe.berkeleyvision.org/
• OverFeat (NYU)
http://cilvr.nyu.edu/doku.php?id=code:start
Wanna Play ?
• Theano - CPU/GPU symbolic expression compiler in Python (from the LISA lab at University of Montreal). http://deeplearning.net/software/theano/
• Pylearn2 - library designed to make machine learning research easy. http://deeplearning.net/software/pylearn2/
• Torch - Matlab-like environment for state-of-the-art machine learning algorithms in Lua (from Ronan Collobert, Clement Farabet and Koray Kavukcuoglu). http://torch.ch/
• more info: http://deeplearning.net/software_links/
Wanna Play ?
In touch!
Academic/Research (as PhD candidate, KTH/CSC):
“Always interested in discussing Machine Learning, Deep Architectures, Graphs, and Language Technology”
roelof@kth.se
www.csc.kth.se/~roelof/
Internship / Entrepreneurship (as CIO/CTO, Feeda):
“Always looking for additions to our brand new R&D team”
[Internships upcoming on KTH exjobb website…]
roelof@feeda.com
www.feeda.com
Feeda
69
We’re Hiring!
roelof@feeda.com
www.feeda.com
Feeda
• Dev Ops
• Software Developers
• Data Scientists
70
Thanks for listening
Mingling time!
71
72
Can’t get enough?
Come to my talk tomorrow (Friday)
Description on KTH website
Visual-Semantic Embeddings: some thoughts on Language
Roelof Pieters, TCS/CSC
Friday Jan 23, 13:30
Room 304, Teknikringen 14, level 3
Addendum
Some of the exciting recent developments in NLP, especially Distributed Semantics
73
Word Embeddings: Turian (2010)
Turian, J., Ratinov, L., Bengio, Y. (2010). Word representations: A simple and general method for semi-supervised learning
code & info: http://metaoptimize.com/projects/wordreprs/
74
Word Embeddings: Collobert & Weston (2011)
Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P. (2011) .
Natural Language Processing (almost) from Scratch
76
Multi-embeddings: Stanford (2012)
Eric H. Huang, Richard Socher, Christopher D. Manning, Andrew Y. Ng 

Improving Word Representations via Global Context and Multiple Word Prototypes
77
Linguistic Regularities: Mikolov (2013)
code & info: https://code.google.com/p/word2vec/
Mikolov, T., Yih, W., & Zweig, G. (2013). Linguistic Regularities in Continuous Space Word Representations
78
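The regularities can be reproduced with, e.g., gensim’s word2vec support; gensim and the pretrained GoogleNews vector file are assumptions here, not part of the slides (substitute any local word2vec-format file):

```python
from gensim.models import KeyedVectors

# Load pretrained vectors (path is a placeholder for a local file)
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True)

# The famous regularity: vec(king) - vec(man) + vec(woman) ~ vec(queen)
print(vectors.most_similar(positive=["king", "woman"],
                           negative=["man"], topn=3))
```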
Word Embeddings for MT: Mikolov (2013)
Mikolov, T., Le, Q. V., Sutskever, I. (2013). Exploiting Similarities among Languages for Machine Translation
79
Recursive Deep Models & Sentiment: Socher (2013)
Richard Socher, Alex Perelygin, Jean Wu, Jason Chuang, Chris Manning, Andrew Ng and Chris Potts. 2013. Recursive
Deep Models for Semantic Compositionality Over a Sentiment Treebank. EMNLP 2013
code & demo: http://nlp.stanford.edu/sentiment/index.html
80
Paragraph Vectors: Le & Mikolov (2014)
Le, Q., Mikolov, T. (2014). Distributed Representations of Sentences and Documents
81
• add context (sentence, paragraph, document) to word vectors during training
Results on the Stanford Sentiment Treebank dataset: [table in original slides]
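A minimal sketch of the paragraph-vector idea using gensim’s Doc2Vec (gensim 4.x API; the two-document corpus is a hypothetical stand-in, not the treebank):

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

corpus = [
    TaggedDocument(words=["deep", "learning", "for", "nlp"], tags=["doc0"]),
    TaggedDocument(words=["shallow", "parsing", "with", "rules"], tags=["doc1"]),
]

# Each tagged document gets its own vector, trained jointly with word vectors
model = Doc2Vec(corpus, vector_size=50, min_count=1, epochs=40)

print(model.dv["doc0"][:5])                  # the learned paragraph vector
print(model.infer_vector(["deep", "nlp"]))   # infer a vector for unseen text
```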
Global Vectors, GloVe: Stanford (2014)
Pennington, J., Socher, R., Manning, C.D. (2014). GloVe: Global Vectors for Word Representation
code & demo: http://nlp.stanford.edu/projects/glove/
word2vec vs GloVe: “similar accuracy” on the word analogy task
82
Dependency-based Embeddings: Levy & Goldberg (2014)
Levy, O., Goldberg, Y. (2014). Dependency-Based Word Embeddings
code & demo: https://levyomer.wordpress.com/2014/04/25/dependency-based-word-embeddings/
- Syntactic Dependency Context
Australian scientist discovers star with telescope
- Bag of Words (BoW) Context
[Plot: precision vs recall curves comparing dependency-based and BoW contexts]
“Dependency-based embeddings have more functional similarities”
83