Sparse Distributed Representations:
Our Brain’s Data Structure
Numenta Workshop
October 17, 2014
Subutai Ahmad, VP Research
sahmad@numenta.com
The Role of Sparse Distributed Representations in Cortex
1) Sensory perception
2) Planning
3) Motor control
4) Prediction
5) Attention
Sparse Distributed Representations (SDRs) are the foundation for all these
functions, across all sensory modalities.
Analysis of this common cortical data structure can provide a rigorous
foundation for cortical computing.
Talk Outline
1) Introduction to Sparse Distributed Representations (SDRs)
2) Fundamental properties of SDRs
– Error bounds
– Scaling laws
(Image: Prof. Hasan, Max-Planck-Institut for Research)
Basic Attributes of SDRs
1) Only a small number of neurons are firing at any point in time
2) There are a very large number of neurons
3) Every cell represents something and has meaning
4) Information is distributed and no single neuron is critical
5) Every neuron only connects to a subset of other neurons
6) SDRs enable extremely fast computation
7) SDRs are binary
x = 0100000000000000000100000000000110000000
(Figure: multiple input SDRs feed a single bit in an output SDR)
How Does a Single Neuron Operate on SDRs?
• Proximal segments represent dozens of separate patterns in a single segment
• Hundreds of distal segments each detect a unique SDR using a threshold
(Figure: a neuron receiving a feedback SDR, a context SDR, and a bottom-up input SDR)
In both cases each synapse corresponds to one bit in the incoming high-dimensional SDR.
Fundamental Properties of SDRs
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent a dynamic set of patterns in a single fixed structure
• Extremely efficient
Notation
• We represent an SDR as a vector of n binary values, where each bit
represents the activity of a single neuron:
  x = [b_0, …, b_{n-1}]
• s = percentage of ON bits, w = number of ON bits:
  w_x = s × n = ‖x‖₁
Example
• n = 40, s = 0.1, w = 4
  x = 0100000000000000000100000000000110000000
  y = 1000000000000000000100000000000110000000
• Typical range of numbers in HTM implementations:
  n = 2048 to 65,536; s = 0.05% to 2%; w = 40
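As a minimal sketch of this notation (illustrative helper names, not NuPIC's actual API), an SDR can be stored compactly as the indices of its w ON bits and expanded to its n-bit binary form when needed:

```python
import random

def make_sdr(n, w, rng=random):
    """A random SDR in index form: the sorted positions of its w ON bits."""
    return sorted(rng.sample(range(n), w))

def to_bit_string(sdr, n):
    """Expand an index-form SDR into its full n-bit binary string."""
    on = set(sdr)
    return "".join("1" if i in on else "0" for i in range(n))

x = make_sdr(n=40, w=4)
bits = to_bit_string(x, n=40)
assert len(bits) == 40 and bits.count("1") == 4   # n total bits, w of them ON

# The example vector x above has ON bits at indices 1, 19, 31, 32:
assert to_bit_string([1, 19, 31, 32], 40) == \
    "0100000000000000000100000000000110000000"
```

Storing only the ON indices is what makes the O(w) operations described later possible: all computation touches w entries, never all n.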
SDRs Have Extremely High Capacity
• The number of unique patterns that can be represented is:
  C(n, w) = n! / (w! (n − w)!)
• This is far smaller than 2^n, but far larger than any reasonable need
• Example: with n = 2048 and w = 40, the number of unique patterns is
  > 10^84 >> the number of atoms in the universe
• The chance that two random vectors are identical, 1 / C(n, w), is
  essentially zero
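The capacity claim above can be verified in a couple of lines (a sketch using the standard library's binomial coefficient):

```python
import math

# Capacity of an SDR space: the number of distinct vectors with w of n bits ON.
n, w = 2048, 40
capacity = math.comb(n, w)

assert capacity > 10**84     # vastly more than the ~10^80 atoms in the universe
assert capacity < 2**n       # but far smaller than all 2^n binary vectors
assert 1 / capacity < 1e-84  # chance two random SDRs are identical
```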
Similarity Metric for Recognition of SDR Patterns
• We don’t use typical vector similarities
  – Neurons cannot compute Euclidean or Hamming distance between SDRs
  – Any p-norm requires full connectivity
• Compute similarity using an overlap metric
  – The overlap is simply the number of bits in common
  – Requires only minimal connectivity
  – Mathematically, take the AND of two vectors and compute its length:
    overlap(x, y) ≡ ‖x ∧ y‖₁
• Detecting a “Match”
  – Two SDR vectors “match” if their overlap meets a minimum threshold θ:
    match(x, y) ≡ overlap(x, y) ≥ θ
Overlap Example
• n = 40, s = 0.1, w = 4
  x = 0100000000000000000100000000000110000000
  y = 1000000000000000000100000000000110000000
• The two vectors have an overlap of 3, so they “match” whenever the
  threshold θ ≤ 3.
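In index form, the overlap and match operations above reduce to a set intersection (illustrative function names; a sketch, not the NuPIC implementation):

```python
def overlap(x, y):
    """Number of ON bits in common: the length of x AND y."""
    return len(set(x) & set(y))

def match(x, y, theta):
    """Two SDRs match when their overlap meets the threshold theta."""
    return overlap(x, y) >= theta

# The two example vectors, as ON-bit index sets:
x = {1, 19, 31, 32}   # 0100000000000000000100000000000110000000
y = {0, 19, 31, 32}   # 1000000000000000000100000000000110000000

assert overlap(x, y) == 3
assert match(x, y, theta=3)        # matches at threshold 3
assert not match(x, y, theta=4)    # but not at threshold 4
```

Note that the cost is O(w), independent of n, which is the efficiency property discussed at the end of the talk.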
How Accurate Is Matching With Noise?
• As you decrease the match threshold θ, you decrease sensitivity and
  increase robustness to noise
• You also increase the chance of false positives
How Many Vectors Match When You Decrease the Threshold?
• Define the “overlap set of x” to be the set of vectors with exactly b bits
  of overlap with x
• The number of such vectors is:
  Ω_x(n, w, b) = C(w_x, b) × C(n − w_x, w − b)
  where C(w_x, b) counts the subsets of x with exactly b bits ON, and
  C(n − w_x, w − b) counts the patterns occupying the rest of the vector
  with exactly w − b bits ON
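The counting formula above can be checked directly: summing Ω_x over every possible overlap b must recover all C(n, w) vectors (a sketch; the function name is ours):

```python
import math

def overlap_set_size(n, w, b, wx=None):
    """Omega_x(n, w, b): number of w-bit SDRs with exactly b bits of overlap
    with a fixed SDR x that has wx ON bits (wx = w by default)."""
    wx = w if wx is None else wx
    return math.comb(wx, b) * math.comb(n - wx, w - b)

# Sanity check (Vandermonde's identity): every w-bit vector has some overlap
# b in 0..w with x, so the overlap sets partition the whole space.
n, w = 40, 4
assert sum(overlap_set_size(n, w, b) for b in range(w + 1)) == math.comb(n, w)
```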
Error Bound for Classification with Noise
• Given a single stored pattern, the probability of a false positive is:
  fp_w^n(θ) = [ Σ_{b=θ}^{w} Ω_x(n, w, b) ] / C(n, w)
• Given M stored patterns x_0, …, x_{M−1}, the probability of a false
  positive is bounded by:
  fp_X(θ) ≤ Σ_{i=0}^{M−1} fp_{w_{x_i}}^n(θ)
What Does This Mean in Practice?
• With SDRs you can classify a huge number of patterns under substantial
  noise (if n and w are large enough)
Examples
• n = 2048, w = 40
  – With up to 14 bits of noise (33%), you can classify a quadrillion
    patterns with an error rate of less than 10^-24
  – With up to 20 bits of noise (50%), you can classify a quadrillion
    patterns with an error rate of less than 10^-11
• n = 64, w = 12
  – With up to 4 bits of noise (33%), you can classify 10 patterns
    with an error rate of 0.04%
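The examples above can be reproduced from the false-positive formula. Tolerating k bits of noise means lowering the threshold to θ = w − k (a sketch; the function name is ours):

```python
import math

def false_positive_rate(n, w, theta):
    """fp_w^n(theta): probability a random w-bit SDR reaches overlap >= theta
    with one stored w-bit pattern."""
    matching = sum(math.comb(w, b) * math.comb(n - w, w - b)
                   for b in range(theta, w + 1))
    return matching / math.comb(n, w)

# n = 64, w = 12, up to 4 bits of noise => theta = 8. With M = 10 stored
# patterns the union bound gives M * fp, which comes out near 0.04%.
fp_small = false_positive_rate(64, 12, theta=8)
assert 3e-4 < 10 * fp_small < 5e-4

# n = 2048, w = 40, up to 14 bits of noise => theta = 26: the per-pattern
# rate is so tiny that even 10^15 stored patterns stay far below 10^-24.
assert false_positive_rate(2048, 40, theta=26) < 1e-30
```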
Neurons Are Highly Robust Pattern Recognizers
Hundreds of distal segments each detect a
unique SDR using a threshold
You can have tens of thousands of neurons examining a single input SDR, and very
robustly matching complex patterns
SDRs Are Robust to Random Deletions
• In cortex, bits in an SDR can randomly disappear
  – Synapses can be quite unreliable
  – Individual neurons can die
  – A patch of cortex can be damaged
• The analysis for random deletions is very similar to that for noise
• SDRs can naturally handle fairly significant random failures
  – Failures are tolerated in any SDR and in any part of the system
• This is a great property for those building HTM-based hardware
  – The probability of failures can be exactly characterized
Representing Multiple Patterns in a Single SDR
• There are situations where we want to store multiple patterns within a
  single SDR and match against them
• Example: in temporal inference the system might make multiple predictions
  about the future
Unions of SDRs
• We can store a set of patterns in a single fixed representation by taking
  the OR of all the individual patterns
• The vector representing the union will also match a large number of other
  patterns that were not among the original 10
• How many such patterns can we store reliably, without a high chance of
  false positives?
(Figure: ten SDRs, each 2% sparse, are OR’d into a single union SDR that is
< 20% dense; a new SDR is then tested for membership.)
Error Bounds for Unions
• Expected number of ON bits after taking the union of M patterns, each with
  density s:
  E[w_union] = n · (1 − (1 − s)^M)
• Given a union of M patterns, the expected probability of a false positive
  (a random SDR whose w bits all fall on ON bits of the union) is
  approximately:
  p_fp ≈ (1 − (1 − s)^M)^w
What Does This Mean in Practice?
• You can form reliable unions of a reasonable number of patterns (assuming
  large enough n and w)
Examples
• n = 2048, w = 40: the union of 50 patterns leads to an error rate of
  about 10^-9
• n = 512, w = 10: the union of 50 patterns leads to an error rate of
  about 0.9%
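Both union examples follow from the approximation above, which assumes bits become ON independently (a sketch; the function name is ours):

```python
def union_false_positive(n, w, m):
    """Approximate P(a random w-bit SDR fully matches a union of m patterns),
    assuming bits of the union are ON independently."""
    p_on = 1 - (1 - w / n) ** m   # expected density of the union SDR
    return p_on ** w              # all w bits of a new SDR land on ON bits

# n = 2048, w = 40, union of 50 patterns: error rate on the order of 10^-9.
assert union_false_positive(2048, 40, 50) < 1e-8

# n = 512, w = 10, union of 50 patterns: error rate around 0.9%.
assert 0.005 < union_false_positive(512, 10, 50) < 0.015
```

The same computation shows why large n matters: both cases have the same union density (~63% ON), but raising it to the 40th power instead of the 10th shrinks the error rate by many orders of magnitude.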
SDRs Enable Highly Efficient Operations
• In cortex, complex operations are carried out rapidly
  – The visual system can perform object recognition in 100–150 ms
• SDR vectors are large, but all operations are O(w) and independent of
  vector size
  – No loops or optimization process required
• Matching a pattern against a dynamic list (a union) is O(w) and
  independent of the number of items in the list
• This enables a tiny dendritic segment to perform robust pattern recognition
• We can simulate 200,000 neurons in software at about 25–50 Hz
Summary
• SDRs are the common data structure in the cortex
• SDRs enable flexible recognition systems that have very high capacity and
  are robust to large amounts of noise
• The union property allows a fixed representation to encode a dynamically
  changing set of patterns
• The analysis of SDRs provides a principled foundation for characterizing
  the behavior of the HTM learning algorithms and all cognitive functions
Related Work
• Sparse memory (Kanerva), sparse coding (Olshausen), Bloom filters (Broder)
Questions? Math jokes?
Follow us on Twitter @numenta
Sign up for our newsletter at www.numenta.com
Subutai Ahmad
sahmad@numenta.com
nupic-theory mailing list
numenta.org/lists

Brains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth RamaswamyBrains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth Ramaswamy
Brains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth Ramaswamy
 
Brains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas Miconi
Brains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas MiconiBrains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas Miconi
Brains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas Miconi
 
Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...
Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...
Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...
 
Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...
Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...
Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...
 
Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...
Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...
Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...
 
SBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence Spracklen
SBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence SpracklenSBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence Spracklen
SBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence Spracklen
 
FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...
FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...
FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...
 
OpenAI’s GPT 3 Language Model - guest Steve Omohundro
OpenAI’s GPT 3 Language Model - guest Steve OmohundroOpenAI’s GPT 3 Language Model - guest Steve Omohundro
OpenAI’s GPT 3 Language Model - guest Steve Omohundro
 
CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...
CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...
CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...
 
The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...
The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...
The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...
 
The Biological Path Toward Strong AI by Matt Taylor (05/17/18)
The Biological Path Toward Strong AI by Matt Taylor (05/17/18)The Biological Path Toward Strong AI by Matt Taylor (05/17/18)
The Biological Path Toward Strong AI by Matt Taylor (05/17/18)
 
Biological path toward strong AI
Biological path toward strong AIBiological path toward strong AI
Biological path toward strong AI
 

Kürzlich hochgeladen

Data Factory in Microsoft Fabric (MsBIP #82)
Data Factory in Microsoft Fabric (MsBIP #82)Data Factory in Microsoft Fabric (MsBIP #82)
Data Factory in Microsoft Fabric (MsBIP #82)Cathrine Wilhelmsen
 
Identifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanIdentifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanMYRABACSAFRA2
 
Learn How Data Science Changes Our World
Learn How Data Science Changes Our WorldLearn How Data Science Changes Our World
Learn How Data Science Changes Our WorldEduminds Learning
 
Predictive Analysis for Loan Default Presentation : Data Analysis Project PPT
Predictive Analysis for Loan Default  Presentation : Data Analysis Project PPTPredictive Analysis for Loan Default  Presentation : Data Analysis Project PPT
Predictive Analysis for Loan Default Presentation : Data Analysis Project PPTBoston Institute of Analytics
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 217djon017
 
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhThiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhYasamin16
 
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...Boston Institute of Analytics
 
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBoston Institute of Analytics
 
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxmodul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxaleedritatuxx
 
专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改
专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改
专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改yuu sss
 
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdfEnglish-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdfblazblazml
 
Cyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded dataCyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded dataTecnoIncentive
 
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024Susanna-Assunta Sansone
 
Decoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectDecoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectBoston Institute of Analytics
 
SMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptxSMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptxHaritikaChhatwal1
 
Conf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming PipelinesConf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming PipelinesTimothy Spann
 
Top 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In QueensTop 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In Queensdataanalyticsqueen03
 
办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree
办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree
办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degreeyuu sss
 
RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.natarajan8993
 

Kürzlich hochgeladen (20)

Data Factory in Microsoft Fabric (MsBIP #82)
Data Factory in Microsoft Fabric (MsBIP #82)Data Factory in Microsoft Fabric (MsBIP #82)
Data Factory in Microsoft Fabric (MsBIP #82)
 
Identifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanIdentifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population Mean
 
Learn How Data Science Changes Our World
Learn How Data Science Changes Our WorldLearn How Data Science Changes Our World
Learn How Data Science Changes Our World
 
Predictive Analysis for Loan Default Presentation : Data Analysis Project PPT
Predictive Analysis for Loan Default  Presentation : Data Analysis Project PPTPredictive Analysis for Loan Default  Presentation : Data Analysis Project PPT
Predictive Analysis for Loan Default Presentation : Data Analysis Project PPT
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2
 
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhThiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
 
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
 
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
 
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxmodul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
 
专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改
专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改
专业一比一美国俄亥俄大学毕业证成绩单pdf电子版制作修改
 
Insurance Churn Prediction Data Analysis Project
Insurance Churn Prediction Data Analysis ProjectInsurance Churn Prediction Data Analysis Project
Insurance Churn Prediction Data Analysis Project
 
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdfEnglish-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
English-8-Q4-W3-Synthesizing-Essential-Information-From-Various-Sources-1.pdf
 
Cyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded dataCyber awareness ppt on the recorded data
Cyber awareness ppt on the recorded data
 
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
FAIR, FAIRsharing, FAIR Cookbook and ELIXIR - Sansone SA - Boston 2024
 
Decoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectDecoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis Project
 
SMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptxSMOTE and K-Fold Cross Validation-Presentation.pptx
SMOTE and K-Fold Cross Validation-Presentation.pptx
 
Conf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming PipelinesConf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
Conf42-LLM_Adding Generative AI to Real-Time Streaming Pipelines
 
Top 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In QueensTop 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In Queens
 
办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree
办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree
办美国阿肯色大学小石城分校毕业证成绩单pdf电子版制作修改#真实留信入库#永久存档#真实可查#diploma#degree
 
RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.
 

Sparse Distributed Representations: Our Brain's Data Structure

  • 1. Sparse Distributed Representations: Our Brain’s Data Structure Numenta Workshop October 17, 2014 Subutai Ahmad, VP Research sahmad@numenta.com
  • 5. The Role of Sparse Distributed Representations in Cortex 1) Sensory perception 2) Planning 3) Motor control 4) Prediction 5) Attention Sparse Distributed Representations (SDRs) are the foundation for all these functions, across all sensory modalities. Analysis of this common cortical data structure can provide a rigorous foundation for cortical computing.
  • 6. Talk Outline 1) Introduction to Sparse Distributed Representations (SDRs) 2) Fundamental properties of SDRs – Error bounds – Scaling laws
  • 7. From: Prof. Hasan, Max-Planck-Institut for Research
  • 8. Basic Attributes of SDRs 1) Only a small number of neurons are firing at any point in time 2) There are a very large number of neurons 3) Every cell represents something and has meaning 4) Information is distributed and no single neuron is critical 5) Every neuron only connects to a subset of other neurons 6) SDRs enable extremely fast computation 7) SDRs are binary x = 0100000000000000000100000000000110000000
  • 9. How Does a Single Neuron Operate on SDRs? Multiple input SDRs feed a single bit in an output SDR
  • 10. How Does a Single Neuron Operate on SDRs? Proximal segments represent dozens of separate patterns in a single segment. Hundreds of distal segments each detect a unique SDR using a threshold (feedback SDR, context SDR, bottom-up input SDR). In both cases each synapse corresponds to one bit in the incoming high-dimensional SDR.
  • 11. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent a dynamic set of patterns in a single fixed structure • Extremely efficient
  • 12. Notation • We represent an SDR as a vector of n binary values, where each bit represents the activity of a single neuron: x = [b0, …, b(n−1)] • s = percent of ON bits, w = number of ON bits: wx = s × n = ||x||_1 Example • n = 40, s = 0.1, w = 4: x = 0100000000000000000100000000000110000000 y = 1000000000000000000100000000000110000000 • Typical range of numbers in HTM implementations: n = 2048 to 65,536, s = 0.05% to 2%, w = 40
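The notation above can be checked with a small sketch (NumPy assumed; the dense array layout is purely illustrative, not NuPIC's actual data structure):

```python
import numpy as np

n, w = 40, 4          # total bits and number of ON bits
s = w / n             # sparsity: fraction of ON bits (0.1 here)

# Build one SDR: a dense binary vector with exactly w randomly placed ON bits.
rng = np.random.default_rng(42)
x = np.zeros(n, dtype=np.int8)
x[rng.choice(n, size=w, replace=False)] = 1

assert int(x.sum()) == w      # w_x equals the L1 norm ||x||_1
assert s * n == w             # w = s * n
```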
  • 13. SDRs Have Extremely High Capacity • The number of unique patterns that can be represented is: C(n, w) = n! / (w! (n − w)!) • This is far smaller than 2^n, but far larger than any reasonable need • Example: with n = 2048 and w = 40, the number of unique patterns is > 10^84 >> # atoms in the universe • The chance that two random vectors are identical is essentially zero: 1 / C(n, w)
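The capacity claim is easy to verify directly with exact integer arithmetic (Python's `math.comb` computes the binomial coefficient):

```python
import math

n, w = 2048, 40
capacity = math.comb(n, w)    # C(n, w) = n! / (w! (n - w)!)

assert capacity > 10**84      # far more than the ~10^80 atoms in the universe
assert capacity < 2**n        # but far smaller than all 2^n binary vectors
assert 1 / capacity < 1e-84   # chance two random SDRs are identical
```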
  • 14. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 15. Similarity Metric for Recognition of SDR Patterns • We don’t use typical vector similarities – Neurons cannot compute Euclidean or Hamming distance between SDRs – Any p-norm requires full connectivity • Compute similarity using an overlap metric – The overlap is simply the number of bits in common – Requires only minimal connectivity – Mathematically, take the AND of the two vectors and compute its length: overlap(x, y) ≡ ||x ∧ y||_1 • Detecting a “Match” – Two SDR vectors “match” if their overlap meets a minimum threshold θ: match(x, y) ≡ overlap(x, y) ≥ θ
  • 16. Overlap example • n = 40, s = 0.1, w = 4 • The two vectors have an overlap of 3, so they “match” if the threshold is 3. y = 1000000000000000000100000000000110000000 x = 0100000000000000000100000000000110000000
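A minimal sketch of the overlap metric, using the two vectors from this slide (the function names are mine, for illustration):

```python
import numpy as np

def overlap(x, y):
    """Number of ON bits shared by two binary vectors: |x AND y|."""
    return int(np.sum(x & y))

def match(x, y, theta):
    """Two SDRs match if their overlap meets the threshold theta."""
    return overlap(x, y) >= theta

x = np.array([int(c) for c in "0100000000000000000100000000000110000000"])
y = np.array([int(c) for c in "1000000000000000000100000000000110000000"])

assert overlap(x, y) == 3        # bits 19, 31, 32 are shared
assert match(x, y, theta=3)      # matches at threshold 3...
assert not match(x, y, theta=4)  # ...but not at threshold 4
```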
  • 17. How Accurate is Matching With Noise? • As you decrease the match threshold θ, you decrease sensitivity and increase robustness to noise • You also increase the chance of false positives
  • 18. How Many Vectors Match When You Decrease the Threshold? • Define the “overlap set of x” to be the set of vectors with exactly b bits of overlap with x • The number of such vectors is: Ωx(n, w, b) = C(wx, b) × C(n − wx, w − b) – C(wx, b): number of subsets of x with exactly b bits ON – C(n − wx, w − b): number of patterns occupying the rest of the vector with exactly w − b bits ON
  • 19. Error Bound for Classification with Noise • Given a single stored pattern, the probability of a false positive is: fp(n, w, θ) = Σ (b = θ to w) Ωx(n, w, b) / C(n, w) • Given M stored patterns, the probability of a false positive is bounded by: fpX(θ) ≤ Σ (i = 0 to M−1) fp(n, wxi, θ)
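These bounds can be evaluated exactly with big-integer arithmetic; a sketch (function names are mine):

```python
import math

def overlap_set_size(n, w, b):
    """Omega_x(n, w, b): # of w-bit vectors with exactly b bits of overlap."""
    return math.comb(w, b) * math.comb(n - w, w - b)

def fp_rate(n, w, theta):
    """False-positive probability for one stored pattern at threshold theta."""
    hits = sum(overlap_set_size(n, w, b) for b in range(theta, w + 1))
    return hits / math.comb(n, w)

# Sanity checks: the overlap sets partition all C(n, w) vectors, a threshold
# of 0 matches everything, and realistic thresholds match almost nothing.
n, w = 2048, 40
assert sum(overlap_set_size(n, w, b) for b in range(w + 1)) == math.comb(n, w)
assert fp_rate(n, w, 0) == 1.0
assert fp_rate(n, w, 26) < 1e-30   # theta = 26 tolerates 14 bits of noise
```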
  • 20. What Does This Mean in Practice? • With SDRs you can classify a huge number of patterns with substantial noise (if n and w are large enough) Examples • n = 2048, w = 40: with up to 14 bits of noise (33%), you can classify a quadrillion patterns with an error rate of less than 10^-24; with up to 20 bits of noise (50%), you can classify a quadrillion patterns with an error rate of less than 10^-11 • n = 64, w = 12: with up to 4 bits of noise (33%), you can classify 10 patterns with an error rate of 0.04%
  • 21. Neurons Are Highly Robust Pattern Recognizers Hundreds of distal segments each detect a unique SDR using a threshold. You can have tens of thousands of neurons examining a single input SDR, and very robustly matching complex patterns.
  • 22. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 23. SDRs are Robust to Random Deletions • In cortex, bits in an SDR can randomly disappear – Synapses can be quite unreliable – Individual neurons can die – A patch of cortex can be damaged • The analysis for random deletions is very similar to noise • SDRs can naturally handle fairly significant random failures – Failures are tolerated in any SDR and in any part of the system • This is a great property for those building HTM-based hardware – The probability of failures can be exactly characterized
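A quick simulation of the deletion scenario (illustrative only, assuming NumPy; the seed and parameters are mine):

```python
import numpy as np

n, w, theta = 2048, 40, 20
rng = np.random.default_rng(7)

x = np.zeros(n, dtype=bool)
x[rng.choice(n, size=w, replace=False)] = True

# Randomly delete half the ON bits (failed synapses, dead cells, damage).
damaged = x.copy()
on_bits = np.flatnonzero(x)
damaged[rng.choice(on_bits, size=w // 2, replace=False)] = False

# The damaged SDR still matches the original at threshold theta = w/2.
overlap = int(np.sum(x & damaged))
assert overlap == w - w // 2   # exactly 20 ON bits survive
assert overlap >= theta
```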
  • 24. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 25. Representing Multiple Patterns in a Single SDR • There are situations where we want to store multiple patterns within a single SDR and match them • In temporal inference the system might make multiple predictions about the future Example
  • 26. Unions of SDRs • We can store a set of patterns in a single fixed representation by taking the OR of all the individual patterns • The vector representing the union is also going to match a large number of other patterns that were not one of the original 10 • How many such patterns can we store reliably, without a high chance of false positives? (Figure: the union of 10 SDRs, each 2% sparse, is still less than 20% ON; “Is this SDR a member?”)
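The union operation is just an elementwise OR; this sketch mirrors the slide's 10-pattern example (n = 2048, w = 40, so each pattern is ~2% sparse):

```python
import numpy as np

n, w, M = 2048, 40, 10
rng = np.random.default_rng(0)

patterns = []
for _ in range(M):
    p = np.zeros(n, dtype=bool)
    p[rng.choice(n, size=w, replace=False)] = True
    patterns.append(p)

union = np.logical_or.reduce(patterns)  # one fixed-size vector stores all M

# Membership test: every stored pattern's ON bits are contained in the union.
assert all(bool(np.all(union[p])) for p in patterns)

# The union of 10 patterns at ~2% sparsity stays under 20% ON bits.
assert union.sum() / n < 0.20
```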
  • 27. Error Bounds for Unions • Expected number of ON bits in a union of M patterns: E[w_union] = n(1 − (1 − s)^M) • Given a union of M patterns, the expected probability of a false positive (with noise) follows by applying the same overlap-set analysis to the union’s expected ON-bit count
  • 28. What Does This Mean in Practice? • You can form reliable unions of a reasonable number of patterns (assuming large enough n and w) Examples • n = 2048, w = 40: the union of 50 patterns leads to an error rate of 10^-9 • n = 512, w = 10: the union of 50 patterns leads to an error rate of 0.9%
  • 29. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 30. SDRs Enable Highly Efficient Operations • In cortex, complex operations are carried out rapidly – the visual system can perform object recognition in 100–150 msec • SDR vectors are large, but all operations are O(w) and independent of vector size – no loops or optimization process required • Matching a pattern against a dynamic list (unions) is O(w) and independent of the number of items in the list • Enables a tiny dendritic segment to perform robust pattern recognition • We can simulate 200,000 neurons in software at about 25–50 Hz
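The O(w) claim follows from storing only the indices of the ON bits rather than the full vector; a sketch (hypothetical helper name, not NuPIC's actual API):

```python
def overlap_sparse(x_on, y_on):
    """Overlap of two SDRs stored as sets of ON-bit indices.

    Work is O(w) in the number of ON bits and independent of the full
    vector size n (which could be 2048, 65536, ...).
    """
    return len(x_on & y_on)

# The two n = 40 example vectors, reduced to their 4 ON-bit indices each:
x_on = {1, 19, 31, 32}
y_on = {0, 19, 31, 32}

assert overlap_sparse(x_on, y_on) == 3
```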
  • 31. Summary • SDRs are the common data structure in the cortex • SDRs enable flexible recognition systems that have very high capacity and are robust to a large amount of noise • The union property allows a fixed representation to encode a dynamically changing set of patterns • The analysis of SDRs provides a principled foundation for characterizing the behavior of the HTM learning algorithms and all cognitive functions • Related work: sparse memory (Kanerva), sparse coding (Olshausen), Bloom filters (Broder)
  • 32. Questions? Math jokes? Follow us on Twitter @numenta Sign up for our newsletter at www.numenta.com Subutai Ahmad sahmad@numenta.com nupic-theory mailing list numenta.org/lists