A Neuron-Based Artificial Brain Simulator Implemented in Python
Seonghyun Kim
My dream is to develop AGI (artificial general intelligence) by combining the learning principles of the human brain with the latest deep learning technology.
✉ bananaband657@gmail.com
🏠 https://banana-media-lab.tistory.com
https://github.com/MrBananaHuman
01 Introduction
- Introduction to neuroscience
02 Spiking Neural Network (SNN)
- SNN as a neuromorphic neural network model
03 Modeling of SNN
- Python Nengo library for SNN modeling
04 Applications of SNN
- Deep SNN models
05 Future of SNN
- Neuromorphic chip
Introduction
Introduction to neuroscience
01
Introduction
The brain is the most complex 1.5 kg organ that controls all functions of the body, interprets information from the outside
world, and embodies the essence of the mind and soul.
Thoughts, perceptions, language, sensations, memories, actions, emotions, learning
History of Neuroscience - Neuron
Neuroscience is the study of how the nervous system develops, its structure, and what it does.
The first drawing of a neuron as the nerve cell (1865) [1]; the first illustrations of a synapse (1893, 1897) [2-3]
[1] Otto Friedrich Karl Deiters, 1865
[2] Sherrington CS, 1897, A textbook of physiology, London: Macmillan, p.1024-70
[3] Cajal R, 1893, Arch Anat Physiol Anat Abth., V & VI:310-428
History of Neuroscience - Neuron
A typical neuron consists of a cell body (soma), dendrites, and a single axon.
[1] https://ib.bioninja.com.au/standard-level/topic-6-human-physiology/65-neurons-and-synapses/neurons.html
Figure: dendrites, nucleus, soma (cell body), axon, myelin sheath, axon terminals, and synapses [1]
History of Neuroscience – Action Potential
An action potential is a rapid rise and subsequent fall in voltage or membrane potential across a cellular membrane with a
characteristic pattern.
[3]
[1] [2]
[1] How big is the GIANT Squid Giant Axon?, @TheCellularScale
[2] Hodgkin AL & Huxley AF, 1945, J Physiol
[3] https://www.moleculardevices.com/applications/patch-clamp-electrophysiology/what-action-potential#gref
History of Neuroscience - Synapse
Synapses are biological junctions through which neurons pass signals to one another.
[1] https://synapseweb.clm.utexas.edu/type-1-synapse
[2] Besson, P., 2017, doctoral dissertation
Figure: presynaptic neuron → synapse → postsynaptic neuron [1]; excitatory postsynaptic potential (EPSP) vs. inhibitory postsynaptic potential (IPSP) [2]
History of Neuroscience - Synaptic Plasticity
Synaptic plasticity refers to the phenomenon whereby the strength of synaptic connections between neurons changes over time.
[1] Larrabee MG & Bronk DW, 1947, J Neurophysiol.
Figure: presynaptic and postsynaptic neurons; action potentials recorded from the postganglionic nerve before and after stimulation (1947) [1]
History of Neuroscience - The Brain
Neuron: 86 billion, 10–25 μm, > 1,000 types
Synapse: ~7,000 synapses per neuron, 100–500 trillion in total
Plasticity: potentiation and depression
[1] https://ib.bioninja.com.au/standard-level/topic-6-human-physiology/65-neurons-and-synapses/neurons.html
[2] https://commons.wikimedia.org/w/index.php?curid=41349083
[3] https://sites.google.com/site/mcauliffeneur493/home/synaptic-plasticity
Artificial Neural Network (ANN) Revolution
[1]
An ANN is an abstract model that mimics the complex structure and functioning of the brain, and it has been developing explosively in
recent years.
[1] A brief history of neural nets and deep learning by A. Kurenkov
Limitation of ANN
Despite the success of the ANN algorithm, it has clear limitations.
Computational limitations
[1] Whittington and Bogacz, 2019, Trends in Cognitive Sciences
[2] Grossberg, 1987, Cognitive Science
[3] Lillicrap et al., 2020, Nature Review Neuroscience
• Lack of local error representation → Vanishing gradient [1]
• Symmetry of forward and backward weights → Weight transport problem [2]
• Feedback in brains alters neural activity [3]
• Unrealistic models of neurons → Large computational cost [1]
• Error signals are signed and potentially extreme-valued → Overfitting [3]
How Does The Brain Learn?
[1]
[1] Brainbow Hippocampus, Greg Dunn and Brian Edwards, 2014
[2] https://blogs.cardiff.ac.uk/acerringtonlab/ca1-pyramidal-neuron-red-hot/
[2]
Spiking Neural Network (SNN)
SNN as a neuromorphic neural network model
02
Overview
SNNs operate using spikes, which are discrete events that take place at points in time, rather than continuous values.
[1] Anwani and Rajendran, 2015, IJCNN
[1]
Components
• Spiking neuron model
• Synapse
• Synaptic plasticity
Spiking Neuron Model - Leaky Integrate-and-Fire (LIF) Model
A spiking neuron model is a mathematical description of the properties of certain cells in the nervous system that generate
sharp electrical potentials across their cell membrane, roughly one millisecond in duration.
[1] Teka, W. et al., 2014, PLoS Comput Biol.
[Appendix 1] https://www.youtube.com/watch?v=2_MIjvwWsrg
[Appendix 2] https://www.youtube.com/watch?v=KXnHxZdn8NU
Characteristics
• Subthreshold leaky-integrator dynamics
• A firing threshold
• Reset mechanism
Resistor-Capacitor (RC) circuit [1]
Spiking Neuron Model - Leaky Integrate-and-Fire (LIF) Model
A spiking neuron model is a mathematical description of the properties of certain cells in the nervous system that generate
sharp electrical potentials across their cell membrane, roughly one millisecond in duration.
Characteristics
• Subthreshold leaky-integrator dynamics
• A firing threshold
• Reset mechanism
[1] Louis Lapicque, 1907, Journal de Physiologie et de Pathologie Générale.
[Appendix 1] https://www.youtube.com/watch?v=2_MIjvwWsrg
[Appendix 2] https://www.youtube.com/watch?v=KXnHxZdn8NU
Leaky Integrate-and-Fire model [1]
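To make the LIF dynamics above concrete, here is a minimal NumPy sketch (not from the talk; the membrane time constant, threshold, and input current are illustrative values): the membrane potential leaks toward rest, integrates the input current, and is reset after crossing the firing threshold.

import numpy as np

# Illustrative parameters (assumptions, not from the slides)
tau_m = 20e-3    # membrane time constant (s)
R = 1.0          # membrane resistance
v_th = 1.0       # firing threshold
v_reset = 0.0    # reset potential
dt = 1e-4        # simulation step (s)

def simulate_lif(i_input, t_end=0.5):
    """Integrate dv/dt = (-v + R * i_input) / tau_m; fire and reset at threshold."""
    v, trace, spike_times = 0.0, [], []
    for step in range(int(t_end / dt)):
        v += dt * (-v + R * i_input) / tau_m   # subthreshold leaky integration
        if v >= v_th:                          # firing threshold
            spike_times.append(step * dt)
            v = v_reset                        # reset mechanism
        trace.append(v)
    return np.array(trace), spike_times

trace, spike_times = simulate_lif(i_input=1.5)
print(f"{len(spike_times)} spikes in 0.5 s")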
Synapse Model
The synapse model acts as the input current stimulus that drives the spiking neuron model.
[1] Dutta, S. et al., 2017, Scientific reports
[1]
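As a rough sketch of what "driving the neuron with an input current" means (illustrative constants, not the model from [1]): an exponential synapse turns the presynaptic spike train into a smooth current that would then feed the LIF neuron above.

import numpy as np

dt = 1e-4        # time step (s)
tau_syn = 5e-3   # synaptic time constant (s), illustrative
weight = 0.8     # synaptic weight, illustrative

rng = np.random.default_rng(0)
spike_train = (rng.random(5000) < 0.002).astype(float)  # ~20 Hz Poisson-like spikes

current = np.zeros_like(spike_train)
for t in range(1, len(spike_train)):
    # decay the postsynaptic current, then add a weighted jump for each spike
    current[t] = current[t - 1] * np.exp(-dt / tau_syn) + weight * spike_train[t]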
Synaptic Plasticity - Learning in the Brain
The brain learns to reduce punishment and to improve knowledge (reward(?)).
ANN: output vs. target, error/loss function, learning rate, error signal.
SNN:
• Unsupervised learning - "fire together, wire together": STDP learning, BCM learning
• Supervised learning - local error propagation: TP learning, PES learning
[1] Timothy P. Lillicrap et al., 2020, Nat Rev Neurosci.
Unsupervised Learning - Spike Timing Dependent Plasticity (STDP)
The Spike Timing Dependent Plasticity (STDP) algorithm, which has been observed in the mammalian brain, modulates the
weight of a synapse based on the relative timing of presynaptic and postsynaptic spikes. [1-3]
[1] Wang, R. et al., 2016, ISCAS
[2] Gerstner et al., 1996, Nature
[3] Bi and Poo, 1998, Journal of Neuroscience
Figure: pre/post spike timing difference Δt (ms) and the resulting STDP weight change [1-3]
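A minimal pair-based STDP sketch (the constants A_plus, A_minus, and tau are illustrative, not taken from [1-3]): the weight is potentiated when the presynaptic spike precedes the postsynaptic spike (Δt > 0) and depressed when it follows (Δt < 0), with an exponential dependence on |Δt|.

import numpy as np

A_plus, A_minus, tau = 0.01, 0.012, 20.0  # illustrative constants; tau in ms

def stdp_dw(dt_ms):
    """Weight change for a single pre/post spike pair separated by dt_ms."""
    if dt_ms > 0:
        return A_plus * np.exp(-dt_ms / tau)    # pre before post → LTP
    return -A_minus * np.exp(dt_ms / tau)       # post before pre → LTD

for dt_ms in (-40, -10, 10, 40):
    print(f"Δt = {dt_ms:+d} ms → Δw = {stdp_dw(dt_ms):+.5f}")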
Unsupervised Learning - Bienenstock, Cooper & Munro (BCM)
The BCM model proposes a sliding threshold for long-term potentiation (LTP) or long-term depression (LTD) induction, and
states that synaptic plasticity is stabilized by a dynamic adaptation of the time-averaged postsynaptic activity.
[1] Bienenstock, Cooper & Munro 1982 J Neurosci
Bienenstock, Cooper & Munro (BCM) learning [1]
Learning in visual cortex BCM model
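The sliding-threshold idea can be sketched in a few lines (a toy rate-based version with illustrative constants, not the full model of [1]): the weight change is proportional to x·y·(y − θ), and θ itself tracks the time-averaged postsynaptic activity, which is what stabilizes the plasticity.

import numpy as np

eta = 1e-3         # learning rate (illustrative)
tau_theta = 100.0  # time constant of the sliding threshold (illustrative)
w, theta = 0.5, 0.1
rng = np.random.default_rng(0)

for _ in range(1000):
    x = rng.random()                          # presynaptic rate (toy input)
    y = w * x                                 # postsynaptic rate
    w += eta * x * y * (y - theta)            # LTP above theta, LTD below
    theta += (y ** 2 - theta) / tau_theta     # threshold slides with mean y^2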
Supervised Learning - Target Propagation (TP)
Figure: output vs. target with local layer-wise errors
Hypothesis
• The essential idea behind using a stack of auto-encoders for deep learning
• This backward-propagated target induces hidden-activity targets that should have been realized by the network
• Learning proceeds by updating the forward weights to minimize these local layer-wise activity differences
Target propagation (TP) learning [1]
[1] Timothy P. Lillicrap et al., 2020, Nat Rev Neurosci.
Supervised Learning - Prescribed Error Sensitivity (PES)
[1]
[1] Timothy P. Lillicrap et al., 2020, Nat Rev Neurosci.
[2] Voelker, A. R., 2015, Centre for Theoretical Neuroscience
Prescribed Error Sensitivity (PES) learning [2]
A connection from x to y learns to output y* by minimizing |y* − y|.
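A toy version of that update, outside of Nengo (everything here, the rectified-linear "tuning curves", gains, and learning rate, is an illustrative stand-in, not the NEF implementation): the decoders d are nudged against the error e = y − y*, scaled by the neural activities a, which is exactly the |y* − y| minimization described above.

import numpy as np

kappa = 1e-6                      # learning rate (illustrative)
n_neurons = 50
rng = np.random.default_rng(0)
encoders = rng.choice([-1.0, 1.0], n_neurons)
gains = rng.uniform(50, 100, n_neurons)
d = np.zeros(n_neurons)           # decoders for a 1-D output

def activities(x):
    # toy rectified-linear tuning curves standing in for LIF firing rates
    return np.maximum(0.0, gains * encoders * x)

for _ in range(5000):
    x = rng.uniform(-1, 1)
    a = activities(x)
    y = d @ a                      # decoded estimate
    e = y - np.sin(np.pi * x)      # error against the target y* = sin(pi * x)
    d -= kappa * e * a             # PES-style update: minimize |y* - y|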
Modeling of Spiking Neural Network (SNN)
Python Nengo library for SNN modeling
03
Nengo Library
The Nengo Brain Maker is a Python package for building, testing, and deploying neural networks, based on the Neural Engineering
Framework (NEF).
[1] https://www.nengo.ai/
[1]
Nengo Tutorial
Installation
!pip install nengo

Usage
import nengo
import numpy as np

Build a network
net = nengo.Network()
with net:
    sin_input = nengo.Node(output=np.sin)
    input_neuron = nengo.Ensemble(n_neurons=4, dimensions=1)
    nengo.Connection(sin_input, input_neuron)

Diagram: Node (Sine) → input_neuron
Spiking Neuron Model
Characteristics
import matplotlib.pyplot as plt
%matplotlib inline

from nengo.dists import Choice
from nengo.utils.ensemble import tuning_curves
from nengo.utils.matplotlib import rasterplot

with nengo.Simulator(net) as sim:
    sim.run(5.0)
    # plot the LIF tuning curves of the ensemble built above
    plt.figure()
    plt.plot(*tuning_curves(input_neuron, sim))
    plt.xlabel("input value")
    plt.ylabel("firing rate")
    plt.xlim(-1, 1)
    plt.title(str(nengo.LIF()))
Neural Dynamics
Characteristics
input_neuron = nengo.Ensemble(intercepts=[-.5])
Figure: tuning curves (firing rate vs. input value) for intercepts=[-.5], [0], and [.5]
Neural Dynamics
Characteristics
input_neuron = nengo.Ensemble(intercepts=[0], encoders=[[-1]])
Figure: tuning curves (firing rate vs. input value) for intercepts=[-.5], [0], and [.5], each with encoders=[[-1]]
Neural Dynamics
Characteristics
input_neuron = nengo.Ensemble(intercepts=[0], encoders=[[-1]], max_rates=[100])
Figure: tuning curves (firing rate vs. input value) for max_rates=[10], [100], and [200]
Neural Dynamics
Characteristics
input_neuron = nengo.Ensemble(intercepts=[0], encoders=[[-1]], max_rates=[100], radius=1)
Figure: tuning curves (firing rate vs. input value) for radius=1, 2, and 10
Neural Decoding
Characteristics
with net:
    sin_input = nengo.Node(np.sin)
    input_layer = nengo.Ensemble(n_neurons=2, dimensions=1, intercepts=[-.5, -.5],
                                 encoders=[[1], [-1]], max_rates=[100, 100])
    nengo.Connection(sin_input, input_layer)
Figure: tuning curves (firing rate vs. input value) of the two-neuron ensemble
Neural Decoding
Prober
with net:
    sin_probe = nengo.Probe(sin_input)
    spikes = nengo.Probe(input_layer.neurons)
    filtered = nengo.Probe(input_layer, synapse=0.01)

# re-run the simulation so the probes record data
with nengo.Simulator(net) as sim:
    sim.run(10.0)
t = sim.trange()

# Plot the spiking output of the ensemble
plt.figure(figsize=(10, 8))
plt.subplot(2, 2, 1)
rasterplot(t, sim.data[spikes], colors=[(1, 0, 0), (0, 0, 0)])
plt.yticks((1, 2), ("On neuron", "Off neuron"))
plt.ylim(2.5, 0.5)

# Plot the decoded output of the ensemble
plt.figure()
plt.plot(t, sim.data[filtered])
plt.plot(t, sim.data[sin_probe])
plt.xlim(0, 10)
Neural Decoding
Characteristics
net = nengo.Network()
with net:
    sin_input = nengo.Node(np.sin)
    input_layer = nengo.Ensemble(n_neurons=100, dimensions=1)
    nengo.Connection(sin_input, input_layer)
Figures: tuning curves (firing rate vs. input value) of the 100 neurons, and the decoded sine wave (input value vs. time)
Image Processing
Custom input function

from urllib.request import urlretrieve
import gzip
import pickle

urlretrieve("http://deeplearning.net/data/mnist/mnist.pkl.gz", "mnist.pkl.gz")
with gzip.open("mnist.pkl.gz") as f:
    train_data, _, test_data = pickle.load(f, encoding="latin1")
train_data = list(train_data)

def image_input(t):  # feed one MNIST image to the model per second
    img = train_data[0][int(t)]
    return img

net = nengo.Network()
neuron_number = 28 * 28
with net:
    input_node = nengo.Node(image_input)
    pre_neuron = nengo.Ensemble(neuron_number, dimensions=neuron_number,
                                max_rates=[100] * neuron_number,
                                intercepts=[0] * neuron_number)
    nengo.Connection(input_node, pre_neuron)
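Driving the node with a hand-written image_input function works, but Nengo also ships a process for exactly this pattern; a possible alternative (assuming train_data[0] holds flattened 28×28 images, as above) is:

# Sketch: present each image for a fixed duration using Nengo's built-in helper
presentation_time = 0.1  # seconds per image (illustrative)
with net:
    image_node = nengo.Node(
        nengo.processes.PresentInput(train_data[0][:100], presentation_time)
    )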
Voice Processing
Custom input function

def voice_input(t):  # feed one spectrogram frame to the model per frame_size ms
    ms = int(t * 1000)
    frame_num = int(ms / frame_size)           # frame_size: ms per frame (precomputed)
    voice = transposed_norm_S[frame_num]       # transposed_norm_S: normalized spectrogram frames
    return voice

with nengo.Network() as net:
    voice_node = nengo.Node(output=voice_input)
    input_neuron = nengo.Ensemble(n_neurons=80, dimensions=1,
                                  max_rates=[100] * 80)
    nengo.Connection(voice_node, input_neuron, synapse=0.01)
    spike_probe = nengo.Probe(input_neuron)
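The snippet above assumes that frame_size and transposed_norm_S already exist. One way such a spectrogram could be prepared (a hedged sketch using librosa; the file name, sampling rate, and normalization are illustrative and may differ from the talk's actual preprocessing):

import numpy as np
import librosa

y, sr = librosa.load("speech.wav", sr=16000)       # hypothetical input file
hop_length = 160                                    # 10 ms hop at 16 kHz
frame_size = 10                                     # ms represented by each frame
S = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=80, hop_length=hop_length)
S_db = librosa.power_to_db(S, ref=np.max)
norm_S = (S_db - S_db.min()) / (S_db.max() - S_db.min())  # scale to [0, 1]
transposed_norm_S = norm_S.T                        # one 80-dimensional frame per row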
Unsupervised Learning
BCM learning
net = nengo.Network()
with net:
    sin = nengo.Node(lambda t: np.sin(t * 4))
    pre = nengo.Ensemble(100, dimensions=1)
    post = nengo.Ensemble(100, dimensions=1)
    nengo.Connection(sin, pre)
    # BCM modifies the full neuron-to-neuron weight matrix,
    # so the connection needs a weight solver
    conn = nengo.Connection(pre, post, solver=nengo.solvers.LstsqL2(weights=True))
    conn.learning_rule_type = nengo.BCM(learning_rate=5e-10)
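A possible way to watch the rule at work (a sketch; the probe choices and run length are my own, not from the slides): probe the decoded pre/post values and the evolving weight matrix, then run the simulator.

with net:
    pre_p = nengo.Probe(pre, synapse=0.01)
    post_p = nengo.Probe(post, synapse=0.01)
    weights_p = nengo.Probe(conn, "weights", synapse=0.01, sample_every=0.01)

with nengo.Simulator(net) as sim:
    sim.run(20.0)

# sim.data[weights_p] holds the weight matrix over time;
# sim.data[pre_p] / sim.data[post_p] hold the decoded pre and post values.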
Supervised Learning
Without PES learning
from nengo.processes import WhiteSignal

with net:
    noise_input = nengo.Node(WhiteSignal(60, high=5), size_out=1)
    input_layer = nengo.Ensemble(60, dimensions=1)
    output_layer = nengo.Ensemble(60, dimensions=1)
    nengo.Connection(noise_input, input_layer)
    conn = nengo.Connection(input_layer, output_layer)

with nengo.Simulator(net) as sim:
    sim.run(10.0)

Diagram: Node (Noise) → Input → Output
Supervised Learning
With PES learning
with net:
    noise_input = nengo.Node(WhiteSignal(60, high=5), size_out=1)
    input_layer = nengo.Ensemble(60, dimensions=1)
    output_layer = nengo.Ensemble(60, dimensions=1)
    nengo.Connection(noise_input, input_layer)
    conn = nengo.Connection(input_layer, output_layer)
    error_neuron = nengo.Ensemble(60, dimensions=1)
    # error = output - input (the transform=-1 connection subtracts the input)
    nengo.Connection(output_layer, error_neuron)
    nengo.Connection(input_layer, error_neuron, transform=-1)
    conn.learning_rule_type = nengo.PES()
    nengo.Connection(error_neuron, conn.learning_rule)

with nengo.Simulator(net) as sim:
    sim.run(10.0)

Diagram: Node (Noise) → Input → Output, with Error = Output − Input (transform = −1) fed back to the PES learning rule
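To verify that learning happens, one option (a sketch, not from the slides) is to probe the decoded input, output, and error before building the simulator and plot them; the output should track the input while the error decays toward zero.

with net:
    input_p = nengo.Probe(input_layer, synapse=0.01)
    output_p = nengo.Probe(output_layer, synapse=0.01)
    error_p = nengo.Probe(error_neuron, synapse=0.01)

with nengo.Simulator(net) as sim:
    sim.run(10.0)

plt.figure()
plt.plot(sim.trange(), sim.data[input_p], label="input")
plt.plot(sim.trange(), sim.data[output_p], label="output (learned)")
plt.plot(sim.trange(), sim.data[error_p], label="error")
plt.legend()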
Keras Model Converting
[1]
[1] https://towardsdatascience.com/mnist-handwritten-digits-classification-using-a-convolutional-neural-network-cnn-af5fafbc35e9
Keras Model Converting
MNIST model converting
# model, inp, dense1: the Keras model and its input/output layers from the previous slide
converter = nengo_dl.Converter(model,
                               swap_activations={tf.nn.relu: nengo.RectifiedLinear()})

with nengo_dl.Simulator(converter.net, seed=0, minibatch_size=200) as sim:
    sim.compile(
        optimizer=tf.optimizers.RMSprop(0.001),
        loss={
            converter.outputs[dense1]: tf.losses.SparseCategoricalCrossentropy(
                from_logits=True
            )
        },
        metrics={converter.outputs[dense1]: tf.metrics.sparse_categorical_accuracy},
    )
    sim.fit(
        {converter.inputs[inp]: train_images},
        {converter.outputs[dense1]: train_labels},
        epochs=2,
    )
    sim.save_params("./mnist_model")
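After training with rate neurons, the saved parameters can be loaded back into a spiking version of the converted network for inference. A sketch along the lines of the NengoDL Keras-to-SNN example (the firing-rate scaling, synapse, number of timesteps, and a test_images shape of (n, 784) are illustrative assumptions; model, inp, and dense1 again refer to the Keras model above):

run_converter = nengo_dl.Converter(
    model,
    swap_activations={tf.nn.relu: nengo.SpikingRectifiedLinear()},
    scale_firing_rates=100,   # trade accuracy against spike count (illustrative)
    synapse=0.005,            # smooth the spike trains (illustrative)
)

n_steps = 30  # present each image for 30 timesteps
tiled_images = np.tile(test_images[:100, None, :], (1, n_steps, 1))

with nengo_dl.Simulator(run_converter.net, minibatch_size=10) as sim:
    sim.load_params("./mnist_model")
    outputs = sim.predict({run_converter.inputs[inp]: tiled_images})
    # classify from the output at the last timestep
    predictions = np.argmax(outputs[run_converter.outputs[dense1]][:, -1], axis=-1)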
Keras Model Converting
Applications of SNN
Deep SNN models
04
Solving XOR Problem
It is known that the XOR problem cannot be solved by a traditional single-layer perceptron, but a Nengo-based SNN can solve the
problem with only a single layer. [1]
[2]
[1] Gidon et al., 2020, Science
[2] https://github.com/sunggukcha/xor
[3] https://www.nengo.ai/examples/
[3]
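One simple way to see this in Nengo (a sketch of the single-layer idea, not necessarily the code behind [2-3]): a single 2-D ensemble can decode a nonlinear function of its input on the outgoing connection, so XOR needs no hidden layer of the perceptron kind.

import numpy as np
import nengo

def xor_func(x):
    a, b = x
    return 1.0 if (a > 0) != (b > 0) else 0.0

with nengo.Network() as xor_net:
    # two slowly switching binary-ish inputs (illustrative stimulus)
    stim = nengo.Node(lambda t: [np.sign(np.sin(3 * t)), np.sign(np.cos(2 * t))])
    pair = nengo.Ensemble(n_neurons=200, dimensions=2, radius=1.5)
    out = nengo.Ensemble(n_neurons=50, dimensions=1)
    nengo.Connection(stim, pair)
    nengo.Connection(pair, out, function=xor_func)  # decode XOR directly
    out_p = nengo.Probe(out, synapse=0.02)

with nengo.Simulator(xor_net) as sim:
    sim.run(5.0)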
Permuted Sequential MNIST
On the Permuted Sequential MNIST data, which contains the order information for writing the digits, the Nengo SNN-based
LMU (Legendre Memory Units) achieved SOTA performance.
[2]
[1] https://github.com/edwin-de-jong/mnist-digits-stroke-sequence-data/wiki/MNIST-digits-stroke-sequence-data
[2] Voelker, A. et al., 2019, NeurIPS
[3] https://www.nengo.ai/examples/
[2]
[1]
[3]
Large Scale Virtual Brain Simulation
Methods
• Semantic Pointer Architecture Unified Network (SPAUN)
• Using Nengo
• 2.5 million LIF neurons
• Success on 8 diverse tasks
• Copy drawing style
• Image recognition
• Reinforcement learning
• Serial working memory
• Counting
• Question Answering
• Rapid variable creation
• Fluid reasoning
[1] Eliasmith et al., 2012, Science
[1]
Future of SNN
Neuromorphic chip
05
Neuromorphic Advantages
Advantages
• Sparsification over time → Less communication
• Less communication → Fewer memory lookups
• Cheaper computation → Sum instead of multiply (see the sketch below)
[1] Jeehyun Kwak and Hyun Jae Jang, Neural Computation Lab (NCL), Korea Univ.
[1]
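The "sum instead of multiply" point can be illustrated with a toy comparison (illustrative shapes and sparsity, not a benchmark): with binary spikes, a weighted sum reduces to accumulating the weight columns of the neurons that fired, rather than a multiply-accumulate for every synapse.

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 10))          # 4 outputs, 10 inputs

rate_act = rng.random(10)             # ANN-style continuous activations
spikes = rng.random(10) < 0.2         # SNN-style sparse binary spike vector

dense_out = W @ rate_act              # multiply-accumulate over every synapse
event_out = W[:, spikes].sum(axis=1)  # only add the columns that spiked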
Neuromorphic Advantages
Neuromorphic Processing Unit
[1] Eliasmith and Suma, The Neuromorphic Advantage, Applied Brain Research (ABR)
[1]
Intel Loihi Chip
Neuromorphic Advantages
[1] Eliasmith and Suma, The Neuromorphic Advantage, Applied Brain Research (ABR)
[1]
Neuromorphic Advantages
[1] Eliasmith et al., 2016, arXiv
[2] Jang, H. J. et al., 2020, Science Advances
[1]
[2]
3D neuron model
Computational Neuroscience
[1] Trappenberg, T. P., 2009, Fundamentals of computational neuroscience, OUP Oxford
[1]
Thank you :-)
Conversations about SNNs are always welcome.
bananaband657@gmail.com
Weitere ähnliche Inhalte

Was ist angesagt?

13_Properties of Nanomaterials.pptx
13_Properties of Nanomaterials.pptx13_Properties of Nanomaterials.pptx
13_Properties of Nanomaterials.pptxRABEYABASORI
 
Introduction to density functional theory
Introduction to density functional theory Introduction to density functional theory
Introduction to density functional theory Sarthak Hajirnis
 
Junctionless Transistor
Junctionless Transistor Junctionless Transistor
Junctionless Transistor Durgarao Gundu
 
Neuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkar
Neuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkarNeuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkar
Neuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkarDr Amit Vatkar
 
P-QRS-T peak detection of ECG signal by MATLAB
P-QRS-T peak detection of ECG signal by MATLABP-QRS-T peak detection of ECG signal by MATLAB
P-QRS-T peak detection of ECG signal by MATLABDiptaRoy2
 
Piezoelectric nanogenerator
Piezoelectric nanogeneratorPiezoelectric nanogenerator
Piezoelectric nanogeneratorMuzamilg12
 
Materials Design in the Age of Deep Learning and Quantum Computation
Materials Design in the Age of Deep Learning and Quantum ComputationMaterials Design in the Age of Deep Learning and Quantum Computation
Materials Design in the Age of Deep Learning and Quantum ComputationKAMAL CHOUDHARY
 
Recent neutrino oscillation results from T2K
Recent neutrino oscillation results from T2KRecent neutrino oscillation results from T2K
Recent neutrino oscillation results from T2KSon Cao
 
Nano generators 242
Nano generators 242Nano generators 242
Nano generators 242Justin Rams
 
Local Field Potential (LFP): Literature Review
Local Field Potential (LFP): Literature ReviewLocal Field Potential (LFP): Literature Review
Local Field Potential (LFP): Literature ReviewMd Kafiul Islam
 
Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...
Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...
Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...Paul Ahern
 
Solar Cells Lecture 2: Physics of Crystalline Solar Cells
Solar Cells Lecture 2: Physics of Crystalline Solar CellsSolar Cells Lecture 2: Physics of Crystalline Solar Cells
Solar Cells Lecture 2: Physics of Crystalline Solar CellsTuong Do
 
Interband and intraband electronic transition in quantum nanostructures
Interband and intraband  electronic transition in quantum nanostructuresInterband and intraband  electronic transition in quantum nanostructures
Interband and intraband electronic transition in quantum nanostructuresGandhimathi Muthuselvam
 
Nantennas/Nano antenna
Nantennas/Nano antenna Nantennas/Nano antenna
Nantennas/Nano antenna Nagarjuna P
 
Quantum transport in semiconductor nanostructures
Quantum transport in semiconductor nanostructuresQuantum transport in semiconductor nanostructures
Quantum transport in semiconductor nanostructuressamrat saurabh
 

Was ist angesagt? (20)

EEG Generators
EEG GeneratorsEEG Generators
EEG Generators
 
13_Properties of Nanomaterials.pptx
13_Properties of Nanomaterials.pptx13_Properties of Nanomaterials.pptx
13_Properties of Nanomaterials.pptx
 
Introduction to density functional theory
Introduction to density functional theory Introduction to density functional theory
Introduction to density functional theory
 
Highly efficient organic devices.
Highly efficient organic devices.Highly efficient organic devices.
Highly efficient organic devices.
 
Introduction to DFT Part 2
Introduction to DFT Part 2Introduction to DFT Part 2
Introduction to DFT Part 2
 
Junctionless Transistor
Junctionless Transistor Junctionless Transistor
Junctionless Transistor
 
Neuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkar
Neuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkarNeuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkar
Neuro physiology of seizures & eeg, pedaitric neurologist, dr amit vatkar
 
P-QRS-T peak detection of ECG signal by MATLAB
P-QRS-T peak detection of ECG signal by MATLABP-QRS-T peak detection of ECG signal by MATLAB
P-QRS-T peak detection of ECG signal by MATLAB
 
Piezoelectric nanogenerator
Piezoelectric nanogeneratorPiezoelectric nanogenerator
Piezoelectric nanogenerator
 
Materials Design in the Age of Deep Learning and Quantum Computation
Materials Design in the Age of Deep Learning and Quantum ComputationMaterials Design in the Age of Deep Learning and Quantum Computation
Materials Design in the Age of Deep Learning and Quantum Computation
 
Lecture 31 maxwell's equations. em waves.
Lecture 31   maxwell's equations. em waves.Lecture 31   maxwell's equations. em waves.
Lecture 31 maxwell's equations. em waves.
 
Recent neutrino oscillation results from T2K
Recent neutrino oscillation results from T2KRecent neutrino oscillation results from T2K
Recent neutrino oscillation results from T2K
 
Textbook of electroencephalography
Textbook of electroencephalographyTextbook of electroencephalography
Textbook of electroencephalography
 
Nano generators 242
Nano generators 242Nano generators 242
Nano generators 242
 
Local Field Potential (LFP): Literature Review
Local Field Potential (LFP): Literature ReviewLocal Field Potential (LFP): Literature Review
Local Field Potential (LFP): Literature Review
 
Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...
Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...
Paul Ahern - Time of Flight Secondary Ion Mass Spectroscopy [ToF-SIMS] theory...
 
Solar Cells Lecture 2: Physics of Crystalline Solar Cells
Solar Cells Lecture 2: Physics of Crystalline Solar CellsSolar Cells Lecture 2: Physics of Crystalline Solar Cells
Solar Cells Lecture 2: Physics of Crystalline Solar Cells
 
Interband and intraband electronic transition in quantum nanostructures
Interband and intraband  electronic transition in quantum nanostructuresInterband and intraband  electronic transition in quantum nanostructures
Interband and intraband electronic transition in quantum nanostructures
 
Nantennas/Nano antenna
Nantennas/Nano antenna Nantennas/Nano antenna
Nantennas/Nano antenna
 
Quantum transport in semiconductor nanostructures
Quantum transport in semiconductor nanostructuresQuantum transport in semiconductor nanostructures
Quantum transport in semiconductor nanostructures
 

Ähnlich wie 파이콘 한국 2020) 파이썬으로 구현하는 신경세포 기반의 인공 뇌 시뮬레이터

Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...Jason Tsai
 
Computational neuropharmacology drug designing
Computational neuropharmacology drug designingComputational neuropharmacology drug designing
Computational neuropharmacology drug designingRevathi Boyina
 
Lect1_Threshold_Logic_Unit lecture 1 - ANN
Lect1_Threshold_Logic_Unit  lecture 1 - ANNLect1_Threshold_Logic_Unit  lecture 1 - ANN
Lect1_Threshold_Logic_Unit lecture 1 - ANNMostafaHazemMostafaa
 
2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf
2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf
2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdfpierstanislaopaolucc1
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfAsst.prof M.Gokilavani
 
Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...
Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...
Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...InsideScientific
 
Artificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisArtificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisAdityendra Kumar Singh
 
Neural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptxNeural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptxisaac405396
 
Neural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptxNeural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptxisaac405396
 
Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...
Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...
Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...Lviv Startup Club
 
AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...
AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...
AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...GeeksLab Odessa
 
Neural Netwrok
Neural NetwrokNeural Netwrok
Neural NetwrokRabin BK
 
Annintro
AnnintroAnnintro
Annintroairsrch
 
NPTEL Deep Learning Lecture notes for session 2
NPTEL Deep Learning Lecture notes for session 2NPTEL Deep Learning Lecture notes for session 2
NPTEL Deep Learning Lecture notes for session 2maheswari285864
 
Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...
Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...
Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...InsideScientific
 
Blue Brain_Nikhilesh+Krishna Raj
Blue Brain_Nikhilesh+Krishna RajBlue Brain_Nikhilesh+Krishna Raj
Blue Brain_Nikhilesh+Krishna RajKrishna Raj .S
 

Ähnlich wie 파이콘 한국 2020) 파이썬으로 구현하는 신경세포 기반의 인공 뇌 시뮬레이터 (20)

Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...
Introduction to Spiking Neural Networks: From a Computational Neuroscience pe...
 
Computational neuropharmacology drug designing
Computational neuropharmacology drug designingComputational neuropharmacology drug designing
Computational neuropharmacology drug designing
 
Nencki321 day2
Nencki321 day2Nencki321 day2
Nencki321 day2
 
Lect1_Threshold_Logic_Unit lecture 1 - ANN
Lect1_Threshold_Logic_Unit  lecture 1 - ANNLect1_Threshold_Logic_Unit  lecture 1 - ANN
Lect1_Threshold_Logic_Unit lecture 1 - ANN
 
2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf
2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf
2023-1113e-INFN-Seminari-Paolucci-BioInspiredSpikingLearningSleepCycles.pdf
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
 
Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...
Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...
Studying Epilepsy in Awake Head-Fixed Mice Using Microscopy, Electrophysiolog...
 
Artificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisArtificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical Diagnosis
 
Neural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptxNeural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptx
 
Neural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptxNeural Network Presentation Draft Updated March.pptx
Neural Network Presentation Draft Updated March.pptx
 
Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...
Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...
Dimitri Nowicki - “Beyond McCulloch and Pitts: spiking network, deep learning...
 
AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...
AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...
AI&BigData Lab 2016. Дмитрий Новицкий: cпайковые и бионические нейронные сети...
 
Basics of Neural Networks
Basics of Neural NetworksBasics of Neural Networks
Basics of Neural Networks
 
Neural Netwrok
Neural NetwrokNeural Netwrok
Neural Netwrok
 
Annintro
AnnintroAnnintro
Annintro
 
NPTEL Deep Learning Lecture notes for session 2
NPTEL Deep Learning Lecture notes for session 2NPTEL Deep Learning Lecture notes for session 2
NPTEL Deep Learning Lecture notes for session 2
 
Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...
Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...
Making Optical and Electrophysiological Measurements in the Brain of Head-Fix...
 
annintro.ppt
annintro.pptannintro.ppt
annintro.ppt
 
Blue Brain
Blue BrainBlue Brain
Blue Brain
 
Blue Brain_Nikhilesh+Krishna Raj
Blue Brain_Nikhilesh+Krishna RajBlue Brain_Nikhilesh+Krishna Raj
Blue Brain_Nikhilesh+Krishna Raj
 

Mehr von Seonghyun Kim

코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구
코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구
코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구Seonghyun Kim
 
뇌의 정보처리와 멀티모달 인공지능
뇌의 정보처리와 멀티모달 인공지능뇌의 정보처리와 멀티모달 인공지능
뇌의 정보처리와 멀티모달 인공지능Seonghyun Kim
 
인공지능과 윤리
인공지능과 윤리인공지능과 윤리
인공지능과 윤리Seonghyun Kim
 
한국어 개체명 인식 과제에서의 의미 모호성 연구
한국어 개체명 인식 과제에서의 의미 모호성 연구한국어 개체명 인식 과제에서의 의미 모호성 연구
한국어 개체명 인식 과제에서의 의미 모호성 연구Seonghyun Kim
 
Backpropagation and the brain review
Backpropagation and the brain reviewBackpropagation and the brain review
Backpropagation and the brain reviewSeonghyun Kim
 
Theories of error back propagation in the brain review
Theories of error back propagation in the brain reviewTheories of error back propagation in the brain review
Theories of error back propagation in the brain reviewSeonghyun Kim
 
KorQuAD v1.0 참관기
KorQuAD v1.0 참관기KorQuAD v1.0 참관기
KorQuAD v1.0 참관기Seonghyun Kim
 
딥러닝 기반 자연어 언어모델 BERT
딥러닝 기반 자연어 언어모델 BERT딥러닝 기반 자연어 언어모델 BERT
딥러닝 기반 자연어 언어모델 BERTSeonghyun Kim
 
Enriching Word Vectors with Subword Information
Enriching Word Vectors with Subword InformationEnriching Word Vectors with Subword Information
Enriching Word Vectors with Subword InformationSeonghyun Kim
 
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT: Pre-training of Deep Bidirectional Transformers for Language UnderstandingBERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT: Pre-training of Deep Bidirectional Transformers for Language UnderstandingSeonghyun Kim
 
Korean-optimized Word Representations for Out of Vocabulary Problems caused b...
Korean-optimized Word Representations for Out of Vocabulary Problems caused b...Korean-optimized Word Representations for Out of Vocabulary Problems caused b...
Korean-optimized Word Representations for Out of Vocabulary Problems caused b...Seonghyun Kim
 
The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...
The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...
The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...Seonghyun Kim
 
Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...
Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...
Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...Seonghyun Kim
 
How Environment and Self-motion Combine in Neural Representations of Space
How Environment and Self-motion Combine in Neural Representations of SpaceHow Environment and Self-motion Combine in Neural Representations of Space
How Environment and Self-motion Combine in Neural Representations of SpaceSeonghyun Kim
 
Computational Cognitive Models of Spatial Memory in Navigation Space: A review
Computational Cognitive Models of Spatial Memory in Navigation Space: A reviewComputational Cognitive Models of Spatial Memory in Navigation Space: A review
Computational Cognitive Models of Spatial Memory in Navigation Space: A reviewSeonghyun Kim
 
Learning Anticipation via Spiking Networks: Application to Navigation Control
Learning Anticipation via Spiking Networks: Application to Navigation ControlLearning Anticipation via Spiking Networks: Application to Navigation Control
Learning Anticipation via Spiking Networks: Application to Navigation ControlSeonghyun Kim
 
A goal-directed spatial navigation model using forward trajectory planning ba...
A goal-directed spatial navigation model using forward trajectory planning ba...A goal-directed spatial navigation model using forward trajectory planning ba...
A goal-directed spatial navigation model using forward trajectory planning ba...Seonghyun Kim
 

Mehr von Seonghyun Kim (18)

코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구
코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구
코드 스위칭 코퍼스 기반 다국어 LLM의 지식 전이 연구
 
뇌의 정보처리와 멀티모달 인공지능
뇌의 정보처리와 멀티모달 인공지능뇌의 정보처리와 멀티모달 인공지능
뇌의 정보처리와 멀티모달 인공지능
 
인공지능과 윤리
인공지능과 윤리인공지능과 윤리
인공지능과 윤리
 
한국어 개체명 인식 과제에서의 의미 모호성 연구
한국어 개체명 인식 과제에서의 의미 모호성 연구한국어 개체명 인식 과제에서의 의미 모호성 연구
한국어 개체명 인식 과제에서의 의미 모호성 연구
 
Backpropagation and the brain review
Backpropagation and the brain reviewBackpropagation and the brain review
Backpropagation and the brain review
 
Theories of error back propagation in the brain review
Theories of error back propagation in the brain reviewTheories of error back propagation in the brain review
Theories of error back propagation in the brain review
 
KorQuAD v1.0 참관기
KorQuAD v1.0 참관기KorQuAD v1.0 참관기
KorQuAD v1.0 참관기
 
딥러닝 기반 자연어 언어모델 BERT
딥러닝 기반 자연어 언어모델 BERT딥러닝 기반 자연어 언어모델 BERT
딥러닝 기반 자연어 언어모델 BERT
 
Enriching Word Vectors with Subword Information
Enriching Word Vectors with Subword InformationEnriching Word Vectors with Subword Information
Enriching Word Vectors with Subword Information
 
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT: Pre-training of Deep Bidirectional Transformers for Language UnderstandingBERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
 
Korean-optimized Word Representations for Out of Vocabulary Problems caused b...
Korean-optimized Word Representations for Out of Vocabulary Problems caused b...Korean-optimized Word Representations for Out of Vocabulary Problems caused b...
Korean-optimized Word Representations for Out of Vocabulary Problems caused b...
 
챗봇의 역사
챗봇의 역사챗봇의 역사
챗봇의 역사
 
The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...
The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...
The hippocampo-cortical loop: Spatio-temporal learning and goal-oriented plan...
 
Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...
Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...
Computational Properties of the Hippocampus Increase the Efficiency of Goal-D...
 
How Environment and Self-motion Combine in Neural Representations of Space
How Environment and Self-motion Combine in Neural Representations of SpaceHow Environment and Self-motion Combine in Neural Representations of Space
How Environment and Self-motion Combine in Neural Representations of Space
 
Computational Cognitive Models of Spatial Memory in Navigation Space: A review
Computational Cognitive Models of Spatial Memory in Navigation Space: A reviewComputational Cognitive Models of Spatial Memory in Navigation Space: A review
Computational Cognitive Models of Spatial Memory in Navigation Space: A review
 
Learning Anticipation via Spiking Networks: Application to Navigation Control
Learning Anticipation via Spiking Networks: Application to Navigation ControlLearning Anticipation via Spiking Networks: Application to Navigation Control
Learning Anticipation via Spiking Networks: Application to Navigation Control
 
A goal-directed spatial navigation model using forward trajectory planning ba...
A goal-directed spatial navigation model using forward trajectory planning ba...A goal-directed spatial navigation model using forward trajectory planning ba...
A goal-directed spatial navigation model using forward trajectory planning ba...
 

Kürzlich hochgeladen

Thyroid Physiology_Dr.E. Muralinath_ Associate Professor
Thyroid Physiology_Dr.E. Muralinath_ Associate ProfessorThyroid Physiology_Dr.E. Muralinath_ Associate Professor
Thyroid Physiology_Dr.E. Muralinath_ Associate Professormuralinath2
 
High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...
High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...
High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...chandars293
 
The Mariana Trench remarkable geological features on Earth.pptx
The Mariana Trench remarkable geological features on Earth.pptxThe Mariana Trench remarkable geological features on Earth.pptx
The Mariana Trench remarkable geological features on Earth.pptxseri bangash
 
Zoology 5th semester notes( Sumit_yadav).pdf
Zoology 5th semester notes( Sumit_yadav).pdfZoology 5th semester notes( Sumit_yadav).pdf
Zoology 5th semester notes( Sumit_yadav).pdfSumit Kumar yadav
 
COST ESTIMATION FOR A RESEARCH PROJECT.pptx
COST ESTIMATION FOR A RESEARCH PROJECT.pptxCOST ESTIMATION FOR A RESEARCH PROJECT.pptx
COST ESTIMATION FOR A RESEARCH PROJECT.pptxFarihaAbdulRasheed
 
Module for Grade 9 for Asynchronous/Distance learning
Module for Grade 9 for Asynchronous/Distance learningModule for Grade 9 for Asynchronous/Distance learning
Module for Grade 9 for Asynchronous/Distance learninglevieagacer
 
chemical bonding Essentials of Physical Chemistry2.pdf
chemical bonding Essentials of Physical Chemistry2.pdfchemical bonding Essentials of Physical Chemistry2.pdf
chemical bonding Essentials of Physical Chemistry2.pdfTukamushabaBismark
 
GBSN - Biochemistry (Unit 1)
GBSN - Biochemistry (Unit 1)GBSN - Biochemistry (Unit 1)
GBSN - Biochemistry (Unit 1)Areesha Ahmad
 
High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑
High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑
High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑Damini Dixit
 
Molecular markers- RFLP, RAPD, AFLP, SNP etc.
Molecular markers- RFLP, RAPD, AFLP, SNP etc.Molecular markers- RFLP, RAPD, AFLP, SNP etc.
Molecular markers- RFLP, RAPD, AFLP, SNP etc.Silpa
 
SAMASTIPUR CALL GIRL 7857803690 LOW PRICE ESCORT SERVICE
SAMASTIPUR CALL GIRL 7857803690  LOW PRICE  ESCORT SERVICESAMASTIPUR CALL GIRL 7857803690  LOW PRICE  ESCORT SERVICE
SAMASTIPUR CALL GIRL 7857803690 LOW PRICE ESCORT SERVICEayushi9330
 
Introduction to Viruses
Introduction to VirusesIntroduction to Viruses
Introduction to VirusesAreesha Ahmad
 
GBSN - Microbiology (Unit 1)
GBSN - Microbiology (Unit 1)GBSN - Microbiology (Unit 1)
GBSN - Microbiology (Unit 1)Areesha Ahmad
 
Grade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsGrade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsOrtegaSyrineMay
 
Pulmonary drug delivery system M.pharm -2nd sem P'ceutics
Pulmonary drug delivery system M.pharm -2nd sem P'ceuticsPulmonary drug delivery system M.pharm -2nd sem P'ceutics
Pulmonary drug delivery system M.pharm -2nd sem P'ceuticssakshisoni2385
 
pumpkin fruit fly, water melon fruit fly, cucumber fruit fly
pumpkin fruit fly, water melon fruit fly, cucumber fruit flypumpkin fruit fly, water melon fruit fly, cucumber fruit fly
pumpkin fruit fly, water melon fruit fly, cucumber fruit flyPRADYUMMAURYA1
 
Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....
Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....
Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....muralinath2
 
development of diagnostic enzyme assay to detect leuser virus
development of diagnostic enzyme assay to detect leuser virusdevelopment of diagnostic enzyme assay to detect leuser virus
development of diagnostic enzyme assay to detect leuser virusNazaninKarimi6
 

Kürzlich hochgeladen (20)

Thyroid Physiology_Dr.E. Muralinath_ Associate Professor
Thyroid Physiology_Dr.E. Muralinath_ Associate ProfessorThyroid Physiology_Dr.E. Muralinath_ Associate Professor
Thyroid Physiology_Dr.E. Muralinath_ Associate Professor
 
High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...
High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...
High Class Escorts in Hyderabad ₹7.5k Pick Up & Drop With Cash Payment 969456...
 
The Mariana Trench remarkable geological features on Earth.pptx
The Mariana Trench remarkable geological features on Earth.pptxThe Mariana Trench remarkable geological features on Earth.pptx
The Mariana Trench remarkable geological features on Earth.pptx
 
Zoology 5th semester notes( Sumit_yadav).pdf
Zoology 5th semester notes( Sumit_yadav).pdfZoology 5th semester notes( Sumit_yadav).pdf
Zoology 5th semester notes( Sumit_yadav).pdf
 
COST ESTIMATION FOR A RESEARCH PROJECT.pptx
COST ESTIMATION FOR A RESEARCH PROJECT.pptxCOST ESTIMATION FOR A RESEARCH PROJECT.pptx
COST ESTIMATION FOR A RESEARCH PROJECT.pptx
 
Module for Grade 9 for Asynchronous/Distance learning
Module for Grade 9 for Asynchronous/Distance learningModule for Grade 9 for Asynchronous/Distance learning
Module for Grade 9 for Asynchronous/Distance learning
 
Site Acceptance Test .
Site Acceptance Test                    .Site Acceptance Test                    .
Site Acceptance Test .
 
chemical bonding Essentials of Physical Chemistry2.pdf
chemical bonding Essentials of Physical Chemistry2.pdfchemical bonding Essentials of Physical Chemistry2.pdf
chemical bonding Essentials of Physical Chemistry2.pdf
 
GBSN - Biochemistry (Unit 1)
GBSN - Biochemistry (Unit 1)GBSN - Biochemistry (Unit 1)
GBSN - Biochemistry (Unit 1)
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑
High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑
High Profile 🔝 8250077686 📞 Call Girls Service in GTB Nagar🍑
 
Molecular markers- RFLP, RAPD, AFLP, SNP etc.
Molecular markers- RFLP, RAPD, AFLP, SNP etc.Molecular markers- RFLP, RAPD, AFLP, SNP etc.
Molecular markers- RFLP, RAPD, AFLP, SNP etc.
 
SAMASTIPUR CALL GIRL 7857803690 LOW PRICE ESCORT SERVICE
SAMASTIPUR CALL GIRL 7857803690  LOW PRICE  ESCORT SERVICESAMASTIPUR CALL GIRL 7857803690  LOW PRICE  ESCORT SERVICE
SAMASTIPUR CALL GIRL 7857803690 LOW PRICE ESCORT SERVICE
 
Introduction to Viruses
Introduction to VirusesIntroduction to Viruses
Introduction to Viruses
 
GBSN - Microbiology (Unit 1)
GBSN - Microbiology (Unit 1)GBSN - Microbiology (Unit 1)
GBSN - Microbiology (Unit 1)
 
Grade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsGrade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its Functions
 
Pulmonary drug delivery system M.pharm -2nd sem P'ceutics
Pulmonary drug delivery system M.pharm -2nd sem P'ceuticsPulmonary drug delivery system M.pharm -2nd sem P'ceutics
Pulmonary drug delivery system M.pharm -2nd sem P'ceutics
 
pumpkin fruit fly, water melon fruit fly, cucumber fruit fly
pumpkin fruit fly, water melon fruit fly, cucumber fruit flypumpkin fruit fly, water melon fruit fly, cucumber fruit fly
pumpkin fruit fly, water melon fruit fly, cucumber fruit fly
 
Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....
Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....
Human & Veterinary Respiratory Physilogy_DR.E.Muralinath_Associate Professor....
 
development of diagnostic enzyme assay to detect leuser virus
development of diagnostic enzyme assay to detect leuser virusdevelopment of diagnostic enzyme assay to detect leuser virus
development of diagnostic enzyme assay to detect leuser virus
 

파이콘 한국 2020) 파이썬으로 구현하는 신경세포 기반의 인공 뇌 시뮬레이터

  • 1. 파 썬으로 구현하는 신경세포 기반의 인공 뇌 시뮬레이터 김성현
  • 2. 저는 인간의 학습 원리와 최신 딥러닝 기술을 융합하여 AGI (Artificial general intelligence) 를 개발하는 꿈을 가지고 있습니다. ✉ bananaband657@gmail.com 🏠 https://banana-media-lab.tistory.com https://github.com/MrBananaHuman
  • 3. Introduction01 - Introduction to neuroscience Spiking Neural Network (SNN)02 - SNN as a neuromorphic neural network model Modeling of SNN03 - Python Nengo library for SNN modeling Applications of SNN04 - Deep SNN models Future of SNN05 - Neuromorphic chip
  • 5. Introduction The brain is the most complex 1.5 kg organ that controls all functions of the body, interprets information from the outside world, and embodies the essence of the mind and soul. Thoughts Perceptions Language Sensations Memories Actions Emotions Learning
  • 6. Introduction The brain is the most complex 1.5 kg organ that controls all functions of the body, interprets information from the outside world, and embodies the essence of the mind and soul. Thoughts Perceptions Language Sensations Memories Actions Emotions Learning
  • 7. History of Neuroscience - Neuron Neuroscience is the study of how the nervous system develops, its structure, and what it does. The first drawing of a neuron as the nerve cell (1865) [1] The first illustrated a synapse (1893, 1897) [2-3] [1] Otto Friedrich Karl Deiters, Deiters, 1865 [2] Sherrington CS, 1897, A textbook of physiology, London:Macmillian, p.1024-70 [3] Cajal R, 1893, Arch Anat Physiol Anat Abth., V & VI:310-428
  • 8. History of Neuroscience - Neuron A typical neuron consists of a cell body (soma), dendrites, and a single axon. [1] https://ib.bioninja.com.au/standard-level/topic-6-human-physiology/65-neurons-and-synapses/neurons.html Synapse Dendrite Nucleus Soma (Cell body) Axon terminal Myelin sheath Axon Synapse [1]
  • 9. History of Neuroscience – Action Potential An action potential is a rapid rise and subsequent fall in voltage or membrane potential across a cellular membrane with a characteristic pattern. [3] [1] [2] [1] How big is the GIANT Squid Giant Axon?, @TheCellularScale [2] Hodgkin AL & Huxley AF, 1945, J Physiol [3] https://www.moleculardevices.com/applications/patch-clamp-electrophysiology/what-action-potential#gref
  • 10. History of Neuroscience - Synapse Synapses are biological junctions through which neurons' signals can be sent to each other. [2] [1] https://synapseweb.clm.utexas.edu/type-1-synapse [2] Besson, P., 2017, Doctoral dissetation Presynaptic neuron Postsynaptic neuron Synpase [1] Excitatory postsynaptic potential (EPSP) Inhibitory postsynaptic potential (IPSP)
  • 11. History of Neuroscience - Synaptic Plasticity in Synapse Synaptic plasticity refers to the phenomenon whereby strength of synaptic connections between neurons changes over time. [1] M G LARRABEE, D W BRONK, 1947, J Neurophysiol. Presynaptic neuron Postsynaptic neuron Before stimulating After stimulating Action potentials recorded from the postganglionic nerve (1947) [1]
  • 12. History of Neuroscience - The Brain Neuron Synapse Plasticity Dendrite Nucleus Soma (cell body) Axon terminal Myelin sheath Axon • 86 Billion • 10–25 μm • > 1,000 types • 7,000 syn/neuron • 100-500 trillion • Potentiation • Depression [1] https://ib.bioninja.com.au/standard-level/topic-6-human-physiology/65-neurons-and-synapses/neurons.html [2] https://commons.wikimedia.org/w/index.php?curid=41349083 [3] https://sites.google.com/site/mcauliffeneur493/home/synaptic-plasticity [1] [2] [3]
  • 13. Artificial Neural Network (ANN) Revolution [1] ANN is abstract model that mimics the complex structure and functioning of the brain, which is developing explosively in recent years. [1] A brief history of neural nets and deep learning by A. Kurenkov
  • 14. Limitation of ANN Despite the success of the ANN algorithm, it has clear limitations. Computational limitations [1] Whittington and Bogacz, 2019, Trends in Cognitive Sciences [2] Grossberg, 1987, Cognitive Science [3] Lillicrap et al., 2020, Nature Review Neuroscience • Lack of local error representation → Vanishing gradient [1] • Symmetry of forwards and backwards weights → Weight transport problem [2] • Feedback in brains alters neural activity [3] • Unrealistic models of neurons → Large computational cost [1] • Error signals are singed and potentially extreme-valued → Over fitting [3]
  • 15. How Does The Brain Learn? [1] [1] Brainbow Hippocampus, Greg Dunn and Brian Edwards, 2014 [2] https://blogs.cardiff.ac.uk/acerringtonlab/ca1-pyramidal-neuron-red-hot/ [2]
  • 16. Spiking Neural Network (SNN) SNN as a neuromorphic neural network model 02
  • 17. Overview SNNs operate using spikes, which are discrete events that take place at points in time, rather than continuous values. [1] Anwani and Rajendran, 2015, IJCNN [1] Components • Spiking neuron model • Synapse • Synaptic plasticity
  • 18. Spiking Neuron Model - Leaky Integrate-and-Fire (LIF)Model A spiking neuron model is a mathematical description of the properties of certain cells in the nervous system that generate sharp electrical potentials across their cell membrane, roughly one millisecond in duration. [1] Teka, W. et al., 2014, PLoS Comput Biol. [Appendix 1] https://www.youtube.com/watch?v=2_MIjvwWsrg [Appendix 2] https://www.youtube.com/watch?v=KXnHxZdn8NU Characteristics • Subthreshold leaky- integrator dynamic • A firing threshold • Reset mechanism Resistor-Capacitor (RC) circuit [1]
  • 19. Spiking Neuron Model - Leaky Integrate-and-Fire (LIF)Model A spiking neuron model is a mathematical description of the properties of certain cells in the nervous system that generate sharp electrical potentials across their cell membrane, roughly one millisecond in duration. Characteristics • Subthreshold leaky- integrator dynamic • A firing threshold • Reset mechanism [1] Louis Lapicque, 1907, Journal de Physiologie et de Pathologie Générale. [Appendix 1] https://www.youtube.com/watch?v=2_MIjvwWsrg [Appendix 2] https://www.youtube.com/watch?v=KXnHxZdn8NU Leaky Integrate-and-Fire model [1]
  • 20. Synapse Model The synapse model activates as an input current stimulation to the spiking neuron model. [1] Dutta, S. et al., 2017, Scientific reports [1]
  • 21. Synaptic Plasticity - Learning in the Brain To reduce punishment To improve knowledge (reward(?)) ANN output target Error function Loss function Learning rate Error signal SNN • Unsupervised learning • Fire together, wire together • STDP learning • BCM learning • Supervised learning • Local error propagation • TP learning • PES learning [1] Timothy P. Lillicrap et al., 2020, Nat Rev Neurosci. [1] [1] [1]
  • 22. Unsupervised Learning - Spike Timing Dependent Plasticity (STDP) The Spike Timing Dependent Plasticity (STDP) algorithm, which has been observed in the mammalian brain, modulates the weight of a synapse based on the relative timing of presynaptic and postsynaptic spikes. [1-3] [1] Wang, R. et al., 2016, ISCAS [2] Gerstner et al., 1996, Nature [3] Bi and Poo, 1998, Journal of Neuroscience [2-3] [1] PostPre Spike Spike Time (ms) Δt
  • 23. Unsupervised Learning - Bienenstock, Cooper & Munro (BCM) The BCM model proposes a sliding threshold for long-term potentiation (LTP) or long-term depression (LTD) induction, and states that synaptic plasticity is stabilized by a dynamic adaptation of the time-averaged postsynaptic activity. [1] Bienenstock, Cooper & Munro 1982 J Neurosci Bienenstock, Cooper & Munro (BCM) learning [1] Learning in visual cortex BCM model
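The sliding-threshold idea can be sketched in a few lines: the threshold θ tracks the time-averaged (squared) postsynaptic activity, and the weight is potentiated when the postsynaptic activity exceeds θ and depressed otherwise. The learning rate, averaging window, and activity values below are assumptions for illustration only.

    import numpy as np

    # Minimal BCM sketch: the modification threshold theta slides with the
    # time-averaged postsynaptic activity, so sustained high activity
    # eventually favours depression.
    eta = 1e-3          # learning rate (assumption)
    tau_theta = 0.1     # averaging window for the sliding threshold (s)

    def bcm_step(w, pre, post, theta, dt=1e-3):
        theta += dt * (post ** 2 - theta) / tau_theta      # sliding threshold
        w += dt * eta * pre * post * (post - theta)        # LTP if post > theta, LTD otherwise
        return w, theta

    w, theta = 0.5, 0.0
    for _ in range(1000):
        w, theta = bcm_step(w, pre=1.0, post=2.0, theta=theta)
    print(round(w, 4), round(theta, 3))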
  • 24. Supervised Learning - Target Propagation (TP) Hypothesis • The essential idea is the one behind using a stack of auto-encoders for deep learning • The backward-propagated output target induces hidden-activity targets that should have been realized by the network • Learning proceeds by updating the forward weights to minimize these local layer-wise activity differences Target propagation (TP) learning [1] [1] Timothy P. Lillicrap et al., 2020, Nat Rev Neurosci.
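A rough sketch of the target-propagation idea, using linear layers so the backward mapping can simply be a pseudo-inverse: the output target is propagated back to a hidden-layer target, and each layer then minimizes its own local layer-wise difference. Layer sizes, initial scales, and the learning rate are assumptions, and real TP uses learned approximate inverses rather than a pseudo-inverse.

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = 0.1 * rng.normal(size=(4, 3))    # input -> hidden
    W2 = rng.normal(size=(2, 4))          # hidden -> output
    lr = 0.02

    def tp_step(x, y_target):
        global W1, W2
        h = W1 @ x                                   # forward pass
        y = W2 @ h
        h_target = np.linalg.pinv(W2) @ y_target     # backward-propagated hidden target
        W2 -= lr * np.outer(y - y_target, h)         # local update for the output layer
        W1 -= lr * np.outer(h - h_target, x)         # local update for the hidden layer

    x = rng.normal(size=3)
    x /= np.linalg.norm(x)
    y_target = np.array([0.5, -0.5])
    for _ in range(1000):
        tp_step(x, y_target)
    print(W2 @ (W1 @ x))   # should end up close to y_target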
  • 25. Supervised Learning - Prescribed Error Sensitivity (PES) A connection from x to y learns to output y* by minimizing |y* − y|. Prescribed Error Sensitivity (PES) learning [2] [1] Timothy P. Lillicrap et al., 2020, Nat Rev Neurosci. [2] Voelker, A. R., 2015, Centre for Theoretical Neuroscience
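In decoder terms, the PES idea can be sketched as nudging each decoder along the negative error, weighted by that neuron's activity, so that the decoded value y = d·a moves toward the target y*. The neuron count, activities, and learning rate below are illustrative assumptions; the division by the number of neurons mirrors how Nengo scales the PES learning rate.

    import numpy as np

    rng = np.random.default_rng(0)
    n_neurons = 50
    a = rng.uniform(0, 1, size=n_neurons)   # neuron activities for one fixed input
    d = np.zeros(n_neurons)                 # decoders (what PES adapts)
    y_target = 0.8                          # the prescribed target y*
    kappa = 0.05                            # learning rate (assumption)

    for _ in range(1000):
        y = d @ a                            # decoded estimate
        error = y - y_target                 # E = y - y*
        d -= kappa * error * a / n_neurons   # PES: delta d_i = -(kappa / n) * E * a_i
    print(round(float(d @ a), 3))            # close to y_target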
  • 26. Modeling of Spiking Neural Network (SNN) Python Nengo library for SNN modeling 03
  • 27. Nengo Library The Nengo Brain Maker is a Python package for building, testing, and deploying neural networks based on the Neural Engineering Framework (NEF). [1] https://www.nengo.ai/ [1]
  • 28. Nengo Library The Nengo Brain Maker is a Python package for building, testing, and deploying neural networks based on the Neural Engineering Framework (NEF). [1] https://www.nengo.ai/ [1]
  • 29. Nengo Tutorial
  Installation:
      !pip install nengo
  Usage:
      import nengo
      import numpy as np
  Build a network:
      net = nengo.Network()
      with net:
          sin_input = nengo.Node(output=np.sin)
          input_neuron = nengo.Ensemble(n_neurons=4, dimensions=1)
          nengo.Connection(sin_input, input_neuron)
  (Diagram: Node (Sine) connected to the ensemble)
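One minimal way to run the network just built and read out a decoded value; the probe, filter constant, and run length are arbitrary choices for illustration.

    with net:
        output_probe = nengo.Probe(input_neuron, synapse=0.01)   # filtered decoded output

    with nengo.Simulator(net) as sim:   # build and simulate the model
        sim.run(1.0)                    # run for 1 second of simulated time

    print(sim.data[output_probe].shape)   # (n_timesteps, dimensions)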
  • 30. Spiking Neuron Model Characteristics
  import matplotlib.pyplot as plt
  %matplotlib inline
  from nengo.dists import Choice
  from nengo.utils.ensemble import tuning_curves
  from nengo.utils.matplotlib import rasterplot

  with nengo.Simulator(net) as sim:
      plt.figure()
      plt.plot(*tuning_curves(input_neuron, sim))   # tuning curves of the ensemble built on slide 29
      plt.xlabel("input value")
      plt.ylabel("firing rate")
      plt.xlim(-1, 1)
      plt.title(str(nengo.LIF()))
      sim.run(5.0)
  • 31. Neural Dynamics Characteristics input_neuron = nengo.Ensemble(n_neurons=1, dimensions=1, intercepts=[-.5]) (Figure: firing rate (Hz) vs. input value for intercepts=[-.5], [0], and [.5])
  • 32. Neural Dynamics Characteristics input_neuron = nengo.Ensemble(n_neurons=1, dimensions=1, intercepts=[0], encoders=[[-1]]) (Figure: firing rate (Hz) vs. input value for intercepts=[-.5], [0], and [.5] with encoders=[[-1]])
  • 33. Neural Dynamics Characteristics input_neuron = nengo.Ensemble(n_neurons=1, dimensions=1, intercepts=[0], encoders=[[-1]], max_rates=[100]) (Figure: firing rate (Hz) vs. input value for max_rates=[10], [100], and [200])
  • 34. Neural Dynamics Characteristics input_neuron = nengo.Ensemble(n_neurons=1, dimensions=1, intercepts=[0], encoders=[[-1]], max_rates=[100], radius=1) (Figure: firing rate (Hz) vs. input value for radius=1, 2, and 10)
  • 35. Neural Decoding Characteristics
  with net:
      sin_input = nengo.Node(np.sin)
      input_layer = nengo.Ensemble(n_neurons=2, dimensions=1, intercepts=[-.5, -.5], encoders=[[1], [-1]], max_rates=[100, 100])
      nengo.Connection(sin_input, input_layer)
  (Figure: firing rate (Hz) vs. input value for the two neurons)
  • 36. Neural Decoding Prober
  with net:
      sin_probe = nengo.Probe(sin_input)
      spikes = nengo.Probe(input_layer.neurons)
      filtered = nengo.Probe(input_layer, synapse=0.01)

  with nengo.Simulator(net) as sim:   # build and run before reading the probe data
      sim.run(10.0)                   # 10 s, matching the plot window below

  t = sim.trange()
  # Plot the spiking output of the ensemble
  plt.figure(figsize=(10, 8))
  plt.subplot(2, 2, 1)
  rasterplot(t, sim.data[spikes], colors=[(1, 0, 0), (0, 0, 0)])
  plt.yticks((1, 2), ("On neuron", "Off neuron"))
  plt.ylim(2.5, 0.5)
  # Plot the decoded output of the ensemble
  plt.figure()
  plt.plot(t, sim.data[filtered])
  plt.plot(t, sim.data[sin_probe])
  plt.xlim(0, 10)
  • 37. Neural Decoding Prober (code identical to slide 36)
  • 38. Neural Decoding Characteristics
  net = nengo.Network()
  with net:
      sin_input = nengo.Node(np.sin)
      input_layer = nengo.Ensemble(n_neurons=100, dimensions=1)
      nengo.Connection(sin_input, input_layer)
  (Figures: firing rate (Hz) vs. input value for the 100 tuning curves; decoded input value vs. time (s))
  • 39. Image Processing Input function custom
  import gzip
  import pickle
  from urllib.request import urlretrieve

  urlretrieve("http://deeplearning.net/data/mnist/mnist.pkl.gz", "mnist.pkl.gz")
  with gzip.open("mnist.pkl.gz") as f:
      train_data, _, test_data = pickle.load(f, encoding="latin1")
  train_data = list(train_data)

  def image_input(t):
      # MNIST image data to Model: one image per second of simulated time
      img = train_data[0][int(t)]
      return img

  net = nengo.Network()
  neuron_number = 28 * 28
  with net:
      input_node = nengo.Node(image_input)
      pre_neuron = nengo.Ensemble(neuron_number, dimensions=neuron_number, max_rates=[100] * neuron_number, intercepts=[0] * neuron_number)
      nengo.Connection(input_node, pre_neuron)
  • 40. Image Processing (code identical to slide 39; figure: neuron number (28×28) vs. time (s))
  • 41. Voice Processing Input function custom
  def voice_input(t):
      # transposed_norm_S and frame_size come from a precomputed, normalized spectrogram (not shown)
      ms = int(t * 1000)
      frame_num = int(ms / frame_size)
      voice = transposed_norm_S[frame_num]
      return voice

  with nengo.Network() as net:
      voice_node = nengo.Node(output=voice_input)
      input_neuron = nengo.Ensemble(n_neurons=80, dimensions=1, max_rates=[100] * 80)   # dimensions should match the length of each voice frame
      nengo.Connection(voice_node, input_neuron, synapse=0.01)
      spike_probe = nengo.Probe(input_neuron)
  • 42. Voice Processing (code identical to slide 41)
  • 43. Unsupervised Learning BCM learning
  net = nengo.Network()
  with net:
      sin = nengo.Node(lambda t: np.sin(t * 4))
      pre = nengo.Ensemble(100, dimensions=1)
      post = nengo.Ensemble(100, dimensions=1)
      nengo.Connection(sin, pre)
      # BCM adapts the full neuron-to-neuron weight matrix, so solve for weights
      conn = nengo.Connection(pre, post, solver=nengo.solvers.LstsqL2(weights=True))
      conn.learning_rule_type = nengo.BCM(learning_rate=5e-10)
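To actually see the BCM rule at work, one option is to probe the connection weights (and the pre/post populations) during a longer run; the probe settings and run length below are one reasonable choice, not part of the original slide.

    with net:
        pre_probe = nengo.Probe(pre, synapse=0.01)
        post_probe = nengo.Probe(post, synapse=0.01)
        weights_probe = nengo.Probe(conn, "weights", synapse=0.01, sample_every=0.01)

    with nengo.Simulator(net) as sim:
        sim.run(20.0)   # the BCM learning rate is tiny, so give it some time

    print(sim.data[weights_probe].shape)   # (n_samples, post neurons, pre neurons)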
  • 44. Supervised Learning Without PES learning
  from nengo.processes import WhiteSignal

  net = nengo.Network()
  with net:
      noise_input = nengo.Node(WhiteSignal(60, high=5), size_out=1)
      input_layer = nengo.Ensemble(60, dimensions=1)
      output_layer = nengo.Ensemble(60, dimensions=1)
      nengo.Connection(noise_input, input_layer)
      conn = nengo.Connection(input_layer, output_layer)

  with nengo.Simulator(net) as sim:
      sim.run(10.0)
  (Diagram: Node (Noise) → Input → Output)
  • 45. Supervised Learning With PES learning
  from nengo.processes import WhiteSignal

  net = nengo.Network()
  with net:
      noise_input = nengo.Node(WhiteSignal(60, high=5), size_out=1)
      input_layer = nengo.Ensemble(60, dimensions=1)
      output_layer = nengo.Ensemble(60, dimensions=1)
      nengo.Connection(noise_input, input_layer)
      conn = nengo.Connection(input_layer, output_layer)

      error_neuron = nengo.Ensemble(60, dimensions=1)
      nengo.Connection(output_layer, error_neuron)
      nengo.Connection(input_layer, error_neuron, transform=-1)   # error = output - input

      conn.learning_rule_type = nengo.PES()
      nengo.Connection(error_neuron, conn.learning_rule)

  with nengo.Simulator(net) as sim:
      sim.run(10.0)
  (Diagram: Node (Noise) → Input → Output, with Error = Output − Input driving the learned connection)
  • 46. Supervised Learning With PES learning (network code identical to slide 45)
  • 47. Supervised Learning With PES learning (network code identical to slide 45)
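To check that the PES-trained connection is learning to reproduce its input, one can probe the input, output, and error populations of the network from slide 45 and plot them; the probe filters and run length are again illustrative choices.

    import matplotlib.pyplot as plt

    with net:
        input_probe = nengo.Probe(input_layer, synapse=0.01)
        output_probe = nengo.Probe(output_layer, synapse=0.01)
        error_probe = nengo.Probe(error_neuron, synapse=0.01)

    with nengo.Simulator(net) as sim:
        sim.run(10.0)

    t = sim.trange()
    plt.figure()
    plt.plot(t, sim.data[input_probe], label="input")
    plt.plot(t, sim.data[output_probe], label="output (learned)")
    plt.plot(t, sim.data[error_probe], label="error")
    plt.legend()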
  • 48. Keras Model Converting [1] [1] https://towardsdatascience.com/mnist-handwritten-digits-classification-using-a-convolutional-neural-network-cnn-af5fafbc35e9
  • 49. Keras Model Converting MNIST model converting
  epochs = 2
  converter = nengo_dl.Converter(
      model, swap_activations={tf.nn.relu: nengo.RectifiedLinear()}
  )
  with nengo_dl.Simulator(converter.net, seed=0, minibatch_size=200) as sim:
      sim.compile(
          optimizer=tf.optimizers.RMSprop(0.001),
          loss={
              converter.outputs[dense1]: tf.losses.SparseCategoricalCrossentropy(
                  from_logits=True
              )
          },
          metrics={converter.outputs[dense1]: tf.metrics.sparse_categorical_accuracy},
      )
      sim.fit(
          {converter.inputs[inp]: train_images},
          {converter.outputs[dense1]: train_labels},
          epochs=epochs,
      )
      sim.save_params("./mnist_model")
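After training, the saved parameters can be reloaded for evaluation with NengoDL's Keras-style API. The sketch below assumes test_images and test_labels have been prepared the same way as the training arrays (shapes (batch, 1, 28*28) and (batch, 1, 1)); it is one plausible continuation of the slide's code, not part of the original deck.

    with nengo_dl.Simulator(converter.net, minibatch_size=200) as sim:
        sim.load_params("./mnist_model")   # reload the weights saved above
        sim.compile(
            loss={converter.outputs[dense1]: tf.losses.SparseCategoricalCrossentropy(from_logits=True)},
            metrics={converter.outputs[dense1]: tf.metrics.sparse_categorical_accuracy},
        )
        print(sim.evaluate(
            {converter.inputs[inp]: test_images},
            {converter.outputs[dense1]: test_labels},
        ))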
  • 51. Applications of SNN Deep SNN models 04
  • 52. Solving XOR Problem It is known that the XOR problem cannot be solved with a traditional single-layer perceptron, but a Nengo-based SNN can solve the problem with only a single layer (see the sketch below). [1] Gidon et al., 2020, Science [2] https://github.com/sunggukcha/xor [3] https://www.nengo.ai/examples/
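The claim above can be illustrated with a single Nengo ensemble that represents the 2-D input and decodes XOR directly from its spiking activity. This is a minimal hedged sketch, not the exact network from the cited repository; the neuron count, radius, and input schedule are arbitrary choices.

    import numpy as np
    import nengo

    def xor(x):
        # target XOR on inputs treated as 0/1 values
        return float((x[0] > 0.5) != (x[1] > 0.5))

    with nengo.Network() as net:
        # step through the four input patterns (0,0), (1,0), (0,1), (1,1), one per second
        stim = nengo.Node(lambda t: [float(t % 2 > 1), float(t % 4 > 2)])
        hidden = nengo.Ensemble(n_neurons=200, dimensions=2, radius=1.5)
        xor_out = nengo.Node(size_in=1)
        nengo.Connection(stim, hidden)
        # a single ensemble's decoders can approximate the nonlinear XOR mapping
        nengo.Connection(hidden, xor_out, function=xor)
        out_probe = nengo.Probe(xor_out, synapse=0.01)

    with nengo.Simulator(net) as sim:
        sim.run(4.0)   # the decoded output should be near 0, 1, 1, 0 in the four intervals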
  • 53. Permuted Sequential MNIST On the permuted sequential MNIST data, which contains the order information of how the digits are written, the Nengo SNN-based Legendre Memory Unit (LMU) showed SOTA performance. [2] [1] https://github.com/edwin-de-jong/mnist-digits-stroke-sequence-data/wiki/MNIST-digits-stroke-sequence-data [2] Voelker, A. et al., 2019, NeurIPS [3] https://www.nengo.ai/examples/
  • 54. Large Scale Virtual Brain Simulation Methods • Semantic Pointer Architecture Unified Network (SPAUN) • Using Nengo • 2.5 million LIF neurons • Success on 8 diverse tasks • Copy drawing style • Image recognition • Reinforcement learning • Serial working memory • Counting • Question Answering • Rapid variable creation • Fluid reasoning [1] Eliasmith et al., 2012, Science [1]
  • 55. Large Scale Virtual Brain Simulation Methods • Semantic Pointer Architecture Unified Network (SPAUN) • Using Nengo • 2.5 million LIF neurons • Success on 8 diverse tasks • Copy drawing style • Image recognition • Reinforcement learning • Serial working memory • Counting • Question Answering • Rapid variable creation • Fluid reasoning [1] Eliasmith et al., 2012, Science [1]
  • 57. Neuromorphic Advantages Advantages • Sparsification over time → Less communication • Less communication → Fewer memory lookups • Sum instead of multiply → Cheaper computation [1] Jeehyun Kwak and Hyun Jae Jang, Neural Computation Lab (NCL), Korea Univ. [1]
  • 58. Neuromorphic Advantages Neuromorphic Processing Unit [1] Eliasmith and Suma, The Neuromorphic Advantage, Applied Brain Research (ABR) [1] Intel Loihi Chip
  • 59. Neuromorphic Advantages [1] Eliasmith and Suma, The Neuromorphic Advantage, Applied Brain Research (ABR) [1]
  • 60. Neuromorphic Advantages [1] Eliasmith et al., 2016, arXiv [2] Jang, H. J. et al., 2020, Science Advances [1] [2] 3D neuron model
  • 61. Computational Neuroscience [1] Trappenberg, T. P., 2009, Fundamentals of computational neuroscience, OUP Oxford [1]
  • 62. Thank you :-) Conversations about SNNs are always welcome. bananaband657@gmail.com