NeuroComputing CENG569, Fall 2012-2013, 1st Lecture

Presentation Transcript


  1. NeuroComputing CENG569, Fall 2012-2013, 1st Lecture Instructor: Erol Şahin

  2. Overview • The neuron

  3. A model of neural networks • x : activity • τ : time constant • f() : activation function • b : external input • W : synaptic strength
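
  The transcript lists the symbols but omits the equation shown on the slide. The standard firing-rate dynamics that this variable list describes (an assumption here, since the slide image is not reproduced) is

      τ dx_i/dt = -x_i + f( Σ_j W_ij x_j + b_i )

  i.e., each neuron's activity decays with time constant τ while being driven toward the activation function applied to its total synaptic input.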

  4. Spikes vs. rates • “Quantum” neurodynamics • neural activity is quantized • comes in packets called “spikes” • “Classical” neurodynamics • neural activity is continuously graded • described by rates

  5. A model neuron • Each synapse is a current source controlled by presynaptic spiking. • The dendrite adds the currents of multiple synapses. • The total current drives spiking in the axon.

  6. A synapse as a current source • Each action potential causes a synapse to inject a stereotyped current into the postsynaptic neuron. • Let the “synaptic strength” W denote the total charge injected by the synapse due to a single presynaptic spike. • Decaying exponential model
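
  A minimal sketch of the decaying-exponential model, in Python (the function and parameter names are illustrative, not from the slides): a spike arriving at t = 0 injects the current I(t) = (W/τ)·exp(-t/τ), whose amplitude is chosen so that the total injected charge equals the synaptic strength W.

      import numpy as np

      def synaptic_current(t, W=1.0, tau=0.010):
          """Current injected by one presynaptic spike arriving at t = 0.

          The amplitude W/tau makes the integral of the current
          (the total injected charge) equal to W.
          """
          return np.where(t >= 0.0, (W / tau) * np.exp(-t / tau), 0.0)

      t = np.linspace(0.0, 0.05, 501)     # 50 ms at 0.1 ms resolution
      I = synaptic_current(t)
      print(I.sum() * (t[1] - t[0]))      # ~1.0 = W (up to the 50 ms cutoff)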

  7. Temporal summation • Approximation: currents from successive spikes add linearly.

  8. Currents of divergent synapses share the same time course.

  9. Normalized current • Assumption: all synapses have the same time constant τ

  10. “Quantum” vs. “classical”

  11. Leaky integrator model • If there is a spike, the normalized current jumps up • Otherwise, exponential decay with time constant τ • Normalized current is a low-pass filtered version of the presynaptic spike train.

  12. Firing rate approximation • Normalized current is a low-pass filtered version of the firing rate. • Accurate if rate is large compared to 1/τ
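
  Slides 11-12 can be illustrated with a short simulation. In the sketch below (all names and parameter values are mine, and the jump of 1/τ per spike is my reading of "normalized current", chosen so each spike contributes unit area), low-pass filtering a Poisson spike train tracks low-pass filtering of the underlying rate whenever the rate is large compared to 1/τ:

      import numpy as np

      rng = np.random.default_rng(0)
      dt, tau, T = 0.001, 0.020, 2.0       # 1 ms steps, 20 ms time constant
      rate = 80.0                          # Hz, well above 1/tau = 50 Hz
      x_spk = x_rate = 0.0
      spk_trace, rate_trace = [], []
      for _ in range(int(T / dt)):
          spike = rng.random() < rate * dt           # Poisson spiking, bin by bin
          x_spk += -x_spk * dt / tau + spike / tau   # decay, plus a 1/tau jump per spike
          x_rate += (rate - x_rate) * dt / tau       # low-pass filter of the rate itself
          spk_trace.append(x_spk)
          rate_trace.append(x_rate)

      # After the initial transient, both traces hover around 80 Hz:
      print(np.mean(spk_trace[500:]), np.mean(rate_trace[500:]))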

  13. Summation of currents from convergent synapses • Approximation: linear superposition of currents from different synapses onto the same postsynaptic neuron.

  14. Spike generation • Regard the soma as a device that transduces current into rate of action potential firing. • The detailed dynamics of action potential generation (voltage, etc.) is implicit.

  15. Somatic current injection • Simulation of synaptic current

  16. Repetitive spiking • Pyramidal neurons in rat sensorimotor cortex • Schwindt, O’Brien, and Crill. J. Neurophysiol. 77:2484 (1997)

  17. F-I curve Schwindt, O’Brien, and Crill. J. Neurophysiol. 77:2484 (1997)

  18. Putting it all together

  19. Biophysical interpretation • f is firing rate as a function of current • x is normalized synaptic current (measured in Hz) • τ is the synaptic time constant • W is the charge per presynaptic spike
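
  Putting it together in code: a minimal sketch that assumes the standard rate equation τ dx/dt = -x + f(Wx + b) from above, with a rectifying f. The network and all parameter values are a toy example, not taken from the lecture.

      import numpy as np

      def f(current):
          """Firing rate as a function of current (half-wave rectification)."""
          return np.maximum(current, 0.0)

      def simulate(W, b, x0, tau=0.020, dt=0.001, steps=2000):
          """Euler integration of  tau * dx/dt = -x + f(W @ x + b)."""
          x = np.array(x0, dtype=float)
          for _ in range(steps):
              x += (dt / tau) * (-x + f(W @ x + b))
          return x

      # Two mutually inhibiting neurons with unequal external drive:
      W = np.array([[ 0.0, -0.5],
                    [-0.5,  0.0]])
      b = np.array([20.0, 10.0])
      print(simulate(W, b, x0=[0.0, 0.0]))   # converges near (20, 0)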

  20. Activation functions

  21. (Half-wave) rectification • Threshold for firing. • Rises smoothly from zero. • No saturation. • Popular for modeling the visual system. • “Semilinear”: linear concepts applicable.

  22. Step function • Jumps to nonzero value at threshold. • No increase in rate thereafter. • Computation with binary variables. • Popular with computer scientists.

  23. Sigmoidal function • Compromise • Binary for large inputs. • Linear for small inputs. • “S-shaped” • Logistic function • 0 to 1 • Hyperbolic tangent • -1 to 1
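
  The activation functions from slides 21-23, as a sketch (function names are mine):

      import numpy as np

      def rectify(u):
          """Half-wave rectification: zero below threshold, linear above, no saturation."""
          return np.maximum(u, 0.0)

      def step(u, theta=0.0):
          """Step function: jumps from 0 to 1 at the threshold and stays there."""
          return (u >= theta).astype(float)

      def logistic(u):
          """Sigmoid saturating at 0 and 1; roughly linear near u = 0."""
          return 1.0 / (1.0 + np.exp(-u))

      u = np.linspace(-2.0, 2.0, 5)
      for g in (rectify, step, logistic, np.tanh):   # np.tanh saturates at -1 and 1
          print(g.__name__, g(u))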

  24. Rates vs. spikes • When can the quantization of light be neglected? • “lots of photons” and little synchrony • When can the quantization of neural activity be neglected? • “lots of spikes” and little synchrony

  25. Deficiencies of the rate model • Soma • Spike frequency adaptation • Dendrite • Passive nonlinearities • Active nonlinearities • Short-term synaptic plasticity

  26. Spike frequency adaptation Schwindt, O’Brien, and Crill. J. Neurophysiol. 77:2484 (1997)

  27. Short term depression Tsodyks & Markram, PNAS 94:719 (1997)

  28. Linear threshold neurons

  29. Significance of the LT neuron • Model of neuron as a decision maker • weigh the evidence • compare with a threshold • Used alone: popular pattern classifier. • Used as a building block: • complex networks • e.g., multilayer perceptrons.
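
  The LT neuron itself is one line of code: weigh the evidence with w, compare against a threshold θ. A sketch (the convention that the output is 1 when w·x equals θ exactly is my assumption):

      import numpy as np

      def lt_neuron(x, w, theta):
          """Linear threshold neuron: output 1 iff w . x >= theta, else 0."""
          return int(np.dot(w, x) >= theta)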

  30. Boolean functions • The output variable and N input variables take on binary values • 1 = true • 0 = false • What Boolean functions can be realized by an LT neuron?

  31. Excitatory input • Suppose that w_i = 1 for all i • It doesn’t matter which inputs are active • only how many are active.

  32. Logical AND • “conjunction”

  33. Logical OR • “disjunction”

  34. Selectivity is controlled by threshold • AND • highly selective • high threshold • OR • Indiscriminate • low threshold
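
  With all weights equal to 1, the same neuron computes AND or OR depending only on the threshold, as the slide says. A quick exhaustive check (lt_neuron as sketched above):

      import numpy as np
      from itertools import product

      def lt_neuron(x, w, theta):
          return int(np.dot(w, x) >= theta)

      N = 3
      w = np.ones(N)
      for x in product([0, 1], repeat=N):
          x = np.array(x)
          print(x,
                "AND:", lt_neuron(x, w, theta=N),   # high threshold: all N must be active
                "OR:",  lt_neuron(x, w, theta=1))   # low threshold: any one suffices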

  35. Inhibition as a dynamic threshold • Suppose that w_i = ±1 for all i • Then what matters is the number of active excitatory inputs minus the number of active inhibitory inputs

  36. Inhibition as negation • Non-monotone conjunction

  37. Claim: • Conjunctions and disjunctions of N variables or their negations can be realized by an LT neuron.
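
  The claim can be verified exhaustively for a small case. The construction below (a standard one, not spelled out in the transcript) realizes the conjunction x1 ∧ ¬x2 ∧ x3, which is true only for the input (1, 0, 1): weight +1 per positive literal, -1 per negated literal, threshold equal to the number of positive literals.

      import numpy as np
      from itertools import product

      def lt_neuron(x, w, theta):
          return int(np.dot(w, x) >= theta)

      w, theta = np.array([1, -1, 1]), 2   # two positive literals -> theta = 2
      for x in product([0, 1], repeat=3):
          assert lt_neuron(np.array(x), w, theta) == int(x == (1, 0, 1))
      print("x1 AND (NOT x2) AND x3 realized on all 8 inputs")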

  38. Vector notation • weight vector w = (w_1, w_2, …, w_N) • input vector x = (x_1, x_2, …, x_N) • threshold θ • inner/scalar product w·x = Σ_i w_i x_i

  39. Separating hyperplane • w and θ define a hyperplane w·x = θ that divides the input space into half-spaces • This hyperplane is sometimes called the “decision boundary.”

  40. Linear separability

  41. Boolean functions

  42. The LT neuron cannot represent XOR: its true points (0,1) and (1,0) and its false points (0,0) and (1,1) cannot be separated by any single line in the plane, so no choice of w and θ works.

  43. Multi-Layered networks (MLPs) • Two layers of LT neurons • (three layers if input neurons are included) • Two layers of synapses. • Feed-forward • No loops

  44. Two questions about representational power • What can an MLP compute? • any Boolean function • What can an MLP compute efficiently? • a vanishingly small fraction of Boolean functions

  45. Every Boolean function corresponds to a truth table

  46. Some questions • How large is the set of Boolean functions of N variables? • How large is the set of Boolean functions realized by an LT neuron?

  47. Some questions • How large is the set of Boolean functions of N variables? 2^(2^N), one per truth table • How large is the set of Boolean functions realized by an LT neuron? far fewer, on the order of 2^(N^2) • So almost all Boolean functions cannot be computed by an LT neuron • But… any Boolean function can be computed by a perceptron with two layers of synapses.

  48. Disjunctive normal form (DNF) • Any Boolean function can be written in disjunctive normal form. • Disjunction of conjunctions

  49. DNF construction

  50. Any DNF can be written as a 2-layer perceptron • AND of variables or their negations • LT neuron with excitatory and inhibitory synapses. • OR • LT neuron with excitatory synapses and low threshold.
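
  As a concrete instance of this construction, XOR, which defeated the single LT neuron on slide 42, falls to two layers. In DNF, XOR(x1, x2) = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2); the hidden layer realizes the two conjunctions, the output layer the disjunction. A sketch (all names and the weight convention are mine):

      import numpy as np
      from itertools import product

      def lt_layer(x, W, theta):
          """A layer of LT neurons: output_i = 1 iff (W @ x)_i >= theta_i."""
          return (W @ x >= theta).astype(int)

      # Hidden layer: one AND neuron per conjunction, with +1 per positive
      # literal, -1 per negated literal, threshold = number of positive literals.
      W_hid = np.array([[ 1, -1],    # x1 AND NOT x2
                        [-1,  1]])   # NOT x1 AND x2
      th_hid = np.array([1, 1])
      # Output layer: OR of the hidden units (excitatory weights, low threshold).
      W_out = np.array([[1, 1]])
      th_out = np.array([1])

      for x in product([0, 1], repeat=2):
          h = lt_layer(np.array(x), W_hid, th_hid)
          y = lt_layer(h, W_out, th_out)[0]
          print(x, "->", y)   # prints the XOR truth table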
