NOISE and DELAYS in NEUROPHYSICS Andre Longtin Center for Neural Dynamics and Computation Department of Physics Department of Cellular and Molecular Medicine UNIVERSITY OF OTTAWA, Canada
OUTLINE • Modeling single-neuron noise: leaky integrate-and-fire, quadratic integrate-and-fire, “transfer function” approach • Modeling response to signals • Information theory • Delayed dynamics
“Noise” in the neuroscience literature • As “internal”, resulting from the probabilistic gating of voltage-dependent ion channels • As “synaptic”, resulting from the stochastic nature of vesicle release at the synaptic cleft • As “cross-talk”, responses from indirectly stimulated neurons • As the maintained discharge of some neurons • As an input with many frequency components over a particular band, of similar amplitudes, and scattered phases • As the resulting current from the integration of many independent excitatory and inhibitory synaptic events at the soma Segundo et al., Origins and Self Organization, 1994
Leaky Integrate-and-fire with + and - Feedback f = firing rate function
Firing Rate Functions Noise free: Or stochastic:
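As a concrete sketch of the two cases above, the following simulates a leaky integrate-and-fire neuron with additive Gaussian white noise and compares its firing rate to the noise-free suprathreshold formula f = 1/(τ ln(μτ/(μτ − v_th))). All parameter values here are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
tau, mu, sigma = 1.0, 1.5, 0.1      # membrane time constant, drive, noise (assumed)
v_th, dt, t_max = 1.0, 1e-3, 500.0

# Noise-free suprathreshold LIF firing rate.
f_noise_free = 1.0 / (tau * np.log(mu * tau / (mu * tau - v_th)))

n_steps = int(t_max / dt)
noise = sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
v, n_spikes = 0.0, 0
for i in range(n_steps):
    v += (-v / tau + mu) * dt + noise[i]   # Euler-Maruyama step
    if v >= v_th:                          # threshold crossing: spike and reset
        n_spikes += 1
        v = 0.0
f_stochastic = n_spikes / t_max
```

With weak noise the suprathreshold rate stays close to the deterministic value, here 1/ln 3 ≈ 0.91.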
Noise induced Stochastic Gain Control Resonance
For Poisson input (Campbell’s theorem): mean conductance ~ mean input rate; standard deviation σ ~ sqrt(mean rate)
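A quick numerical check of this scaling, using a toy exponential-synapse conductance driven by Poisson arrivals (the kernel amplitude, synaptic time constant, and rates are all assumed for illustration): quadrupling the input rate should quadruple the mean conductance but only double its standard deviation.

```python
import numpy as np

def poisson_conductance(rate, a=0.1, tau_s=0.005, dt=1e-4, t_max=20.0, seed=1):
    """Conductance g(t) = sum of a*exp(-(t - t_i)/tau_s) over Poisson times t_i."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    decay = np.exp(-dt / tau_s)
    arrivals = rng.poisson(rate * dt, n)   # synaptic events per time step
    g = np.empty(n)
    x = rate * a * tau_s                   # start near the mean to skip the transient
    for i in range(n):
        x = x * decay + a * arrivals[i]    # exact exponential decay plus jumps
        g[i] = x
    return g

g1 = poisson_conductance(rate=500.0, seed=1)
g4 = poisson_conductance(rate=2000.0, seed=2)
ratio_mean = g4.mean() / g1.mean()   # Campbell: should be ~4
ratio_std = g4.std() / g1.std()      # Campbell: should be ~sqrt(4) = 2
```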
WHICH QUADRATIC INTEGRATE-AND-FIRE MODEL? • Technically more difficult • Which variable to use? On the real line? On a circle?
Information-theoretic approaches • Linear encoding versus nonlinear processing • Rate code: long time constant, integrator • Time code: small time constant, coincidence detector (reliability) • Interspike interval code (ISI reconstruction) • Linear correlation coefficient • Coherence • Coding fraction • Mutual information • Challenge: biophysics of coding • Forget the biophysics? Use better (mesoscopic?) variables?
Neuroscience 101 (continued): Interspike intervals (ISI), the spike train, and the number of spikes N(T) in a time interval T are random variables. Raster plot.
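In code, the two random variables on this slide can be computed directly from a list of spike times (the times below are made up for illustration):

```python
import numpy as np

# Example spike times, in seconds (illustrative, not real data).
spike_times = np.array([0.8, 2.1, 2.9, 4.5, 5.0, 6.6, 8.2, 9.1])

isis = np.diff(spike_times)                   # interspike intervals ISI_k
T = 2.0
edges = np.arange(0.0, 10.0 + T, T)           # counting-window boundaries
counts, _ = np.histogram(spike_times, edges)  # spike count N(T) per window

mean_isi = isis.mean()
mean_count = counts.mean()
```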
Information-theoretic calculations: Gaussian noise stimulus S → neuron → spike train X. Coherence function; mutual information rate.
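A sketch of the coherence-based lower bound on the mutual information rate, I ≥ −∫ log2(1 − C(f)) df. The “neuron” below is just a linear gain plus independent Gaussian noise standing in for the spike train (an assumption), so the true coherence is flat at g²/(g² + 1) = 0.2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2**16
s = rng.standard_normal(n)                 # white Gaussian stimulus
x = 0.5 * s + rng.standard_normal(n)       # toy response: gain 0.5 + unit noise

def spectra(a, b, nseg=64):
    """Segment-averaged auto- and cross-spectra (Welch-style)."""
    m = len(a) // nseg
    Saa = np.zeros(m // 2 + 1)
    Sbb = np.zeros(m // 2 + 1)
    Sab = np.zeros(m // 2 + 1, dtype=complex)
    for k in range(nseg):
        A = np.fft.rfft(a[k*m:(k+1)*m])
        B = np.fft.rfft(b[k*m:(k+1)*m])
        Saa += np.abs(A)**2
        Sbb += np.abs(B)**2
        Sab += np.conj(A) * B
    return Saa, Sbb, Sab

Sss, Sxx, Ssx = spectra(s, x)
coherence = np.abs(Ssx)**2 / (Sss * Sxx)          # C(f) in [0, 1)
info_per_sample = -np.mean(np.log2(1.0 - coherence))  # bits/sample, lower bound
```

For this toy model the bound evaluates to about −log2(0.8) ≈ 0.32 bits per sample.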
Stimulus Protocol: Study the effect of stimulus contrast and cutoff frequency fc (stimulus bandwidth) on coding.
Linear response calculation for the Fourier transform of the spike train: unperturbed spike train plus susceptibility times signal. In spectra: spike-train spectrum = background spectrum + |transfer function|² × signal spectrum.
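A numerical check of this spectral decomposition on a toy linear-response model (white background spectrum and a one-pole susceptibility, both assumptions): the measured output spectrum should match background + |χ|² × signal spectrum, with cross terms averaging away over segments.

```python
import numpy as np

rng = np.random.default_rng(2)
m, nseg = 1024, 256
n = m * nseg
s = rng.standard_normal(n).reshape(nseg, m)          # signal segments
x0 = 0.7 * rng.standard_normal(n).reshape(nseg, m)   # background (unperturbed)

f = np.fft.rfftfreq(m)
chi = 1.0 / (1.0 + 1j * f / 0.05)    # assumed one-pole susceptibility

S_fft = np.fft.rfft(s, axis=1)
X0_fft = np.fft.rfft(x0, axis=1)
X_fft = X0_fft + chi * S_fft         # response = background + chi * signal

Ss = np.mean(np.abs(S_fft)**2, axis=0) / m
S0 = np.mean(np.abs(X0_fft)**2, axis=0) / m
Sx = np.mean(np.abs(X_fft)**2, axis=0) / m

predicted = S0 + np.abs(chi)**2 * Ss
rel_err = np.mean(np.abs(Sx - predicted)) / predicted.mean()
```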
Wiener-Khinchin theorem: the power spectrum S(f) and the autocorrelation C(τ) are a Fourier-transform pair. Integral of S over all frequencies = C(0) = signal variance; integral of C over all time lags = S(0) = signal intensity at zero frequency.
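The discrete version of these identities can be verified directly: the periodogram is the DFT of the circular autocorrelation, and summing the spectrum over frequencies recovers C(0), the variance.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4096
x = rng.standard_normal(n)
x -= x.mean()                              # zero-mean signal

X = np.fft.fft(x)
S = np.abs(X)**2 / n                       # periodogram (power spectrum)
C = np.fft.ifft(np.abs(X)**2).real / n     # circular autocorrelation

var_from_spectrum = S.sum() / n            # (1/n) * sum of S over frequencies
var_direct = C[0]                          # C(0) = mean square of x
```

Both quantities equal the signal variance, so they agree to machine precision.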
Signal Detection Theory: ROC curve:
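A minimal ROC construction for the equal-variance Gaussian detection problem (d′ = 1.5 is an assumed sensitivity): sweep a threshold, record hit and false-alarm rates, and integrate to get the area under the curve, which should approach Φ(d′/√2) ≈ 0.856.

```python
import numpy as np

rng = np.random.default_rng(4)
d_prime = 1.5
noise = rng.standard_normal(20000)              # "noise only" samples
signal = d_prime + rng.standard_normal(20000)   # "signal + noise" samples

thresholds = np.linspace(-4.0, 6.0, 201)
p_fa = np.array([(noise > th).mean() for th in thresholds])   # false-alarm rate
p_d = np.array([(signal > th).mean() for th in thresholds])   # hit rate

# Trapezoidal area under the ROC curve (p_fa decreases as the threshold rises).
auc = np.sum(0.5 * (p_d[:-1] + p_d[1:]) * (p_fa[:-1] - p_fa[1:]))
```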
Information Theory: actual signal vs. reconstructed signal. The stimulus can be well characterized (electric field), which allows for detailed signal-processing analysis. Gabbiani et al., Nature (1996) 384:564-567. Bastian et al., J. Neurosci. (2002) 22:4577-4590. Krahe et al., J. Neurosci. (2002) 22:2374-2382.
Linear Stimulus Reconstruction • Estimate the filter which, when convolved with the spike train, yields an estimated stimulus “closest” to the real stimulus Spike train (zero mean) Estimated stimulus Mean square error Optimal Wiener filter
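A sketch of the reconstruction, with the Wiener filter estimated in the frequency domain as H(f) = S_xs(f)/S_xx(f). The response here is a toy gain-plus-noise surrogate rather than a real spike train (an assumption), chosen so the exact coding fraction is 1 − MSE/var(s) = 0.64.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2**15
s = rng.standard_normal(n)                   # zero-mean stimulus
x = 0.8 * s + 0.6 * rng.standard_normal(n)   # toy response (assumed model)

X = np.fft.rfft(x)
S = np.fft.rfft(s)
Sxx = np.abs(X)**2          # response auto-spectrum (unsmoothed)
Sxs = np.conj(X) * S        # response-stimulus cross-spectrum

def smooth(a, w=65):
    # Frequency smoothing to stabilise the spectral estimates.
    return np.convolve(a, np.ones(w) / w, mode='same')

H = smooth(Sxs) / smooth(Sxx)     # optimal Wiener filter H = S_xs / S_xx
s_est = np.fft.irfft(H * X, n)    # reconstructed stimulus

mse = np.mean((s - s_est)**2)
coding_fraction = 1.0 - mse / s.var()
```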
Coding fraction versus noise intensity: “stochastic resonance above threshold”
Modeling Electroreceptors: The Nelson Model (1996) High-Pass Filter Stochastic Spike Generator Input Spike generator assigns 0 or 1 spike per EOD cycle: multimodal histograms
Modeling Electroreceptors: The Extended LIFDT Model High-Pass Filter Input LIFDT Spike Train Parameters: without noise, receptor fires periodically (suprathreshold dynamics – no stochastic resonance)
Regularisation: Fano factor F(T) = Var[N(T)] / E[N(T)]; asymptotic limit as T → ∞ (Cox and Lewis, 1966)
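A numerical illustration of the Cox-and-Lewis asymptotic limit F(T) → CV² for renewal processes, comparing a Poisson train (CV = 1) with a regularised gamma-order-4 train (CV² = 1/4). Rates and window sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def renewal_spikes(order, rate, n_spikes=200000):
    # Gamma-distributed ISIs with mean 1/rate; order=1 gives a Poisson process.
    isis = rng.gamma(order, 1.0 / (order * rate), n_spikes)
    return np.cumsum(isis)

def fano(spikes, T):
    # Fano factor of the spike counts in windows of length T.
    edges = np.arange(0.0, spikes[-1], T)
    counts, _ = np.histogram(spikes, edges)
    return counts.var() / counts.mean()

f_poisson = fano(renewal_spikes(order=1, rate=100.0), T=1.0)   # ~ 1
f_regular = fano(renewal_spikes(order=4, rate=100.0), T=1.0)   # ~ 1/4
```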
Pathway: sensory input → sensory neurons → ELL pyramidal cell → higher brain
Higher-brain feedback: open- vs. closed-loop architecture; higher-brain loop time td
The ELL: first stage of sensory processing. Higher brain areas, afferent input, delayed feedback neural networks
Andre’s data Jelte Bos’ data Longtin et al., Phys. Rev. A 41, 6992 (1990)
If one defines the drift and diffusion coefficients appropriately, one obtains a Fokker-Planck equation corresponding to the stochastic differential equation.
One can apply Itô or Stratonovich calculus, as for SDEs. However, applicability is limited if there are complex eigenvalues or the system is strongly nonlinear
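As a concrete example of an SDE whose stationary density follows from the Fokker-Planck equation, here is Euler-Maruyama integration of an assumed bistable system dx = (x − x³) dt + σ dW (not necessarily the talk's system). Its stationary density ∝ exp[(2/σ²)(x²/2 − x⁴/4)] peaks near x = ±1, and for this noise strength the trajectory should visit both wells.

```python
import numpy as np

rng = np.random.default_rng(7)
sigma, dt, n = 0.6, 1e-3, 500000     # noise strength and step size (assumed)
noise = sigma * np.sqrt(dt) * rng.standard_normal(n)

xs = np.empty(n)
x = 1.0                              # start in the right-hand well
for i in range(n):
    x += (x - x**3) * dt + noise[i]  # Euler-Maruyama step
    xs[i] = x
```

With σ = 0.6 the Kramers escape rate is roughly 0.06 per unit time, so a run of length 500 gives a few dozen well-to-well transitions.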
TWO-STATE DESCRIPTION: S = ±1, with two transition probabilities; for example, obtained using the Kramers escape-rate approach.
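The two-state reduction can be simulated as a telegraph process with the two transition probabilities per time step. The Kramers-style rates below are illustrative assumptions; the stationary occupancy should satisfy P(S = +1) = r21/(r12 + r21).

```python
import numpy as np

rng = np.random.default_rng(8)
r12, r21 = 0.2, 0.1      # escape rates out of S=+1 and S=-1 (assumed values)
dt, n = 1e-2, 500000
u = rng.random(n)

s = 1
states = np.empty(n, dtype=int)
for i in range(n):
    if s == 1 and u[i] < r12 * dt:      # jump +1 -> -1
        s = -1
    elif s == -1 and u[i] < r21 * dt:   # jump -1 -> +1
        s = 1
    states[i] = s

p_plus = (states == 1).mean()   # theory: r21/(r12+r21) = 1/3
```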
DETERMINISTIC DELAYED BISTABILITY The stochastic approach does not yet capture the whole picture!
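A sketch of deterministic delayed bistability, using an assumed delayed-feedback model dx/dt = −x(t) + k·tanh(x(t − τ)) rather than the talk's exact system: for k > 1 there are two stable fixed points x* = k·tanh(x*), and the constant initial history on [−τ, 0] selects which one the trajectory reaches.

```python
import numpy as np

def integrate_dde(history_value, k=2.0, tau=5.0, dt=1e-2, t_max=100.0):
    """Euler integration of dx/dt = -x(t) + k*tanh(x(t - tau))."""
    n_delay = int(tau / dt)
    n = int(t_max / dt)
    x = np.empty(n_delay + n)
    x[:n_delay] = history_value          # constant history on [-tau, 0]
    for i in range(n_delay, n_delay + n):
        x[i] = x[i-1] + dt * (-x[i-1] + k * np.tanh(x[i - n_delay]))
    return x[-1]

x_up = integrate_dde(0.1)    # small positive history -> positive fixed point
x_dn = integrate_dde(-0.1)   # small negative history -> negative fixed point
```

For k = 2 the fixed points are x* ≈ ±1.915, so the two histories end up in opposite wells with no noise at all.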
Conclusions • NOISE: many sources, many approaches; exercise caution (Itô vs. Stratonovich) • INFORMATION THEORY: usually makes assumptions, and even when it doesn’t, ask whether the next cell cares • DELAYS: SDDEs have no Fokker-Planck equivalent; tomorrow: a linear-response-like theory
OUTLOOK • Second-order field theory for stochastic neural dynamics with delays • Figuring out how intrinsic neuron dynamics (bursting, coincidence detection, etc.) interact with correlated input • Figuring out the interaction of noise and bursting • Forget about steady state! • Whatever you do, think of the neural decoder…