Schedule for: 17w5036 - Brain Dynamics and Statistics: Simulation versus Data
Beginning on Sunday, February 26 and ending Friday March 3, 2017
All times in Banff, Alberta time, MST (UTC-7).
Sunday, February 26 | |
---|---|
16:00 - 17:30 | Check-in begins at 16:00 on Sunday and is open 24 hours (Front Desk - Professional Development Centre) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
20:00 - 22:00 | Informal gathering (Corbett Hall Lounge (CH 2110)) |
Monday, February 27 | |
---|---|
07:00 - 08:45 |
Breakfast ↓ Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
08:45 - 09:00 | Introduction and Welcome by BIRS Station Manager (TCPL 201) |
09:00 - 10:00 |
Peter Thomas: Noise in the Brain: Statistical and Dynamical Perspectives ↓ There is growing interest in applying statistical estimation methods to
dynamical systems arising in neuroscience. The discipline of statistics
provides an intellectual framework for quantifying and managing uncertainty. For
statistical methods to apply, one must consider a system with some variability.
Depending on where one locates the source of variability, different methods
suggest themselves. The talk will discuss some challenges and opportunities in
linking statistical and mathematical perspectives in theoretical neuroscience,
including the problem of quantifying "phase resetting" in stochastic
oscillators, the problem of identifying the most significant sources of noise in
a finite state Markov model, the problem of inferring control mechanisms in
brain-body motor control systems, and the problem of parameter identification in
noisy conductance-based models. (TCPL 201) |
10:00 - 10:45 | Coffee Break (TCPL Foyer) |
10:45 - 11:45 |
Adeline Samson: Hypoelliptic stochastic FitzHugh-Nagumo neuronal model: mixing, up-crossing and estimation of the spike rate ↓ Joint work with J.R. Leon (Universidad Central de Venezuela).
In this presentation, we will introduce the hypoelliptic stochastic
neuronal model, as a diffusion approximation of intra-cellular or
extra-cellular models. Then we will specifically focus on the
FitzHugh-Nagumo model, and detail some of its properties: stationary
distribution, definition of spikes, estimation of spiking rate. (TCPL 201) |
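As an illustration of the class of models discussed in this talk, the following is a minimal sketch (my own, not the speakers' implementation) of an Euler-Maruyama simulation of a hypoelliptic stochastic FitzHugh-Nagumo system, in which noise enters only the recovery variable; all parameter values are illustrative.

```python
import numpy as np

def simulate_fhn(T=200.0, dt=1e-3, eps=0.1, gamma=1.5, beta=0.3, sigma=0.4, seed=0):
    """Euler-Maruyama for a hypoelliptic stochastic FitzHugh-Nagumo model:
        dX = (1/eps) * (X - X**3 - Y) dt            (voltage: no direct noise)
        dY = (gamma * X - Y + beta) dt + sigma dW   (recovery: noise-driven)
    Noise enters only the second coordinate, hence 'hypoelliptic'."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty(n); y = np.empty(n)
    x[0], y[0] = 0.0, 0.0
    for k in range(n - 1):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[k + 1] = x[k] + dt * (x[k] - x[k] ** 3 - y[k]) / eps
        y[k + 1] = y[k] + dt * (gamma * x[k] - y[k] + beta) + sigma * dw
    return x, y

def count_spikes(x, threshold=1.0):
    """Count up-crossings of a voltage threshold as a crude spike count."""
    above = x > threshold
    return int(np.sum(~above[:-1] & above[1:]))
```

Counting threshold up-crossings of the simulated voltage path, as in `count_spikes`, is one simple way to define spikes and hence to estimate a spiking rate from the model.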
11:45 - 13:00 | Lunch (Vistas Dining Room) |
13:00 - 14:00 |
Guided Tour of The Banff Centre ↓ Meet in the Corbett Hall Lounge for a guided tour of The Banff Centre campus. (Corbett Hall Lounge (CH 2110)) |
14:00 - 14:20 |
Group Photo ↓ Meet in foyer of TCPL to participate in the BIRS group photo. The photograph will be taken outdoors, so dress appropriately for the weather. Please don't be late, or you might not be in the official group photo! (TCPL Foyer) |
14:20 - 14:50 |
Eva Löcherbach: Memory and hypoellipticity in neuronal models ↓ I will discuss the effect of memory in two classical models of neuroscience, first the stochastic Hodgkin-Huxley model, considered in a series of papers by R. Höpfner, M. Thieullen and myself, and second, a model of interacting Hawkes processes in high dimension which has been studied recently by S. Ditlevsen and myself. I will explain why memory leads to hypoellipticity and then discuss the probabilistic and statistical consequences of this fact. (TCPL 201) |
14:50 - 15:30 | Coffee Break (TCPL Foyer) |
15:30 - 16:00 |
Antoni Guillamon: At the crossroad between invariant manifolds and the role of noise ↓ I will drift along several topics in which we are currently involved. They all contain some treatment of recent tools from dynamical systems, mainly related to invariant manifolds, intertwined with the role of stochasticity. We aim to share our research interests to boost discussion and possible collaborations. The first example concerns bistable perception, where we are exploring to what extent noise is necessary to explain data obtained from psychophysical experiments, as opposed to the information provided by assuming quasi-periodic forcing. Second, we will focus on phase response curves (PRCs) for oscillating neurons, a well-known tool to study the effectiveness of information transmission between cells; we have extended the notion of PRC to that of phase response field/function (PRF), also valid away from purely periodic behaviour. Stochastic PRCs have already been studied, but PRFs constitute a more natural setting when neurons are subject to noisy inputs; we have not yet explored the impact of noise on PRFs, nor the extension of the concept to stochastic processes, and moreover we seek experimental tests of our theoretical findings. We therefore think it will be interesting to address these open questions with this audience. If time allows, we will comment on some other issues (estimation of conductances and short-term synaptic depression) that combine concepts from dynamical systems with the presence of noise. (TCPL 201) |
16:00 - 16:04 | Catalina Vich: Poster presentation: Different strategies to estimate synaptic conductances (TCPL 201) |
16:05 - 16:09 | Timothy Whalen: Poster presentation: Pallidostriatal Projections Promote Beta Oscillations in a Biophysical Model of the Parkinsonian Basal Ganglia (TCPL 201) |
16:10 - 16:14 | Jacob Østergaard: Poster presentation: Capturing spike variability in noisy Izhikevich neurons using point process GLMs (TCPL 201) |
16:15 - 16:19 | Peter Rowat: Poster presentation: Stochastic network thinking applied to firing patterns of stellate neurons (TCPL 201) |
16:20 - 16:24 | Wilhelm Braun: Poster presentation: Spike-triggered neuronal adaptation as an iterated first-passage time problem (TCPL 201) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
Tuesday, February 28 | |
---|---|
07:00 - 09:00 | Breakfast (Vistas Dining Room) |
09:00 - 10:00 |
Zachary Kilpatrick: Maintaining spatial working memory across time in stochastic bump attractor models ↓ We discuss various network mechanisms capable of making spatial working memory more robust to noise perturbation and error. The canonical example we begin with arises from classic oculomotor delayed response tasks, whereby a subject must maintain the memory of a location around a circle over a period of a few seconds. Asymptotic methods are used to reduce the dynamics of a bump attractor to a stochastic differential equation whose dynamics are governed by a potential that reflects spatial heterogeneity in the network connectivity. Heterogeneity can serve to reduce the degradation of memory over time, ultimately increasing the transfer of information forward in time. We also show that connectivity between multiple layers of a working memory network can further serve to stabilize memory, especially if they possess propagation delays. We conclude by discussing recent work, where we are modeling the phenomenon whereby a previous trial’s response “attracts” the current trial’s response, sometimes called repetition bias. (TCPL 201) |
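The reduction described in the abstract can be caricatured in a few lines. This is a hypothetical minimal sketch, not the speaker's model: the bump position theta diffuses on a ring in a periodic potential whose wells stand in for spatial heterogeneity in the connectivity; all parameters are illustrative.

```python
import numpy as np

def bump_position(T=10.0, dt=1e-3, n_wells=6, depth=0.2, sigma=0.3, seed=5):
    """Reduced bump-attractor dynamics: the bump position theta on a ring
    diffuses in a periodic potential U(theta) = -depth * cos(n_wells * theta)
    arising from heterogeneity in the connectivity:
        d theta = -U'(theta) dt + sigma dW  (Euler-Maruyama, wrapped to [0, 2pi))."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    theta = np.empty(n)
    theta[0] = 0.1
    for k in range(n - 1):
        drift = -depth * n_wells * np.sin(n_wells * theta[k])  # -U'(theta)
        theta[k + 1] = (theta[k] + dt * drift
                        + sigma * np.sqrt(dt) * rng.normal()) % (2 * np.pi)
    return theta
```

With `depth = 0`, theta performs pure diffusion and memory degrades linearly in time; deepening the wells pins the bump near discrete locations, the trade-off the talk discusses.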
10:00 - 10:45 | Coffee Break (TCPL Foyer) |
10:45 - 11:45 |
Mark McDonnell: What can we learn from deep-learning? Models and validation of neurobiological learning inspired by modern deep artificial neural networks ↓ In the field of machine learning, ‘deep-learning’ has become spectacularly successful very rapidly, and now frequently achieves better-than-human performance on difficult pattern recognition tasks. It seems that the decades-old theoretical potential of artificial neural networks (ANNs) is finally being realized. For computer vision problems, convolutional ANNs are used, and are often characterized as “biologically inspired.” This is due to the hierarchy of layers of nonlinear processing units and pooling stages, and learnt spatial filters resembling simple and complex cells.
However, this resemblance is superficial. An open challenge for computational neuroscience is to identify whether the spectacular success of deep-learning can offer insights for realistic models of neurobiological learning that are constrained by known anatomy and physiology. I will discuss this challenge and argue that we need to validate proposed neurobiological learning rules using challenging real data sets like those used in deep-learning, and ensure their learning capability is comparable to that of deep ANNs.
To illustrate this approach, in this talk I will show mathematically how a standard cost-function used for supervised training of ANNs can be decomposed into an unsupervised decorrelation stage and a supervised Hebbian-like stage. With this insight, I argue that this form of learning is feasible as a neurobiological learning mechanism in recurrently-connected layer 2/3 and layer 4 cortical neurons. I will further show that the model can learn to very effectively classify patterns (e.g. images of handwritten digits from the MNIST benchmark); error rates are comparable to state of the art deep-learning algorithms, i.e. less than 1%. (TCPL 201) |
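The decomposition mentioned in the abstract can be illustrated schematically: with a linear readout and a squared-error cost, the least-squares weights factor exactly into an unsupervised whitening (decorrelation) stage followed by a Hebbian-like correlation stage. The sketch below is my own illustration of that algebra, not the speaker's derivation or network model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_out, n_samples = 20, 5, 1000
X = rng.normal(size=(n_samples, n_in))       # inputs (rows = samples)
W_true = rng.normal(size=(n_in, n_out))
Y = X @ W_true                                # targets

# Stage 1 (unsupervised): decorrelate/whiten the inputs.
C = X.T @ X / n_samples                       # input correlation matrix
L = np.linalg.cholesky(C)                     # C = L @ L.T
C_inv_half = np.linalg.inv(L).T               # whitening transform
Z = X @ C_inv_half                            # whitened: Z.T @ Z / n ≈ I

# Stage 2 (supervised, Hebbian-like): correlate whitened inputs with targets.
W_hebb = Z.T @ Y / n_samples

# The composite weights reproduce the least-squares solution.
W = C_inv_half @ W_hebb
assert np.allclose(W, np.linalg.lstsq(X, Y, rcond=None)[0])
```

The point of the factorization is that stage 2 is a plain input-output correlation (Hebbian in form), feasible only because stage 1 has removed input correlations.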
11:45 - 13:30 | Lunch (Vistas Dining Room) |
13:35 - 13:39 | Mareile Grosse Ruse: Poster presentation: Modeling with Stochastic Differential Equations and Mixed Effects (TCPL 201) |
13:40 - 13:44 | Kang Li: Poster presentation: Mathematical neural models for visual attention (TCPL 201) |
13:45 - 13:49 | Pietro Quaglio: Poster presentation: SPADE: Spike Pattern Detection and Evaluation in Massively Parallel Spike Trains (TCPL 201) |
13:50 - 13:54 | Alexandre René: Poster presentation: Dimensionality reduction of stochastic differential equations with distributed delays (TCPL 201) |
13:55 - 13:59 | Roberto Fernández Galán: Poster presentation: Spherical harmonics reveal standing EEG waves and long-range neural synchronization in the sleeping brain (TCPL 201) |
14:00 - 14:30 |
Lawrence Ward: Pattern formation via stochastic neural field equations ↓ The formation of pattern in biological systems may be modeled by a set of reaction-diffusion equations. A form of diffusion-type coupling term biologically significant in neuroscience is a difference of Gaussian functions used as a space-convolution kernel. Here we study the simplest reaction-diffusion system with this type of coupling. Instead of the deterministic form of the model, we are interested in a *stochastic* neural field equation, a space-time stochastic differential-integral equation. We explore, quantitatively, how the model parameters measuring the shape of the coupling kernel, the coupling strength, and aspects of the spatially-smoothed space-time noise control the pattern in the resulting evolving random field. We find that a spatial pattern that is damped in time in a deterministic system may be sustained and amplified by stochasticity, most strikingly at an optimal space-time noise level. (TCPL 201) |
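A minimal numerical sketch of a stochastic neural field of this type (my own illustrative discretization, not the speaker's code): a 1D periodic field with difference-of-Gaussians coupling, a sigmoidal firing rate, and additive space-time noise, integrated by Euler-Maruyama with the space convolution done via FFT. All parameters are illustrative.

```python
import numpy as np

def dog_kernel(x, a_e=1.0, s_e=1.0, a_i=0.8, s_i=2.0):
    """Difference-of-Gaussians coupling kernel (excitation minus inhibition)."""
    return (a_e * np.exp(-x**2 / (2 * s_e**2))
            - a_i * np.exp(-x**2 / (2 * s_i**2)))

def simulate_field(L=40.0, n=256, T=20.0, dt=0.01, coupling=2.0, noise=0.05, seed=4):
    """Euler-Maruyama for a 1D stochastic neural field
        du = (-u + coupling * (w * f(u))) dt + noise dW(x, t),
    where * is space convolution with the DoG kernel w and f is a sigmoid."""
    rng = np.random.default_rng(seed)
    x = np.linspace(-L / 2, L / 2, n, endpoint=False)
    dx = L / n
    # FFT of the kernel, centered at index 0 for circular convolution.
    w_hat = np.fft.fft(np.fft.ifftshift(dog_kernel(x))) * dx
    f = lambda u: 1.0 / (1.0 + np.exp(-u))
    u = 0.01 * rng.normal(size=n)             # small random initial condition
    for _ in range(int(T / dt)):
        conv = np.real(np.fft.ifft(w_hat * np.fft.fft(f(u))))
        u += dt * (-u + coupling * conv) + noise * np.sqrt(dt) * rng.normal(size=n)
    return x, u
```

Varying `noise` in such a sketch is one way to probe the abstract's central claim: a pattern damped at `noise = 0` can be sustained at an intermediate noise level.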
14:30 - 15:15 | Coffee Break (TCPL Foyer) |
15:15 - 15:45 |
Laura Sacerdote: Integrate and Fire like models with stable distribution for the Interspike Intervals ↓ In 1964, Gerstein and Mandelbrot [1] proposed the Integrate and Fire model to account for the observed stable behavior of the Interspike Interval (ISI) distribution. Their study of histograms of ISIs revealed the stability property, and they suggested modeling the membrane potential through a Wiener process in order to obtain the inverse Gaussian as the first passage time (FPT) distribution, i.e. a stable distribution. Holden (1975) [2] observed that stable distributions determine a simple transmission pathway.
Later, many variants of the original model appeared with the aim of improving its realism, but meanwhile researchers lost sight of the original motivation for the model. The Leaky Integrate and Fire model, whose FPT distribution is not stable, is one example, as are many other variants of this model.
More recently, Persi et al. (2004) [3], studying synchronization patterns, proposed a time-inhomogeneous integrate and fire model accounting for heavy-tailed distributions. The existence of heavy tails, typical of stable distributions, is well recognized in the literature (see for example Tsubo et al. [4], Gal and Marom [5] and references cited therein).
Signals from different neurons are summed during processing. Different ISI distributions would produce an enormous variety of firing distributions as the information progresses through the network. However, it seems unrealistic to admit ISIs whose distribution is not reproduced under summation. This suggests developing Integrate and Fire models based on stable distributions. Furthermore, the stable-ISI paradigm gives rise to a more robust transmission algorithm, since a possible failure to detect some spikes from the surrounding neurons does not change the nature of the final distribution.
Here we rethink the problem, taking advantage of mathematical progress on Lévy processes [6]. We propose to start the model formulation from the main property, i.e. the stable nature of the ISI distribution. We follow the Integrate and Fire paradigm, but we model the membrane potential through a randomized random walk whose jumps are separated by intertimes with a stable distribution (or one in the domain of attraction of such a distribution). Observing that the supremum of the modeled membrane potential is the inverse of a stable subordinator allows us to determine the Laplace transform of the ISI distribution.
This is a preliminary contribution, since we limit ourselves to some aspects of the modelling proposal and make no attempt to fit data. We are aware that these are preliminary results and that further mathematical study will be necessary to obtain models that fit data. In particular, we expect an important role for tempered stable distributions [7].
[1] Gerstein G.L., Mandelbrot B. (1964) Random walk models for the activity of a single neuron. Biophys. J. 4: 41-68.
[2] Holden, A.V. (1975) A Note on Convolution and Stable Distributions in the Nervous System. Biol. Cybern. 20: 171-173.
[3] Persi E., Horn D., Volman V., Segev R. and Ben-Jacob E. (2004) Modeling of Synchronized Bursting Events: the importance of Inhomogeneity. Neural Computation 16: 2577-2595.
[4] Tsubo Y., Isomura Y. and Fukai T. (2012) Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons. PLOS Computational Biology 8, 4: e1002461.
[5] Gal A. and Marom S. (2013) Entrainment of the Intrinsic Dynamics of Single Isolated Neurons by Natural-Like Input. The Journal of Neuroscience 33(18): 7912-7918.
[6] Kyprianou, A. (2014) Fluctuations of Lévy Processes with Applications. Springer Verlag, Berlin/Heidelberg.
[7] Rosinski, J. (2007) Tempering stable processes. Stochastic Processes and their Applications 117, 6: 677-707. (TCPL 201) |
15:45 - 16:15 |
Jeremie Lefebvre: State-Dependent Control of Oscillatory Brain Dynamics ↓ Numerous studies have shown that periodic electrical stimulation can be used not only to interfere with the activity of isolated neurons, but also to engage population-scale synchrony and collective rhythms. These findings have raised the fascinating prospect of manipulating emergent brain oscillations in a controlled manner, engaging neural circuits at a functional level to boost information processing, manipulate cognition and treat neurobiological disorders (so-called “oscillopathies”). Capitalizing on this, it has been shown that brain stimulation can be tuned to alter perception and task performance. Rhythmic brain stimulation forms the basis of a control paradigm in which one can manipulate the intrinsic oscillatory properties of cortical networks via a plurality of input-driven mechanisms such as resonance, entrainment and non-linear acceleration. But the brain is not a passive receiver: outcomes of brain stimulation, whether intracranial or non-invasive, are highly sensitive to ongoing brain dynamics, interfering and combining with internal fluctuations in non-trivial ways. Exogenous control of brain dynamics has indeed been shown to be gated by neural excitability, where the effects of brain stimulation are both state-dependent and highly sensitive to stimulation parameters. To understand this phenomenon, we used a computational approach to study the role of the ongoing state on the entrainment of cortical neurons. We examined whether state-dependent changes in thalamo-cortical dynamics could implement a gain control mechanism regulating cortical susceptibility to stimulation. We found that the resulting increase in irregular fluctuations during task states enables a greater susceptibility of cortical neurons to entrainment, and that this phenomenon can be explained by a passage through a bifurcation combined with stochastic resonance.
We also investigated the relationship between the stimulation parameters, such as amplitude and frequency, on entrainment regimes for different levels of sensory input. Taken together, our results provide new insights about the state-dependent interaction between rhythmic stimulation and cortical activity, accelerating the development of new paradigms to interrogate neural circuits and restore cognitive functions based on the selective manipulation of brain rhythms. (TCPL 201) |
17:30 - 19:30 | Dinner (Vistas Dining Room) |
Wednesday, March 1 | |
---|---|
07:00 - 09:00 | Breakfast (Vistas Dining Room) |
08:30 - 09:30 |
Massimiliano Tamborrino: Novel manifestation of the noise-aided signal enhancement ↓ We discuss various types of noise-aided signal enhancement.
Since not all stimulus levels can be decoded with the same accuracy, it is of paramount interest to determine which stimulus intensities can be discriminated most precisely. It is well known that the presence of noise corrupts signal transmission in linear systems. Nevertheless, noise may have a positive effect on signal processing in nonlinear systems, as confirmed by the stochastic resonance phenomenon. Stochastic resonance is typically observed in systems with a threshold in the presence of a weak signal. However, the subthreshold regime is not a necessary condition when considering more than one neuron, since, for example, a suprathreshold signal may also be enhanced by noise in a network of threshold devices. Other phenomena where noise enhances the signal include coherence resonance and firing-rate resonance.
We present a study of the decoding accuracy of the stimulus level based on either first-spike latency coding or rate coding (from the exact spike-counting distribution) in a neuronal model as simple as the perfect integrate-and-fire model. We report counter-intuitive results, representing a novel manifestation of noise-aided signal enhancement that differs fundamentally from the kinds usually reported. (TCPL 201) |
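As a toy illustration of the simplest model mentioned in the abstract, the perfect integrate-and-fire neuron, here is a minimal Monte Carlo sketch (parameters are arbitrary, and this is not the speaker's analysis): the first-passage times to threshold are inverse-Gaussian distributed with mean `threshold / mu`.

```python
import numpy as np

def pif_isis(mu=1.0, sigma=0.5, threshold=1.0, n_spikes=2000, dt=1e-3, seed=1):
    """First-passage times of a perfect integrate-and-fire neuron:
        dV = mu dt + sigma dW,  V reset to 0 when it hits the threshold.
    The resulting ISIs are inverse-Gaussian with mean threshold/mu."""
    rng = np.random.default_rng(seed)
    isis = []
    for _ in range(n_spikes):
        v, t = 0.0, 0.0
        while v < threshold:                       # simulate until first passage
            v += mu * dt + sigma * rng.normal(0.0, np.sqrt(dt))
            t += dt
        isis.append(t)
    return np.array(isis)
```

Sweeping `mu` (the stimulus level) and comparing the spread of first-spike latencies against spike counts in a fixed window is the kind of decoding-accuracy comparison the abstract describes.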
09:30 - 09:45 | Coffee Break (TCPL Foyer) |
09:45 - 10:45 |
Jonathan Touboul: Noise in large-scale neuronal networks, brain rhythms and neural avalanches ↓ It is now folklore that intracellular membrane potential recordings of neurons are highly noisy, owing to a variety of random microscopic processes contributing to their maintenance. At macroscopic scales, reliable, fast and accurate responses to stimuli emerge, and are experimentally described through various quantities. I will focus in particular on how mathematical models of large neuronal networks can account for the types of experimental data observed in synchronized oscillations (sources of brain rhythms) or neural avalanches. These two quantities are relevant in that rhythms reportedly support important functions such as memory and attention, while distributions of avalanches have been reported to reveal that the brain operates at criticality, where it maximizes its information processing capacities.
I will show that a simple theory of large-scale dynamics allows both phenomena to be understood within a common framework. In particular, I will show the seemingly paradoxical phenomenon that noise can contribute to the synchronization of large neural assemblies, and that the experimentally reported heavy-tailed distributions of avalanche durations and sizes may in fact be related to Boltzmann’s molecular chaos, which naturally emerges in limits of large interacting networks with noise. These works were developed with Alain Destexhe and Bard Ermentrout. (TCPL 201) |
11:45 - 13:30 | Lunch (Vistas Dining Room) |
13:30 - 17:30 | Free Afternoon (Banff National Park) |
17:30 - 19:30 | Dinner (Vistas Dining Room) |
Thursday, March 2 | |
---|---|
07:00 - 09:00 | Breakfast (Vistas Dining Room) |
09:00 - 10:00 |
Benjamin Lindner: Spontaneous activity and information transmission in neural populations ↓ In my talk I will first review features of the spontaneous activity of nerve cells in neural populations and in recurrent networks, ranging from effects of cellular properties (e.g. noise and leak currents) and slow external noise (up/down transitions in a driving population) to the slow fluctuations that can build up due to the recurrent connectivity. I will then discuss how these features affect information transmission and stimulus detection, for instance, enable or suppress information filtering, benefit overall population coding, or lead to the detection of a short single cell stimulation in a large recurrent network. (TCPL 201) |
10:00 - 10:45 | Coffee Break (TCPL Foyer) |
10:45 - 11:45 |
Jonathan D. Victor: How high-order image statistics shape cortical visual processing ↓ Several decades of work have suggested that Barlow's principle of efficient coding is a powerful framework for understanding retinal design principles. Whether a similar notion extends to cortical visual processing is less clear, as there is no "bottleneck" comparable to the optic nerve, and much redundancy has already been removed. Here, we present convergent psychophysical and physiological evidence that regularities of high-order image statistics are indeed exploited by central visual processing, and at a surprising level of detail.
The starting point is a study of natural image statistics (Tkačik et al., 2010), in which we showed that high-order correlations in certain specific spatial configurations are informative, while high-order correlations in other spatial configurations are not: they can be accurately guessed from lower-order ones. We then construct artificial images (visual textures) composed either of informative or uninformative correlations. We find that informative high-order correlations are visually salient, while the uninformative correlations are nearly imperceptible. Physiological studies in macaque visual cortex identify the locus of the underlying computations. First, neuronal responses in macaque V1 and V2 mirror the psychophysical findings, in that many neurons respond differentially to the informative statistics, while few respond to the uninformative ones. Moreover, the differential responses largely arise in the supragranular layers, indicating that the computations are the result of intracortical processing.
We then consider low- and high-order local image statistics together, and apply a dimension-reduction (binarization) to cast them into a 10-dimensional space. We determine the perceptual isodiscrimination surfaces within this space. These are well-approximated by ellipsoids, and the principal axes of the ellipsoids correspond to the distribution of the local statistics in natural images. Interestingly, this correspondence differs in specific ways from the predictions of a model that implements efficient coding in an unrestricted manner. These deviations provide insights into the strategies that underlie the representation of image statistics. (TCPL 201) |
11:45 - 13:30 | Lunch (Vistas Dining Room) |
13:30 - 14:00 |
Rune W Berg: Neuronal population activity involved in motor patterns of the spinal cord: spiking regimes and skewed involvement ↓ Motor patterns such as chewing, breathing, walking and scratching are primarily produced by neuronal circuits within the brainstem or spinal cord. These activities are produced by concerted neuronal activity, but little is known about the degree of participation of the individual neurons. Here, we use multi-channel recording (256 channels) in turtles performing a scratch motor pattern to investigate the distribution of spike rates across neurons. We found that the shape of the distribution is skewed and can be described as “log-normal”-like, i.e. normally shaped on a logarithmic frequency axis. Such distributions have been observed in other parts of the nervous system and have been suggested to indicate a fluctuation-driven regime (Roxin et al., J. Neurosci. 2011). This is due to an expansive nonlinearity of the neuronal input-output function when the membrane potential lingers in the sub-threshold region. We further test this hypothesis by quantifying the irregularity of spiking across time and across the population, as well as via intra-cellular recordings. We find that the population moves between supra- and sub-threshold regimes, but the largest fraction of neurons spend most of their time in the sub-threshold, i.e. fluctuation-driven, regime. Read more about this work here: Peter C Petersen, Rune W Berg, “Lognormal firing rate distribution reveals prominent fluctuation-driven regime in spinal motor networks”, eLife 2016;5:e18805. (TCPL 201) |
14:00 - 14:30 |
Janet Best: Variability and regularity in neurotransmitter systems ↓ In this talk I will discuss a couple of recent themes in my work: (1) Population models of deterministic neural systems enable one to capture the biological variability in individuals. These models can explain how the same circuits can operate differently in different individuals or can change in time in individuals because of dynamic changes in gene expression. They can be used to discover the characteristics of subpopulations in which drugs are efficacious or deleterious and are therefore useful for clinical trials; (2) In volume transmission, neurons don't engage in one-to-one transmission but project changes in biochemistry over long distances in the brain. Thus, the nuclei containing the cell bodies (like the dorsal raphe nucleus for serotonin) are basically acting as endocrine organs. Recent work with Sean Lawley shows why volume transmission works, that is, why concentrations of neuromodulator in the extracellular space are very even despite the fact that terminals and varicosities are distributed unevenly and firing may be random. (TCPL 201) |
14:30 - 15:15 | Coffee Break (TCPL Foyer) |
15:15 - 15:45 |
Shigeru Shinomoto: Emergence of cascades in the linear and nonlinear Hawkes processes ↓ Self-exciting systems such as neural networks are known to exhibit catastrophic chain reactions if the internal interaction exceeds an epidemic threshold. Recently, we have shown that the same systems may exhibit nonstationary fluctuations already in the subthreshold regime [1,2]; cascades of events or spikes may emerge in the Hawkes process even in the absence of external forcing. In a practical situation, however, systems are subject to a time-varying environment in addition to the internal interaction between elements. Here we attempt to estimate the degree to which the nonstationary fluctuations occurring in the system are induced by external forcing or by internal interaction.
[1] T. Onaga and S. Shinomoto, Bursting transition in a linear self-exciting point process. Physical Review E (2014) 89:042817.
[2] T. Onaga and S. Shinomoto, Emergence of event cascades in inhomogeneous networks. Scientific Reports (2016) 6:33321. (TCPL 201) |
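As a self-contained illustration (not the authors' code) of the kind of process the abstract describes, the sketch below simulates a linear Hawkes process with an exponential kernel via Ogata's thinning algorithm; parameters are illustrative, and the branching ratio alpha/beta < 1 keeps the process in the subthreshold (subcritical) regime where event cascades still emerge.

```python
import numpy as np

def hawkes_thinning(mu=0.5, alpha=0.8, beta=1.2, T=200.0, seed=2):
    """Ogata's thinning algorithm for a linear Hawkes process with intensity
        lam(t) = mu + alpha * sum_i exp(-beta * (t - t_i)).
    Subcritical (no runaway cascade) when the branching ratio alpha/beta < 1."""
    rng = np.random.default_rng(seed)
    events, t, lam_excess = [], 0.0, 0.0   # excess intensity above baseline mu
    while t < T:
        lam_bar = mu + lam_excess           # upper bound: intensity is decaying
        w = rng.exponential(1.0 / lam_bar)  # candidate waiting time
        t += w
        lam_excess *= np.exp(-beta * w)     # decay the excess to candidate time
        if rng.uniform() * lam_bar <= mu + lam_excess and t < T:
            events.append(t)                # accept: an event occurs here
            lam_excess += alpha             # self-excitation jump
    return np.array(events)
```

Plotting event counts in successive windows of such a simulation makes the subthreshold cascades visible: the counts fluctuate far more than a Poisson process of the same mean rate mu / (1 - alpha/beta).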
15:45 - 16:15 |
Romain Veltz: Quasi-Synchronisation in a stochastic spiking neural network ↓ I will discuss a recent mean field model (from E. Löcherbach and collaborators, and from R. Robert and J. Touboul) of a stochastic spiking neural network from a dynamical systems perspective. More precisely, I will present some recent results concerning the quasi-synchronisation of the neurons as a function of the different parameters of the network (gap junction conductances, synaptic strength...). (TCPL 201) |
17:30 - 19:30 | Dinner (Vistas Dining Room) |
Friday, March 3 | |
---|---|
07:00 - 09:00 | Breakfast (Vistas Dining Room) |
09:00 - 10:00 |
Axel Hutt: Model and prediction of anaesthetic-induced EEG ↓ The monitoring of patients under general anaesthesia is an essential part of surgery. To correctly interpret measured brain activity, such as the electroencephalogram (EEG), it is essential to first understand the possible origin of the EEG and hence classify the physiological state of the patient correctly. Moreover, it would be advantageous to predict the evolution of brain activity in order to anticipate severe changes of the physiological state. The talk presents a novel model explaining major EEG features under light anaesthesia, such as the spectral smile effect, by a denoising of brain dynamics. In a second part, the talk shows how to predict EEG under anaesthesia by applying a data assimilation technique. (TCPL 201) |
10:00 - 10:45 | Coffee Break (TCPL Foyer) |
10:45 - 11:15 |
Richard Naud: Burst ensemble multiplexing: connecting dendritic spikes with cortical inhibition ↓ Two distinct types of inputs impinge on different spatial compartments of pyramidal neurons of the neocortex. A popular view holds that the input impinging on the distal dendrites modulates the gain of the somatic input encoding. This gain modulation is thought to participate in top-down processes such as attention, sensory predictions and reward expectation. Here we use computational and theoretical analyses to determine how the two input streams are represented simultaneously in a neural ensemble. We find that dendritic calcium spikes in the distal dendrites allow multiplexing of the distal and somatic input streams by modifying the proportion of burst and singlet events. Two ensemble-average quantities encode the distal and somatic streams independently: the event rate and the burst probability, respectively. Simulations based on a two-compartment model reveal that this novel neural code can more than double the rate of information transfer over a large frequency bandwidth. To corroborate these findings, we determined analytically the parameters regulating mutual information in a point process model of bursting. Second, we find that an inhibitory microcircuit combining short-term facilitation and short-term depression can decode the distal and somatic streams independently. These results suggest a novel functional role of both active dendrites and the stereotypical patterns with which inhibitory cell types interconnect in the neocortex. Burst ensemble multiplexing, we suggest, is a general code used by the neural system to flexibly combine two distinct streams of information. (TCPL 201) |
11:15 - 11:45 |
Volker Hofmann: Population coding in electric sensing: origin and function of noise correlations ↓ In many cases, detailed knowledge of single-neuron activity during the encoding of sensory signals or the generation of behavioral outputs has been achieved. On another scale, however, we still lack detailed knowledge of the mechanisms of concerted neuronal activity, i.e. population coding. This is crucial for understanding the neuronal code and remains a central problem in neuroscience.
Extrapolating the knowledge of single unit activity to the scale of a neuronal population is often complicated by the fact that the activities of neurons are typically correlated rather than being independent. Such correlations, which can arise in terms of the average responses to different stimuli as well as in terms of trial to trial variability, were shown to substantially impact the efficacy of population codes.
To investigate the sources and function of correlations, we use the weakly electric fish Apteronotus leptorhynchus as a model system, due to the wealth of physiological and anatomical knowledge available with regard to electrosensory processing. These fish sense electric fields with an array of electroreceptors that project to three parallel segments of the medullary electrosensory lateral line lobe (ELL). Previous studies established that the size and organization of receptive fields differ substantially between pyramidal neurons in the different segments, which should consequently result in very different amounts of correlation in each segment. In contrast, our experimental recordings revealed very similar correlation magnitudes. To explain this surprising result, we investigated the differential receptive field interactions using a modeling approach. Considering the antagonistic center-surround organisation of receptive fields, we were able to show that very different receptive field organizations can give rise to very similar amounts of correlation.
After establishing the presence of noise correlations in the ELL, we assessed their potential impact and function for the encoding of electrosensory stimuli. Our preliminary results suggest that ELL noise correlations encode stimuli independently of classical measures of neuronal activity (i.e. firing rate). The stimulus-dependent changes in correlation levels are potentially modulated via the recurrent ELL connectivity, which will be a focus of our future investigations to unravel the mechanisms mediating this independent additional channel of information transmission in the brain. (TCPL 201) |
11:30 - 12:00 |
Checkout by Noon ↓ 5-day workshop participants are welcome to use BIRS facilities (BIRS Coffee Lounge, TCPL and Reading Room) until 3 pm on Friday, although participants are still required to checkout of the guest rooms by 12 noon. (Front Desk - Professional Development Centre) |
12:00 - 13:30 | Lunch from 11:30 to 13:30 (Vistas Dining Room) |
13:30 - 15:00 | Discussion (TCPL 201) |