CNS*2019 Workshop on

Methods of Information Theory in Computational Neuroscience

July 16-17, 2019, Barcelona, Spain


Methods originally developed in Information Theory have found wide applicability in computational neuroscience. Beyond these original methods there is a need to develop novel tools and approaches that are driven by problems arising in neuroscience.

A number of researchers in computational/systems neuroscience and in information/communication theory are investigating problems of information representation and processing. While the goals are often the same, these researchers bring different perspectives and points of view to a common set of neuroscience problems. Often they participate in different fora and their interaction is limited.

The goal of the workshop is to bring some of these researchers together to discuss challenges posed by neuroscience and to exchange ideas and present their latest work.

The workshop is targeted towards computational and systems neuroscientists with interest in methods of information theory as well as information/communication theorists with interest in neuroscience.

Organising Committee

  • Lubomir Kostal (chair), Academy of Sciences of the Czech Republic
  • Joseph T. Lizier, The University of Sydney
  • Viola Priesemann, Max Planck Institute for Dynamics and Self-organisation
  • Justin Dauwels, Nanyang Technological University
  • Taro Toyoizumi, RIKEN Center for Brain Science
  • Michael Wibral, Goethe University, Frankfurt
  • Adria Tauste, BarcelonaBeta Brain Research Center

    Location and Registration

    The workshop is part of the wider CNS*2019 meeting in Barcelona, Spain. Please see the CNS*2019 website for registration for the workshops (registration is required to attend).

    The workshop will take place in the Paranymph Auditorium in the Historical Building of the Universitat de Barcelona. The room has both a projector and a presentation computer. The conference center staff prefer to load presentations onto this computer (from speakers' USB drives) whenever possible; for presentations that require a speaker's own computer, VGA and HDMI connections will be available.


    Speakers

    Julijana Gjorgjieva, Max Planck Institute for Brain Research, Frankfurt
    Takuya Isomura, RIKEN Center for Brain Science
    Renaud Jolivet, University of Geneva
    Lubomir Kostal, Academy of Sciences of the Czech Republic
    David Kappel, Georg-August University, Goettingen
    Wiktor Mlynarski, Institute of Science and Technology Austria, Vienna
    Adria Tauste, BarcelonaBeta Brain Research Center
    Raoul Vicente, Institute of Computer Science, University of Tartu
    Michael Wibral, Georg-August University, Goettingen


    Best Presentation Award

    We thank the Entropy journal for sponsoring our Best Presentation Award for Early Career Researchers, which we awarded to:

    Wiktor Mlynarski, Institute of Science and Technology Austria, Vienna, for the talk "Adaptability and efficiency in neural coding".


    Program

    The workshop consists of three half-day sessions: (1) the morning of July 16, and (2) the morning and (3) the afternoon of July 17. Long talks are 45 minutes and short talks 30 minutes, including discussion.

    Tuesday, July 16

    Chair: Lubomir Kostal
    09:30-10:15  Michael Wibral, Georg-August University, Goettingen
                 "Applying point-wise partial information decomposition to stimulus representations in prefrontal cortex"
    10:15-11:00  Renaud Jolivet, University of Geneva
                 "Combining information theory and energetics into a coherent framework to study the brain's heterocellular circuits"
    11:00-11:30  Coffee break

    Chair: Taro Toyoizumi
    11:30-12:15  Leonidas Richter, Max Planck Institute for Brain Research, Frankfurt
                 "Functional diversity among sensory neurons from efficient coding principles"
    12:15-13:00  Takuya Isomura, RIKEN Center for Brain Science

    Wednesday, July 17

    Chair: Lubomir Kostal
    09:30-10:15  Adria Tauste, BarcelonaBeta Brain Research Center
                 "Relating neural coding and communication: Evidence from thalamo-cortical spike-train data during stimulus perception"
    10:15-11:00  Wiktor Mlynarski, Institute of Science and Technology Austria, Vienna
                 "Adaptability and efficiency in neural coding"
    11:00-11:30  Coffee break

    Chair: Michael Wibral
    11:30-12:15  David Kappel, Georg-August University, Goettingen
                 "An information theoretic account of sequence learning: From prediction errors to transfer entropy"
    12:15-13:00  Raoul Vicente, Institute of Computer Science, University of Tartu
                 "Using information theoretic functionals to guide deep reinforcement learning agents"
    13:00-14:30  Lunch break

    Chair: Lubomir Kostal
    David Shorten, The University of Sydney
    "Estimation of Transfer Entropy for Spike Trains in Continuous-Time"
    Jaroslav Hlinka, Institute of Computer Science CAS, Prague
    "Network Inference and Maximum Entropy Estimation on Information Diagrams"
    Praveen Venkatesh, Carnegie Mellon University
    "Revealing Information Paths in the Brain using Synergistic Information"
    16:00-16:30  Coffee break

    Chair: Adria Tauste
    Rodrigo Cofre Torres, University of Valparaiso
    "Exploring information-theoretic high-order effects of LSD in a Whole-Brain Model"
    Massimiliano Zanin, Technical University of Madrid
    "Time irreversibility of resting brain activity in the healthy brain and pathology"
    Lubomir Kostal, Institute of Physiology CAS, Prague
    "Critical size of neural population for reliable information transmission"


    Abstracts

    Michael Wibral, "Applying point-wise partial information decomposition to stimulus representations in prefrontal cortex"

    Renaud Jolivet, "Combining information theory and energetics into a coherent framework to study the brain's heterocellular circuits"
    The brain consumes an inordinate amount of energy for its mass, and consists of an intricate network of neurones, vasculature and glial cells. I will discuss how energetic considerations can be combined with information theory to ask how the brain balances the competing needs of saving energy while retaining reasonable performance. In particular, I will show how energetic considerations and information theory can be combined to explain certain synaptic features. I will then discuss how these ideas can be expanded to the network level. Finally, I will discuss how brain energetics and information theory can be combined to develop a computational framework for the brain's heterocellular circuits.

    Julijana Gjorgjieva and Leonidas Richter, "Functional diversity among sensory neurons from efficient coding principles"
    In many sensory systems the neural signal is coded by the coordinated response of heterogeneous populations of neurons. What computational benefit does this diversity confer on information processing? We derive an efficient coding framework assuming that neurons have evolved to communicate signals optimally given natural stimulus statistics and metabolic constraints. Incorporating nonlinearities and realistic noise, we study optimal population coding of the same sensory variable using two measures: maximizing the mutual information between stimuli and responses, and minimizing the error incurred by the optimal linear decoder of responses. We apply our theory to a commonly observed splitting of sensory neurons into ON and OFF types that signal stimulus increases or decreases, and to populations of monotonically increasing responses of the same type. Depending on the optimality measure, we make different predictions about how to optimally split a population into ON and OFF, and how to allocate the firing thresholds of individual neurons given realistic stimulus distributions and noise, which accord with certain biases observed experimentally.

    Takuya Isomura, TBA

    Adria Tauste, "Relating neural coding and communication: Evidence from thalamo-cortical spike-train data during stimulus perception"
    The problems of neural coding and neural communication have frequently been studied with distinct models and measures. For instance, neural population codes are typically studied through the decomposition of zero-lag information measures (e.g. mutual information, Fisher information) across stimulus features and neural responses using multiple-trial estimations. In contrast, the inference of neural information flows is usually restricted to single-trial estimations of directed information measures and may consider the existence of delays and confounding neural responses, among other parameters. However, given known information theory models, it is plausible that both processes are intermingled when independently assessing neural coding properties or information flows from noisy measurements of spike-train data. Hence, can we find a unified framework in which both processes are jointly analyzed? Recent findings from our analysis of simultaneously recorded neurons in the somatosensory thalamus and cortex, while monkeys perceived a tactile stimulus, reveal intrinsic properties of intra- and inter-area information measures that could serve this purpose.

    Wiktor Mlynarski, "Adaptability and efficiency in neural coding"
    The ability to dynamically adapt to changes in stimulus statistics is one of the defining features of sensory systems. However, despite a long history of experimental research, it is not known how to characterize the dynamics of adaptation from an information processing perspective.  Here we build on ideas of rate-distortion theory and dynamic Bayesian inference to introduce a theoretical framework for optimizing and analyzing the temporal structure of adaptation in sensory codes. We show that sensory codes optimized for performing task-relevant computations are typically different from codes optimized for adapting to changes in the stimulus distributions that underlie these computations. These differences are manifested in the speed of adaptation, the accuracy of the code during periods of adaptation, and the accuracy in the adapted state. We derive encoding schemes that interpolate between task performance and adaptability, and can leverage the advantages of both objectives. Our results provide a unifying perspective on adaptation across a range of sensory systems, environments, and sensory tasks.
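    The rate-distortion trade-off underlying this framework can be computed numerically for simple sources. The sketch below is a generic illustration (not the speaker's model; source, distortion, and parameter choices are invented for this page) using the standard Blahut-Arimoto algorithm on a Bernoulli(1/2) source with Hamming distortion, where the analytic curve R(D) = 1 - H2(D) is known.

    ```python
    import numpy as np

    def blahut_arimoto(p_x, dist, beta, iters=200):
        """One point (D, R) on the rate-distortion curve, at trade-off slope -beta."""
        nxh = dist.shape[1]
        q = np.full(nxh, 1.0 / nxh)              # output marginal q(x_hat)
        for _ in range(iters):
            w = q * np.exp(-beta * dist)         # rows: unnormalised p(x_hat | x)
            w /= w.sum(axis=1, keepdims=True)
            q = p_x @ w                          # updated output marginal
        w = q * np.exp(-beta * dist)             # conditional consistent with final q
        w /= w.sum(axis=1, keepdims=True)
        D = p_x @ (w * dist).sum(axis=1)         # expected distortion
        R = np.sum(p_x[:, None] * w * np.log2(w / q[None, :]))  # I(X; X_hat), bits
        return D, R

    p_x = np.array([0.5, 0.5])                   # Bernoulli(1/2) source
    hamming = np.array([[0.0, 1.0],
                        [1.0, 0.0]])
    D, R = blahut_arimoto(p_x, hamming, beta=2.0)
    h2 = lambda d: -d * np.log2(d) - (1 - d) * np.log2(1 - d)
    print(D, R, 1 - h2(D))                       # R matches the analytic R(D) = 1 - H2(D)
    ```

    Varying `beta` traces out the whole curve, from low-rate/high-distortion to lossless coding.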

    David Kappel, "An information theoretic account of sequence learning: From prediction errors to transfer entropy"
    Experimental data and theoretical considerations suggest that assembly sequences - the chaining of strongly interconnected groups of active neurons - play an important role in cognitive processes such as memory recall, decision making and planning. However, the mechanisms that allow neural networks to form and maintain assembly sequences without supervision are not fully understood. In this presentation, I will show early results from our recent efforts to analyze synaptic plasticity mechanisms for spike sequence learning and assembly sequence formation. Starting from the goal of minimizing the prediction error of future high-dimensional input sequences, we derive learning rules for a recurrent spiking network model. The emerging learning rules are local, resemble experimentally observed plasticity mechanisms, and promote the formation of stable neural assembly sequences that become active in synchrony with afferent inputs. More precisely, we show that the recurrent neural network self-organizes to encode salient features of task-relevant variables represented in the input spike train. We then analyze the learning rules for prediction error minimization using information theoretic tools and establish a close link to maximizing the transfer entropy in the network. In summary, our recent results provide new insights into the mechanisms that enable stable assembly sequence formation in spiking networks.

    Raoul Vicente, "Using information theoretic functionals to guide deep reinforcement learning agents"

    David Shorten, "Estimation of Transfer Entropy for Spike Trains in Continuous-Time"
    Transfer entropy is an information-theoretic measure of the directed flow of information between system components. Previous applications of transfer entropy to spike train data have discretised the spike trains into time bins. This approach has two substantial drawbacks: the resulting estimate of the transfer entropy depends on the width of these time bins, and it leads to an explosion in the dimensionality of the estimation problem. Recent work [1] has derived a continuous-time formalism for transfer entropy. It was shown that, for spike trains, this quantity can be calculated by estimating the logarithm of the instantaneous spike rate, conditioned on the target history alone versus both the target and source histories. This talk will report on the development of estimators of transfer entropy which utilise this continuous-time formalism. Two estimators for the log spike rates were derived. Similar to the popular KSG estimator, these estimators make use of k-nearest-neighbour searches. These searches are done over history embeddings of either the source spike train alone or both the source and target spike trains, depending on the type of conditioning required. A substantial challenge was the removal of biases in these rate estimations caused by the nearest-neighbour searches. Early numerical tests have demonstrated that these estimators are unbiased for spike trains with a constant rate. We will demonstrate their effectiveness on spike trains with varying rates, as well as testing their bias in the computation of final transfer entropy values.
    [1] Spinney, R.E., Prokopenko, M. and Lizier, J.T., 2017. Transfer entropy in continuous time, with applications to jump and neural spiking processes. Physical Review E, 95(3), p. 032319.
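    The bin-width dependence mentioned in the abstract is easy to reproduce. The Python sketch below is a generic toy illustration (not the estimator presented in the talk; the spike-train parameters are invented): it computes a plug-in binned transfer entropy with a single bin of history, on the same pair of coupled spike trains, at two bin widths, and the two estimates differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: target spikes follow source spikes after ~3 ms (invented parameters).
    T = 200.0                                           # recording length (s)
    src = np.cumsum(rng.exponential(0.05, size=8000))
    src = src[src < T]                                  # Poisson-like source train
    tgt = np.sort(src + 0.003 + rng.normal(0.0, 0.001, size=src.size))

    def binned_te(source, target, dt, duration):
        """Plug-in transfer entropy (bits), one bin of history, binary-binned trains."""
        edges = np.arange(0.0, duration + dt, dt)
        x = (np.histogram(source, edges)[0] > 0).astype(int)
        y = (np.histogram(target, edges)[0] > 0).astype(int)
        yt, yp, xp = y[1:], y[:-1], x[:-1]              # present, own past, source past
        n = yt.size
        te = 0.0
        for a in (0, 1):                                # sum over all joint states
            for b in (0, 1):
                for c in (0, 1):
                    p_abc = np.sum((yt == a) & (yp == b) & (xp == c)) / n
                    if p_abc == 0.0:
                        continue
                    p_bc = np.sum((yp == b) & (xp == c)) / n
                    p_b = np.sum(yp == b) / n
                    p_ab = np.sum((yt == a) & (yp == b)) / n
                    te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
        return te

    te_fine = binned_te(src, tgt, 0.001, T)
    te_coarse = binned_te(src, tgt, 0.005, T)
    print(te_fine, te_coarse)                           # same data, different estimates
    ```

    The estimate is not an invariant of the data: it changes with the discretisation, which is exactly the drawback the continuous-time formalism avoids.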

    Jaroslav Hlinka, "Network Inference and Maximum Entropy Estimation on Information Diagrams"
    Maximum entropy estimation is of broad interest for inferring properties of systems across many disciplines. Using a recently introduced technique for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies, we show how this can be used to estimate the direct network connectivity between interacting units from observed activity. As a generic example, we consider phase oscillators and show that our approach is typically superior to simply using the mutual information. In addition, we propose a nonparametric formulation of connected informations, used to test the explanatory power of a network description in general. We give an illustrative example showing how this agrees with the existing parametric formulation, and demonstrate its applicability and advantages for resting-state human brain networks, for which we also discuss its direct effective connectivity. Finally, we generalize to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish significant advantages of this approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases.

    Praveen Venkatesh, "Revealing Information Paths in the Brain using Synergistic Information"
    Tracking flows of information is a problem of immense interest in neuroscience, which several statistical tools claim to provide a solution for. At its core, the information flow problem concerns the interaction of multiple variables---a "message" whose information content we are interested in, and the various transmissions in a computational circuit---making it a prime target for information-theoretic tools based on the Partial Information Decomposition (PID) literature. However, despite the close relationship between PID and neuroscience, measures based on synergistic information have not yet gained traction in revealing and tracking flows of information in neuroscientific experiments. In this work, we arrive at a formal definition for information flow that naturally integrates the idea of synergy, by searching for definitions which satisfy an intuitive property: namely, that the information flow of a message of interest should follow an unbroken path from source to sink in a computational network. The aforementioned message of interest may be, for instance, contained in the *stimulus* of an event-related experimental paradigm. This formal link allows us to demonstrate, through counterexamples, that many existing tools---including Granger Causality, Transfer Entropy and Directed Information---may provide poor insights on flows of information, even in idealized settings with noiseless measurements and no latent variables. Our framework based on PID, on the other hand, can be used to correctly reveal a more fine-grained understanding of information flow, and of the relationships between transmissions on different links. We also use this framework to formally prove an information-path theorem, which can be used to estimate the path along which stimulus-related information flows. 
    Finally, we discuss how our framework can guide the experimentalist, both in designing the experiment, and by providing consistency checks for detecting the presence of relevant hidden nodes.
    A manuscript detailing this work is available online: Venkatesh, Dutta and Grover, "Information Flow in Computational Systems", arXiv:1902.02292 [cs.IT]. In addition, a condensed version of this work has been accepted to the International Symposium on Information Theory (ISIT), 2019.

    Rodrigo Cofre Torres, "Exploring information-theoretic high-order effects of LSD in a Whole-Brain Model"
    What allows the brain to be more than the sum of its parts lies not in the nature of the parts, but in the structure of their interdependencies. High-order interdependencies are increasingly being used in computational neuroscience to characterize interactions between groups of variables, often with an emphasis on synergistic and redundant interactions. A promising novel information-theoretic tool, the "O-information", has recently been proposed [1]. This function is the first symmetric quantity that can give an account of intrinsic statistical synergy in systems of more than three variables, allowing one to assess high-order interdependencies. The O-information captures the dominant characteristic of multivariate interdependency, distinguishing redundancy-dominated scenarios, where three or more variables carry copies of the same information, from synergy-dominated systems characterized by high-order patterns that cannot be traced from low-order marginals. In this talk, I will report on recent progress in quantifying multivariate interdependency in the BOLD signals generated using the Whole-Brain Multimodal Neuroimaging Model proposed in [2] under two scenarios: first without serotonin neuromodulation (the placebo condition), and second with neuromodulation included to mimic the LSD condition. We discuss our results in the context of the "entropic brain" hypothesis [3], which states that the richness of experience reported under LSD and other psychedelics correlates with signal diversity, which can be characterized by the entropy of the signals. We argue that the "richness of content" expected in brain signals generated under the psychedelic experience can be characterized through high-order interdependencies among brain modules by means of the O-information.
    This work is done in collaboration with Ruben Herzog (U.V), Fernando Rosas and Pedro A.M Mediano (Imperial College London).
    [1] Rosas F, Mediano P.A.M, Gastpar M and Jensen H.J. Quantifying high-order interdependencies via multivariate extensions of the mutual information. (Accepted for publication in Physical Review E, ArXiv:1902.11239v1).
    [2] Deco G, Cruzat J, Cabral J, Knudsen GM, Carhart-Harris RL, Whybrow PC, Logothetis NK, Kringelbach ML. Whole-Brain Multimodal Neuroimaging Model Using Serotonin Receptor Maps Explains Non-linear Functional Effects of LSD. Curr Biol., 28(19), 2018.
    [3] Carhart-Harris RL, Leech R, Hellyer PJ, Shanahan M, Feilding A, Tagliazucchi E, Chialvo DR & Nutt, D 'The entropic brain: a theory of conscious states informed by neuroimaging research with psychedelic drugs', Frontiers in human neuroscience, 8(20), 2014.
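    For readers unfamiliar with the O-information, a plug-in estimate for discrete data follows directly from the definition in [1], O(X) = (n-2) H(X) + sum_i [H(X_i) - H(X_-i)]. The Python sketch below is a generic illustration (not the authors' code; the toy systems and sample size are arbitrary), contrasting a redundancy-dominated system (three copies of one bit, O > 0) with a synergy-dominated one (XOR, O < 0).

    ```python
    import numpy as np

    def entropy(cols):
        """Shannon entropy (bits) of the empirical joint distribution of the columns."""
        _, counts = np.unique(cols, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def o_information(data):
        """Plug-in O-information: (n-2) H(X) + sum_i [H(X_i) - H(X_{-i})]."""
        n = data.shape[1]
        o = (n - 2) * entropy(data)
        for i in range(n):
            o += entropy(data[:, [i]]) - entropy(np.delete(data, i, axis=1))
        return o

    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, size=(10000, 2))
    redundant = np.repeat(bits[:, :1], 3, axis=1)                 # three copies of one bit
    synergistic = np.column_stack([bits, bits[:, 0] ^ bits[:, 1]])  # XOR: pure synergy
    print(o_information(redundant), o_information(synergistic))   # positive vs negative
    ```

    The signs match the interpretation in the abstract: copies of the same information give a positive O-information, while XOR-like dependencies that vanish in the low-order marginals give a negative one.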

    Massimiliano Zanin, "Time irreversibility of resting brain activity in the healthy brain and pathology"
    Irreversibility is a property of some out-of-equilibrium systems whereby one or more statistical properties are not symmetric under the operation of time reversal. Irreversibility is usually a hallmark of the presence of memory, or alternatively of the thermodynamic entropy production of the system. We here propose to study the irreversibility of brain activity through a novel metric based on the appearance frequency of permutation patterns. We measured the time-reversal symmetry of spontaneous electroencephalographic brain activity recorded from three groups of patients and their respective control groups under two experimental conditions. The results show that resting brain activity is generically time-irreversible at sufficiently long time scales, and that brain pathology is generally associated with a reduction in time-asymmetry, albeit with pathology-specific patterns. We will finally discuss a possible dynamical aetiology.
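    As a generic illustration of the idea (not the author's exact metric; the signals and parameter choices here are invented), one can compare the ordinal-pattern frequencies of a signal with those of its time-reversed copy. The Python sketch below scores the mismatch with a Jensen-Shannon divergence: it is near zero for white noise, which is statistically reversible, and large for an asymmetric sawtooth, whose slow rise and sudden drop make it strongly irreversible.

    ```python
    import numpy as np

    def pattern_freqs(x, m=3):
        """Relative frequencies of ordinal (permutation) patterns of order m."""
        windows = np.lib.stride_tricks.sliding_window_view(x, m)
        ranks = np.argsort(np.argsort(windows, axis=1), axis=1)
        keys = ranks @ (m ** np.arange(m))        # encode each pattern as an integer
        freqs = np.zeros(m ** m)
        uniq, counts = np.unique(keys, return_counts=True)
        freqs[uniq] = counts / counts.sum()
        return freqs

    def jensen_shannon(p, q):
        """Jensen-Shannon divergence (bits) between two discrete distributions."""
        mid = 0.5 * (p + q)
        kl = lambda a, b: np.sum(a[a > 0] * np.log2(a[a > 0] / b[a > 0]))
        return 0.5 * kl(p, mid) + 0.5 * kl(q, mid)

    def irreversibility(x, m=3):
        """Divergence between pattern statistics of x and of its time reversal."""
        return jensen_shannon(pattern_freqs(x, m), pattern_freqs(x[::-1], m))

    rng = np.random.default_rng(0)
    noise = rng.normal(size=20000)                # reversible: near-zero score
    sawtooth = np.tile(np.arange(10.0), 2000)     # slow rise, sudden drop: irreversible
    print(irreversibility(noise), irreversibility(sawtooth))
    ```

    Reversing the sawtooth turns mostly ascending patterns into mostly descending ones, so the two pattern distributions barely overlap and the divergence is large.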

    Lubomir Kostal, "Critical size of neural population for reliable information transmission"
    Optimal information decoding serves as a guiding principle for understanding fundamental questions in theoretical neuroscience. The problem often involves analysis of the Shannon limits for information representation and transmission in neural populations. It is well known that the probability of decoding error undergoes a phase transition at an information rate equal to the channel capacity. The corresponding thermodynamic limit, however, requires the coding dimension to tend to infinity, making actual decoding practically impossible. In this work we focus on the finite-size effects that occur in realistically limited neural populations. We examine the achievable information rate as a function of population size, and illustrate our findings assuming independent Hodgkin-Huxley neurons responding to individual stimulus patterns. We report that, remarkably, the achievable rate approaches the asymptote in a strikingly non-linear manner as the number of active neurons increases. We identify the critical population size below which reliable information transmission deviates significantly from the fundamental limit. Qualitatively, our findings do not seem to depend on the details of the neuronal model. We hope that our results will stimulate further research into non-asymptotic phenomena and their impact on optimal neural population size or structure.

    Previous Workshops

    This workshop has been run at CNS for over a decade now -- links to the websites for the previous workshops in this series are below:

    1. CNS*2018 Workshop, July 17-18, 2018, Seattle, USA.
    2. CNS*2017 Workshop, July 19-20, 2017, Antwerp, Belgium.
    3. CNS*2016 Workshop, July 6-7, 2016, Jeju, South Korea.
    4. CNS*2015 Workshop, July 22-23, 2015, Prague, Czech Republic.
    5. CNS*2014 Workshop, July 30-31, 2014, Québec City, Canada.
    6. CNS*2013 Workshop, July 17-18, 2013, Paris, France.
    7. CNS*2012 Workshop, July 25-26, 2012, Atlanta/Decatur, GA, USA.
    8. CNS*2011 Workshop, July 27-28, 2011, Stockholm, Sweden.
    9. CNS*2010 Workshop, July 29-30, 2010, San Antonio, TX, USA.
    10. CNS*2009 Workshop, July 22-23, 2009, Berlin, Germany.
    11. CNS*2008 Workshop, July 23-24, 2008, Portland, OR, USA.
    12. CNS*2007 Workshop, July 11-12, 2007, Toronto, Canada.
    13. CNS*2006 Workshop, June 19-20, 2006, Edinburgh, U.K.
