Brain-Computer Interfaces, Kullback-Leibler, and Mutual Information: Case Study #1

In the previous blogpost, I introduced the Kullback-Leibler divergence as an essential information-theoretic tool for researchers, designers, and practitioners interested not just in Brain-Computer Interfaces (BCIs), but specifically in Brain-Computer Information Interfaces (BCIIs).

The notion of Mutual Information (MI) is also fundamental to information theory, and it can be expressed in terms of the Kullback-Leibler divergence.

Mutual Information is given as:

$$ I(x, y) \;=\; \sum_{x} \sum_{y} p(x, y) \, \log \frac{p(x, y)}{p(x)\, p(y)} $$

Mutual Information Notation

  • I(x, y) is the mutual information between two random variables x and y, where p(x, y) is their joint distribution and p(x) and p(y) are their marginal distributions.
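As noted above, mutual information can be written directly in terms of the Kullback-Leibler divergence. The following identity is a standard result (stated here in the generic notation just introduced, not as a formula from any of the cited papers):

$$ I(x, y) \;=\; D_{KL}\bigl(\, p(x, y) \,\|\, p(x)\, p(y) \,\bigr) $$

That is, the mutual information is the KL divergence between the joint distribution and the product of the marginals, and it is zero exactly when x and y are independent.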


Mutual Information, Information Theory, and Brain-Computer Information Interfaces

The goal of this particular blog thread is to give readers the minimal set of tools needed to:

  1. Read journal papers describing experiments in brain-computer interfaces,
  2. Assess potential performance capabilities for a particular type of interface, and
  3. Develop performance evaluation and improvement protocols for any new BCIs.

An example from the literature is:

“Continuous tracking of a randomly moving visual stimulus provided a broad sample of velocity and position space, reduced statistical dependencies between kinematic variables, and minimized the nonstationarities that are found in typical “step-tracking” tasks. These statistical features permitted the application of signal-processing and information-theoretic tools for the analysis of neural encoding.” (Paninski et al., 2004; see full citation in References at the end of this blog.)

From further in the same paper, under Temporal dynamics of encoding:

An information-theoretic analysis was used to provide a direct measure of position and velocity information available from the recorded neurons and to describe more quantitatively the temporal evolution of this encoding. The results … demonstrate that by observing the position or velocity of the hand it is possible to derive information about the activity of a given [precentral motor] MI neuron. The converse, by Bayes’s rule, is also true: information about position or velocity can be decoded from MI firing rates.
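The “converse” in that passage reflects the symmetry of mutual information, which can be seen by applying Bayes’ rule inside the logarithm (again using the generic notation from above rather than the paper’s own symbols):

$$ I(x, y) \;=\; \sum_{x, y} p(x, y) \, \log \frac{p(x \mid y)}{p(x)} \;=\; \sum_{x, y} p(x, y) \, \log \frac{p(y \mid x)}{p(y)} \;=\; I(y, x) $$

So if observing the hand’s position or velocity tells us something about a neuron’s firing, then by the same measure the neuron’s firing tells us something about the hand’s position or velocity.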

Their conclusions depend on the use of Mutual Information:

For this analysis the mutual information between the cell’s firing rate and the kinematics of the hand is computed as a function of τ, written I(N(0); S(τ)). Here N(0) represents the cell’s activity in a given short time interval (here, 5 ms; the interval is taken to be short to avoid redundancy effects induced by the fact that the hand position and velocity change relatively slowly) and S(τ) denotes the value of position or velocity some time τ before or after the current time, t = 0. This information statistic is an objective measure of how well these neurons are tuned for these behavioral variables; the more tuned a given cell is at a given value of τ, the more highly separated are the probability distributions … and the higher the value of I(N(0); S(τ)).
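To make the structure of that calculation concrete, here is a minimal sketch in Python (NumPy) of how a lag-dependent mutual information curve I(N(0); S(τ)) might be estimated from binned spike counts and a single kinematic variable, using a simple histogram (“plug-in”) estimator. The function names, bin counts, and discretization choices are assumptions made purely for illustration; this is not the estimator actually used by Paninski et al.

import numpy as np

def plugin_mutual_information(x, y, bins=8):
    # Histogram ("plug-in") estimate of I(X; Y) in bits for two 1-D samples.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # joint probability table
    px = pxy.sum(axis=1, keepdims=True)      # marginal of x (column vector)
    py = pxy.sum(axis=0, keepdims=True)      # marginal of y (row vector)
    nonzero = pxy > 0                        # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

def mi_vs_lag(spike_counts, kinematics, max_lag_bins=40, bins=8):
    # I(N(0); S(tau)) as a function of the lag tau, measured in time bins.
    # spike_counts : 1-D array of spike counts per short time bin (e.g., 5 ms)
    # kinematics   : 1-D array of one kinematic variable (position or velocity)
    #                sampled on the same time bins
    lags = np.arange(-max_lag_bins, max_lag_bins + 1)
    mi = []
    for lag in lags:
        if lag >= 0:
            n, s = spike_counts[: len(spike_counts) - lag], kinematics[lag:]
        else:
            n, s = spike_counts[-lag:], kinematics[:lag]
        mi.append(plugin_mutual_information(n, s, bins=bins))
    return lags, np.array(mi)

With finite data the plug-in estimator is biased upward, so in practice one would apply a bias correction or compare against a shuffled-data baseline before interpreting small MI values; the sketch is meant only to show how the lag τ enters the calculation.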

{This is a blogpost-in-progress; please check later for the updated version. AJM, 12/7/2014.}


Previous Related Blog Posts

References

  • Paninski, L., Fellows, M. R., Hatsopoulos, N. G., & Donoghue, J. P. (2004). Spatiotemporal tuning of motor cortical neurons for hand position and velocity. Journal of Neurophysiology, 91, 515-532.
  • Suminski, A. J., Tkach, D., & Hatsopoulos, N. G. (2009). Exploiting multiple sensory modalities in brain-machine interfaces. Neural Networks, 22, 1224-1234.

Resources for Mutual Information Theory

  • Wiki on Mutual Information (includes relationship between MI and Kullback-Leibler divergence).
  • Cover, T. M. and Thomas, J. A. (1991). Elements of Information Theory (New York: Wiley). (A classic reference, cited by Paninski et al. (see above). I don’t yet have a copy, but I’ve looked at a little of Chapter 2 that was available online and read the reviews; this is a book that I’ll order soon. One of the things I like about it is the authors’ clear love for the subject; they elevate prose into poetry through their sheer joy.)
  • Entropy Discussion (not as lovely as Cover & Thomas, but reasonably succinct).
