Category: Brain-Computer Interfaces (BCIs)

Brain Networks and the Cluster Variation Method: Testing a Scale-Free Model

Surprising Result Modeling a Simple Scale-Free Brain Network Using the Cluster Variation Method

One of the primary research thrusts that I suggested in my recent paper, The Cluster Variation Method: A Primer for Neuroscientists, was that we could use the 2-D Cluster Variation Method (CVM) to model the distribution of configuration variables in different brain network topologies. Specifically, I was expecting that the h-value (which measures the interaction enthalpy strength between nodes in a 2-D CVM grid) would change in a…

Read More

The Cluster Variation Method: A Primer for Neuroscientists

Single-Parameter Analytic Solution for Modeling Local Pattern Distributions

The cluster variation method (CVM) offers a means of characterizing both 1-D and 2-D local pattern distributions. The paper referenced at the end of this post provides neuroscientists and BCI researchers with a CVM tutorial that will help them understand how the CVM statistical thermodynamics formulation can model 1-D and 2-D pattern distributions expressing structural and functional dynamics in the brain. The equilibrium distribution of local patterns, or configuration…

Read More

Brain-Computer Interfaces, Kullback-Leibler, and Mutual Information: Case Study #1

In the previous blog post, I introduced the Kullback-Leibler divergence as an essential information-theoretic tool for researchers, designers, and practitioners interested not just in Brain-Computer Interfaces (BCIs), but specifically in Brain-Computer Information Interfaces (BCIIs). The notion of Mutual Information (MI) is also fundamental to information theory, and it can be expressed in terms of the Kullback-Leibler divergence. Mutual Information is given as I(x,y) = D_KL( p(x,y) || p(x)p(y) ); I(x,y) is the mutual information of two…
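The identity above, that mutual information is the Kullback-Leibler divergence between a joint distribution and the product of its marginals, can be sketched in a few lines of Python. This is an illustrative example, not code from the post; the function names and the two toy joint distributions are my own.

```python
import math

def kl_divergence(p, q):
    # D_KL(P || Q) over discrete outcomes; terms with p_i = 0 contribute 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D_KL( p(x,y) || p(x) p(y) ) for a joint probability table joint[x][y]."""
    px = [sum(row) for row in joint]             # marginal p(x): sum over y
    py = [sum(col) for col in zip(*joint)]       # marginal p(y): sum over x
    flat_joint = [p for row in joint for p in row]
    flat_prod = [px[i] * py[j] for i in range(len(px)) for j in range(len(py))]
    return kl_divergence(flat_joint, flat_prod)

# Independent X and Y: the joint equals the product of marginals, so I(X;Y) = 0.
indep = [[0.25, 0.25], [0.25, 0.25]]
# Perfectly correlated X and Y: I(X;Y) = log 2 (in nats).
corr = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(indep))  # 0.0
print(mutual_information(corr))   # log 2, about 0.693
```

The result is in nats because the natural log is used; substituting math.log2 would give bits, the more common unit in the BCI literature.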

Read More

The Single Most Important Equation for Brain-Computer Information Interfaces

The Kullback-Leibler Divergence Equation for Brain-Computer Information Interfaces

The Kullback-Leibler equation is arguably the best starting point for thinking about information theory as applied to Brain-Computer Interfaces (BCIs), or Brain-Computer Information Interfaces (BCIIs). The Kullback-Leibler equation is given as D_KL(P || Q) = sum_x p(x) ln( p(x) / q(x) ). We seek to express how well our model of reality matches the real system. Or, just as usefully, we seek to express the information-difference when we have two different models for the same underlying real phenomena or data. The K-L…
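The K-L sum above translates almost directly into code. The following is a minimal sketch of the discrete-distribution case, with my own handling of the standard edge conventions (a zero in P contributes nothing; a zero in Q under positive P makes the divergence infinite); it is not code from the post itself.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) for discrete distributions.

    p, q: sequences of probabilities over the same outcomes (each sums to 1).
    Conventions: terms with p[i] == 0 contribute 0; q[i] == 0 where
    p[i] > 0 makes the divergence infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue
        if qi == 0.0:
            return float("inf")
        total += pi * math.log(pi / qi)
    return total

# A model q that matches p exactly has zero divergence;
# any mismatch makes the divergence strictly positive.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, p))  # 0.0
print(kl_divergence(p, q))  # > 0
```

Note the asymmetry: D_KL(P || Q) generally differs from D_KL(Q || P), which is why the K-L divergence is a divergence rather than a distance.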

Read More

Biologically-Based Multisensor Fusion for Brain-Computer Interfaces

Multisensor Fusion for Brain-Computer Interfaces (BCIs)

More than 25 years ago, sensor fusion was identified as a militarily critical technology. (See the blog post describing the role of sensor fusion in Navy air traffic control.) Since that time, both our knowledge of sensor fusion and its importance have grown substantially. Groundbreaking work by Barry Stein and M. Alex Meredith, at the Bowman Gray School of Medicine at Wake Forest University, elucidated the specific mechanisms of biological sensor fusion in…

Read More