Browsed by Tag: Kullback-Leibler

Brain-Computer Interfaces, Kullback-Leibler, and Mutual Information: Case Study #1

In the previous blog post, I introduced the Kullback-Leibler divergence as an essential information-theoretic tool for researchers, designers, and practitioners interested not just in Brain-Computer Interfaces (BCIs), but specifically in Brain-Computer Information Interfaces (BCIIs). The notion of Mutual Information (MI) is also fundamental to information theory, and it can be expressed in terms of the Kullback-Leibler divergence. Mutual Information is given as: I(X;Y) = D_KL( p(x,y) ‖ p(x)p(y) ) = Σ_{x,y} p(x,y) log [ p(x,y) / (p(x)p(y)) ], where I(X;Y) is the mutual information of two…
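As a minimal sketch (not from the post itself), the identity above can be computed directly: mutual information is the K-L divergence between the joint distribution and the product of its marginals.

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table (list of rows).

    Computed as D_KL(p(x,y) || p(x)p(y)): the K-L divergence between
    the joint distribution and the product of its marginals.
    """
    px = [sum(row) for row in joint]        # marginal p(x), one entry per row
    py = [sum(col) for col in zip(*joint)]  # marginal p(y), one entry per column
    return sum(
        pxy * log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0  # convention: zero-probability cells contribute nothing
    )

# Two perfectly correlated binary variables share exactly 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # -> 1.0
```

For an independent pair (a joint table that already equals the product of its marginals) the sum collapses to zero, matching the intuition that independent variables share no information.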

Read More

The Single Most Important Equation for Brain-Computer Information Interfaces

The Kullback-Leibler equation is arguably the best starting point for our thoughts about information theory as applied to Brain-Computer Interfaces (BCIs), or Brain-Computer Information Interfaces (BCIIs). The Kullback-Leibler equation is given as: D_KL(P ‖ Q) = Σ_x p(x) log [ p(x) / q(x) ]. We seek to express how well our model of reality matches the real system. Or, just as usefully, we seek to express the information difference when we have two different models for the same underlying real phenomena or data. The K-L…
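As a minimal sketch (again, not taken from the post itself), the K-L divergence between two discrete distributions reduces to a few lines of Python; here p plays the role of "reality" and q the role of our model:

```python
from math import log2

def kl_divergence(p, q):
    """D_KL(P || Q) in bits for two discrete distributions over the same outcomes."""
    # Convention: terms with p_i == 0 contribute 0; q_i must be > 0 wherever p_i > 0.
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# How far a skewed model q drifts from a fair-coin reality p (about 0.74 bits):
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))
```

Note the asymmetry: D_KL(P ‖ Q) and D_KL(Q ‖ P) generally differ, which is why the K-L divergence measures an information difference rather than a true distance.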

Read More