Category: Variational Free Energy

A Tale of Two Probabilities

Statistical Mechanics and Bayesian Probabilities: Machine learning fuses several lines of thought, including statistical mechanics, Bayesian probability theory, and neural networks. There are two ways of thinking about probability in machine learning: one comes from statistical mechanics, the other from Bayesian logic. Both are important, and they are very different. Although these two views of probability are usually kept separate, they come together in some of the more advanced machine learning topics, such…
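As a minimal sketch of the two views (an illustration, not from the post itself): the statistical-mechanics view assigns probabilities to states via their energies, while the Bayesian view updates a prior with a likelihood. Both normalize by a sum, and that is no accident: the partition function and the model evidence play structurally similar roles.

```python
import math

# Statistical-mechanics view: a Boltzmann distribution over energy levels.
def boltzmann(energies, beta=1.0):
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)          # the partition function Z
    return [w / z for w in weights]

# Bayesian view: updating a prior with a likelihood via Bayes' rule.
def bayes_posterior(prior, likelihood):
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)           # the model evidence p(data)
    return [u / z for u in unnorm]
```

Lower-energy states get higher Boltzmann probability; hypotheses with higher likelihood get higher posterior probability.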

Seven Statistical Mechanics / Bayesian Equations That You Need to Know

Essential Statistical Mechanics for Deep Learning: If you’re self-studying machine learning and feel that statistical mechanics is suddenly showing up more than it used to, you’re not alone. Within the past couple of years, statistical mechanics (statistical thermodynamics) has become a more integral topic, along with the Kullback-Leibler divergence measure and several inference methods for machine learning, including the expectation maximization (EM) algorithm and variational Bayes. Statistical mechanics has always played a strong role in machine…
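Of the quantities mentioned, the Kullback-Leibler divergence is the easiest to make concrete. A minimal sketch for discrete distributions (standard definition, not code from the post):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.

    Measures the information lost when Q is used to approximate P;
    it is zero if and only if the two distributions are identical.
    Terms with p_i = 0 contribute nothing, by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Note that the divergence is not symmetric: D_KL(P || Q) generally differs from D_KL(Q || P), which is exactly why variational methods must choose which direction to minimize.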

How to Read Karl Friston (in the Original Greek)

Karl Friston, whom we all admire, has written some lovely papers that are both enticing and obscure. Cutting to the chase, what we really want to understand is his variational free energy equation. In a Research Digest article, Peter Freed writes: “… And today, Karl Friston is not explaining [the free energy principle] in a way that makes it usable to your average psychiatrist/psychotherapist on the street – which is frustrating.” I am not alone in my confusion, and if you read the…

Approximate Bayesian Inference

Variational Free Energy: I spent some time trying to figure out the derivation of the variational free energy as expressed in some of Friston’s papers (see citations below). While I had made an intuitive justification, I just found this derivation (Kokkinos; see the reference and link below). Other discussions of variational free energy: “Whereas maximum a posteriori methods optimize a point estimate of the parameters, in ensemble learning an ensemble is optimized, so that it approximates the entire posterior probability distribution…”
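For reference, the derivation being discussed usually takes this shape in standard variational notation (a sketch, not reproduced from the post): with $q(z)$ the approximating density and $p(z \mid x)$ the true posterior,

```latex
F \;=\; \mathbb{E}_{q(z)}\!\left[\ln q(z) - \ln p(x, z)\right]
  \;=\; D_{\mathrm{KL}}\!\left(q(z) \,\|\, p(z \mid x)\right) \;-\; \ln p(x)
```

Since the KL divergence is non-negative, $F \geq -\ln p(x)$: minimizing the variational free energy both tightens a bound on the (negative log) model evidence and drives $q(z)$ toward the true posterior, which is exactly the ensemble-learning point in the quote above.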

Big Data, Big Graphs, and Graph Theory: Tools and Methods

Big Graphs Need Specialized Data Storage and Computational Methods (a working blogpost – notes for research and study). Processing large-scale graph data: A guide to current technology, by Sherif Sakr (ssakr@cse.unsw.edu.au), IBM developerWorks (10 June 2013). Note: Dr. Sherif Sakr is a senior research scientist in the Software Systems Group at National ICT Australia (NICTA), Sydney, Australia. He is also a conjoint senior lecturer in the School of Computer Science and Engineering at the University of New South Wales. He…

Analytic Single-Point Solution for Cluster Variation Method Variables (at x1=x2=0.5)

Single-Point Analytic Cluster Variation Method Solution: Solving a Set of Three Nonlinear, Coupled Equations. The Cluster Variation Method, first introduced by Kikuchi in 1951 (“A theory of cooperative phenomena,” Phys. Rev. 81(6), 988–1003), provides a means of computing the free energy of a system in which the entropy term accounts for the distribution of particles into local configurations as well as into “on/off” binary states. Because the equations are more complex, numerical solutions for the cluster variation variables are…
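The CVM equations themselves are not reproduced here, but the kind of numerical approach involved can be sketched with a Newton–Raphson iteration on a toy coupled nonlinear system (the example system is hypothetical, chosen for its known analytic root, and is not the actual CVM equation set):

```python
import math

def newton2(f, jac, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson iteration for a system of two coupled nonlinear equations.

    f(x, y)   -> (f1, f2): residuals of the two equations
    jac(x, y) -> 2x2 Jacobian of partial derivatives
    """
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        (a, b), (c, d) = jac(x, y)
        det = a * d - b * c
        # Solve J * (dx, dy) = -(f1, f2) by Cramer's rule.
        dx = (-d * f1 + b * f2) / det
        dy = (c * f1 - a * f2) / det
        x, y = x + dx, y + dy
        if abs(dx) + abs(dy) < tol:
            break
    return x, y

# Toy system: x^2 + y^2 = 1 and x = y, whose root is x = y = 1/sqrt(2).
f = lambda x, y: (x * x + y * y - 1.0, x - y)
jac = lambda x, y: ((2 * x, 2 * y), (1.0, -1.0))
root = newton2(f, jac, (1.0, 0.5))
```

The appeal of the analytic single-point solution in the post is precisely that, at x1 = x2 = 0.5, it removes the need for this sort of iterative solve.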
