Category: Inference

Interpreting Karl Friston (Round Deux)


He may win a Nobel Prize some day, but no one can understand him. You don’t believe me? Have a quick glance at Scott Alexander’s article, “God Help Us, Let’s Try To Understand Friston On Free Energy”. We’re referring, of course, to Karl Friston. I’ve spent the past three-and-a-half years studying Friston’s approach to free energy, which he treats as the guiding principle in the brain. He has extended the classic variational Bayes treatment (frontier material in machine learning)…

Read More

Wrapping Our Heads Around Entropy


Entropy – the Most Powerful Force in the ‘Verse: Actually, that’s not quite true. The most powerful force in the ‘verse is free energy minimization. However, entropy is half of the free energy equation, and it’s usually the more complex half. So, if we understand entropy, we can understand free energy minimization; and if we understand free energy minimization, we understand all the energy-based machine learning models, including the (restricted) Boltzmann machine and one of its most commonly used…
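The split that this excerpt describes can be written out explicitly. A standard form (this is the variational free energy used in machine learning, with notation of my own choosing; the thermodynamic Helmholtz free energy is the analogous expression) is:

```latex
% Variational free energy: expected energy minus entropy.
% q(x) is the approximating distribution, E(x) the energy of state x.
F[q] = \underbrace{\mathbb{E}_{q}\!\left[E(x)\right]}_{\text{expected energy}}
       \;-\; \underbrace{\Bigl(-\sum_{x} q(x)\,\ln q(x)\Bigr)}_{\text{entropy } S[q]}
```

The second term is the entropy half referred to above: minimizing $F[q]$ trades low expected energy against high entropy of $q$.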

Read More

Seven Essential Machine Learning Equations: A Cribsheet (Really, the Précis)


Making Machine Learning As Simple As Possible: Albert Einstein is credited with saying, “Everything should be made as simple as possible, but not simpler.” Machine learning is not simple. In fact, once you get beyond the simple “building blocks” approach of stacking things higher and deeper (sometimes made all too easy by advanced deep learning packages), you are in the midst of some complex stuff. However, it does not need to be any more complex than necessary.  …

Read More

A Tale of Two Probabilities


Probabilities, Statistical Mechanics and Bayesian: Machine learning fuses several different lines of thought, including statistical mechanics, Bayesian probability theory, and neural networks. There are two ways of thinking about probability in machine learning: one comes from statistical mechanics, and the other from Bayesian logic. Both are important, and they are very different. Although these two ways of thinking about probability are usually kept quite separate, they come together in some of the more advanced machine learning topics, such…
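A minimal sketch of the two views side by side (the function names and toy numbers are my own, not from the post): the statistical-mechanics view assigns probability to a *state* by its energy, while the Bayesian view assigns probability to a *hypothesis* by updating a prior with observed evidence.

```python
import math

# Statistical-mechanics view: probability of a state falls off
# exponentially with its energy (Boltzmann distribution).
def boltzmann(energies, T=1.0):
    weights = [math.exp(-e / T) for e in energies]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

# Bayesian view: probability of a hypothesis is updated from a
# prior using the likelihood of the observed data (Bayes' rule).
def posterior(priors, likelihoods):
    joint = [p * lk for p, lk in zip(priors, likelihoods)]
    evidence = sum(joint)                 # marginal likelihood
    return [j / evidence for j in joint]

p_states = boltzmann([0.0, 1.0, 2.0])       # low energy -> high probability
p_hyps = posterior([0.5, 0.5], [0.8, 0.2])  # data favours hypothesis 0
```

Both constructions produce a normalized distribution, but from very different ingredients: an energy function on one side, a prior and a likelihood on the other.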

Read More

The Statistical Mechanics Underpinnings of Machine Learning


Machine Learning Is Different Now: Actually, machine learning is a continuation of what it has always been: a field deeply rooted in statistical physics (statistical mechanics). It’s just that those insights have now culminated in a very substantive body of work, with more theoretical rigor behind it than most of us realize. A Lesson from Mom: It takes a lot of time to learn a new discipline. This is something that I learned from my…

Read More

Seven Statistical Mechanics / Bayesian Equations That You Need to Know


Essential Statistical Mechanics for Deep Learning: If you’re self-studying machine learning and feel that statistical mechanics is suddenly showing up more than it used to, you’re not alone. Within the past couple of years, statistical mechanics (statistical thermodynamics) has become a more integral topic, along with the Kullback-Leibler divergence measure and several inference methods for machine learning, including the expectation maximization (EM) algorithm and variational Bayes. Statistical mechanics has always played a strong role in machine…
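The Kullback-Leibler divergence mentioned in this excerpt is straightforward to compute for discrete distributions; here is a minimal sketch (the example distributions are my own illustration, not from the post):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) for discrete
    distributions given as lists of probabilities. Terms with
    p_i = 0 contribute nothing, by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # uniform distribution
q = [0.9, 0.1]   # skewed distribution
d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)
```

Note that the divergence is not symmetric (`d_pq != d_qp` here), which is exactly why the direction of the approximation matters in variational Bayes.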

Read More