Category: Machine Learning

Interpreting Karl Friston (Round Deux)


He might win a Nobel Prize some day. But no one can understand him. You don’t believe me? Have a quick glance at Scott Alexander’s article, “God Help Us, Let’s Try To Understand Friston On Free Energy”. We’re referring, of course, to Karl Friston. I’ve spent the past three-and-a-half years studying Friston’s approach to free energy, which he treats as the guiding principle in the brain. He has extended the classic variational Bayes treatment (frontier material in machine learning)…
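As a concrete anchor for the teaser above, here is a minimal sketch of the core identity in the classic variational Bayes treatment. The numbers are my own toy choices, not anything from Friston: the variational free energy F(q) = E_q[-log p(x,z)] - H(q) is an upper bound on surprise, -log p(x), and the bound is tight exactly when q equals the true posterior p(z|x).

```python
import numpy as np

# Toy discrete model: one latent z in {0, 1}, one fixed observation x.
# The prior and likelihood values below are invented for illustration.
p_z = np.array([0.5, 0.5])            # prior p(z)
p_x_given_z = np.array([0.8, 0.2])    # p(x | z=0), p(x | z=1)
p_xz = p_z * p_x_given_z              # joint p(x, z)
log_evidence = np.log(p_xz.sum())     # log p(x); surprise = -log p(x)

def variational_free_energy(q):
    """F(q) = E_q[-log p(x,z)] - H(q), an upper bound on -log p(x)."""
    q = np.asarray(q, dtype=float)
    energy = -np.sum(q * np.log(p_xz))   # expected "energy" under q
    ent = -np.sum(q * np.log(q))         # entropy of q
    return energy - ent

# The bound is tight exactly when q is the true posterior p(z|x):
posterior = p_xz / p_xz.sum()
F_at_posterior = variational_free_energy(posterior)   # equals -log p(x)
F_elsewhere = variational_free_energy([0.7, 0.3])     # strictly larger
```

Friston’s free energy principle builds on exactly this bound: minimizing F over q simultaneously infers the hidden cause and bounds the surprise of the observation.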

Read More

Start Here: Statistical Mechanics for Neural Networks and AI


Your Pathway through the Blog-Maze: What to read, and in what order, if you’re trying to teach yourself the rudiments of statistical mechanics – just enough to get a sense of what’s going on in the REAL deep learning papers. As we all know, there are two basic realms of deep learning neural networks. There’s the kind that only requires (some, limited) knowledge of backpropagation. That’s first-semester undergraduate calculus, and almost everyone coming into this field can…

Read More

Generative vs. Discriminative – Where It All Began


Working Through Salakhutdinov and Hinton’s “An Efficient Learning Procedure for Deep Boltzmann Machines”   We can accomplish a lot using multiple layers trained with backpropagation. However (as we all know), there are limits to how many layers we can train at once if we’re relying strictly on backpropagation (or any other gradient-descent learning rule). This is what stalled the neural networks community from the mid-1990s to the mid-2000s. The breakthrough came from Hinton and his group, with a…

Read More

We’ve Been Really and Truly **cked (Insert a consonant and vowel of your choice)


High-Precision Mind-**cking   You already know the main storyline: Cambridge Analytica, Breitbart, Facebook, and possibly other players. Trump’s win of the electoral vote by about 40,000 votes, through careful targeting of not only certain swing states, but micro-elements within those states. The questions now (for those of us techie folks) are: (1) Technically, just how did this happen? (We want more than the few words in the mainstream news.) And (2) (that which really interests us): What are the countermeasures? One…

Read More

What We Really Need to Know about Entropy


There’s This Funny Little “Gotcha” Secret about Entropy: Nobody mentions this secret. (At least in polite society.) But here’s the thing – entropy shows up in all sorts of information theory and machine learning algorithms. And it shows up ALONE, as though it sprang – pure and holy – from the head of the famed Ludwig Boltzmann. What’s wrong with this is that entropy never lives alone, in isolation. In the real world, entropy exists – always – hand-in-hand with…
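A small illustration of the point, using standard information-theory identities: in practice, entropy shows up paired with a cross-entropy (or energy) term, as in KL(p‖q) = H(p, q) - H(p). The distributions below are made up for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p log p, in nats."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p, where=p > 0, out=np.zeros_like(p)))

def cross_entropy(p, q):
    """H(p, q) = -sum p log q: the 'partner' term entropy travels with."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q, where=q > 0, out=np.zeros_like(q)))

p = np.array([0.5, 0.25, 0.25])      # "true" distribution (toy numbers)
q = np.array([1/3, 1/3, 1/3])        # model distribution (uniform)

# Entropy never appears alone here: the KL divergence is the
# cross-entropy with the entropy subtracted back out.
kl = cross_entropy(p, q) - entropy(p)
```

The same pairing appears in statistical mechanics, where entropy is always weighed against an energy (enthalpy) term inside the free energy, rather than standing on its own.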

Read More

Wrapping Our Heads Around Entropy


Entropy – the Most Powerful Force in the ‘Verse:   Actually, that’s not quite true. The most powerful force in the ‘verse is free energy minimization. However, entropy is half of the free energy equation, and it’s usually the more complex half. So, if we understand entropy, then we can understand free energy minimization. If we understand free energy minimization, then we understand all the energy-based machine learning models, including the (restricted) Boltzmann machine and one of its most commonly used…
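Here is a minimal sketch of that “entropy is half of the free energy equation” claim, for a hypothetical two-state system: F = U - T·S, and minimizing F over the occupation probability trades energy against entropy, recovering the Boltzmann distribution. The energy gap and temperature are arbitrary toy values.

```python
import numpy as np

def free_energy(p, eps=1.0, T=1.0):
    """F(p) = U - T*S for a two-state system.

    p   : probability of the higher-energy state B (state A has energy 0)
    eps : energy of state B (toy value)
    T   : temperature (toy value)
    """
    U = p * eps                                        # average energy
    S = -(p * np.log(p) + (1 - p) * np.log(1 - p))     # mixing entropy
    return U - T * S

# Minimize F numerically over a fine grid of occupation probabilities.
ps = np.linspace(1e-6, 1 - 1e-6, 100001)
p_star = ps[np.argmin(free_energy(ps))]

# Analytically, the minimizer is the Boltzmann result
# p* = 1 / (1 + exp(eps / T)); the grid search should land right on it.
p_boltzmann = 1.0 / (1.0 + np.exp(1.0))
```

Energy pulls p toward 0 (the cheap state), entropy pulls it toward 1/2 (maximum disorder); the free energy minimum is the compromise, which is exactly the balance that energy-based machine learning models exploit.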

Read More

Artificial General Intelligence: Getting There from Here


What We Need to Create Artificial General Intelligence (AGI):   A brief recap: We know that we want to have neural networks (including deep learning) do something besides being sausage factories. We know that the key missing step – a first-principles step – to making this happen is to give the network something to do when it is not responding to inputs. Also, we’ve introduced something that the neural network CAN do: it can do free energy minimization with…

Read More

The Big, Bad, Scary Free Energy Equation (and New Experimental Results)


The 2-D Cluster Variation Method Free Energy Equation – in All Its Scary Glory:   You know, my dear, that we’ve been leading up to this moment for a while now. I’ve hinted. I’ve teased and been coy. But now, it’s time to be full frontal. We’re going to look at a new form of a free energy equation; a cluster variation method (CVM) equation. It deals not only with how many units are in state A or state B,…
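The actual 2-D CVM free energy equation is not reproduced in this excerpt, so the sketch below only illustrates its flavor: in a pair-approximation (Bethe-style) entropy, what matters is not just how many units are in state A or state B, but the probabilities of local pair configurations. The specific numbers are invented for illustration, and this 1-D pair approximation is much simpler than the 2-D CVM equation the post discusses.

```python
import numpy as np

def site_entropy(x):
    """Single-site entropy for fraction x of units in state A."""
    p = np.array([x, 1.0 - x])
    return -np.sum(p * np.log(p))

def pair_entropy(y):
    """Entropy of nearest-neighbor pair configurations.

    y = [y_AA, y_AB, y_BA, y_BB], summing to 1.
    """
    y = np.asarray(y, dtype=float)
    return -np.sum(y * np.log(y, where=y > 0, out=np.zeros_like(y)))

# Pair-approximation entropy per site on a 1-D chain: S = S_pair - S_site.
# When pairs factorize (y_ij = x_i * x_j), this collapses back to the
# plain single-site entropy; correlated pairs carry extra information.
x = 0.5
y_independent = np.array([0.25, 0.25, 0.25, 0.25])
S_indep = pair_entropy(y_independent) - site_entropy(x)   # = ln 2

y_correlated = np.array([0.40, 0.10, 0.10, 0.40])         # like-with-like
S_corr = pair_entropy(y_correlated) - site_entropy(x)     # lower entropy
```

Both configurations have exactly half the units in each state, yet the correlated one has lower entropy: that dependence on local patterns, not just state counts, is what a cluster variation method free energy captures.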

Read More

A “Hidden Layer” Guiding Principle – What We Minimally Need


Putting It Into Practice: If we’re going to move our neural network-type architectures into a new, more powerful realm of AI capability, we need to bust out of the “sausage-making” mentality that has governed them thus far, as we discussed last week. To do this, we need to give our hidden layer(s) something to do besides respond to input stimuli. A very natural candidate for this “something” is free energy minimization, because that’s one of the strongest principles in the…

Read More

Statistical Mechanics, the Future of AI, and Personal Stories


Statistical Mechanics and Personal Stories (On the Same Page!)   Yikes! It’s Thursday morning already. I haven’t written to you for three weeks. That’s long enough that I have to pause and search my memory for my username to get into the website. Thanksgiving was lovely. The Thursday after that was grading, all day – and for several days before and after. By now, I (and most of you) have had a few days of recovery, from what has been…

Read More