Browsed by Tag: Hopfield neural network

Future Directions in AI: Fundamentals (Part 1) – New YouTube Vid

Are you an AI expert, or are you planning to be? There are three fundamental challenges that will underlie the major AI evolutions over the next decade. These are the three areas where you NEED to understand the fundamentals – before AI moves so fast that you’ll never catch up. Let them guide your deep study for the year ahead. Check them out in this new YouTube post: Live free or die, my friend – AJ Maren…

Read More

Book Chapter: Draft Chapter 7 – The Boltzmann Machine

Chapter 7: Energy-Based Neural Networks This is the full chapter draft from the book-in-progress, Statistical Mechanics, Neural Networks, and Artificial Intelligence. This chapter draft covers not only the Hopfield neural network (released as an excerpt last week) but also the Boltzmann machine, in both its general and restricted forms. It develops the form-equals-function connection, which rests on the energy equation. (However, we postpone the full-fledged learning method to a later chapter.) Get the pdf using the pdf link in the citation…
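As a quick orientation for archive readers: the energy equations behind that form-equals-function connection take, in generic notation (a sketch of the standard formulations, not necessarily the book's exact conventions), the following forms:

```latex
% Hopfield network: symmetric weights w_{ij}, bipolar states s_i, biases \theta_i
E(\mathbf{s}) = -\tfrac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j \;-\; \sum_i \theta_i s_i

% Restricted Boltzmann machine: visible units v_i, hidden units h_j, biases a_i, b_j
E(\mathbf{v}, \mathbf{h}) = -\sum_i a_i v_i \;-\; \sum_j b_j h_j \;-\; \sum_{i,j} v_i\, w_{ij}\, h_j
```

Low-energy states are the network's stored or preferred configurations – exactly the form-function link the chapter develops.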

Read More

Book Excerpt: Chapter 7

Chapter 7: Energy-Based Neural Networks This is the first time that I’m sharing an excerpt from the book-in-progress, Statistical Mechanics, Neural Networks, and Artificial Intelligence. This excerpt covers the Hopfield neural network only; I’m still revising, editing, and adding to the remaining sections on the (general and restricted) Boltzmann machine. Get the pdf using the pdf link in the citation below: Maren, A.J. (In progress). Chapter 7: Introduction to Energy-Based Neural Networks: The Hopfield Network and the (Restricted) Boltzmann Machine…
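If you'd like to experiment with the Hopfield network while reading the excerpt, here is a minimal NumPy sketch assuming bipolar units and the classic Hebbian storage rule; the function names are mine, and this is illustrative only, not the chapter's code:

```python
import numpy as np

def hebbian_weights(patterns):
    """Store bipolar (+1/-1) patterns via the Hebbian outer-product rule."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W / len(patterns)

def recall(W, state, steps=200, rng=None):
    """Asynchronous updates; each flip can only lower the network energy."""
    rng = rng or np.random.default_rng(0)
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))           # pick one unit at random
        s[i] = 1 if W[i] @ s >= 0 else -1  # align it with its local field
    return s

# Usage: store one pattern, then recover it from a corrupted probe.
p = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = hebbian_weights(p[None, :])
probe = p.copy(); probe[:2] *= -1            # flip two bits
print(np.array_equal(recall(W, probe), p))   # True for light corruption
```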

Read More

Start Here: Statistical Mechanics for Neural Networks and AI

Your Pathway through the Blog-Maze: What to read, and in what order, if you’re trying to teach yourself the rudiments of statistical mechanics – just enough to get a sense of what’s going on in the REAL deep learning papers. As we all know, there are two basic realms of deep learning neural networks. There’s the kind that only requires (some, limited) knowledge of backpropagation. That’s first-semester undergraduate calculus, and almost everyone coming into this field can…
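To make the "first-semester calculus" point concrete: the heart of backpropagation is the chain rule feeding a gradient-descent weight update (generic notation; a sketch, not any particular paper's formulation):

```latex
\frac{\partial L}{\partial w}
  = \frac{\partial L}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial w},
\qquad
w \leftarrow w - \eta \, \frac{\partial L}{\partial w}
```

Everything else in backprop is bookkeeping: applying that chain rule layer by layer, from output back to input.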

Read More

Generative vs. Discriminative – Where It All Began

Working Through Salakhutdinov and Hinton’s “An Efficient Learning Procedure for Deep Boltzmann Machines” We can accomplish a lot using multiple layers trained with backpropagation. However (as we all know), there are limits to how many layers we can train at once if we’re relying strictly on backpropagation (or any other gradient-descent learning rule). This is what stalled out the neural networks community from the mid-1990s to the mid-2000s. The breakthrough came from Hinton and his group, with a…
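That breakthrough rested on training one layer at a time as a restricted Boltzmann machine rather than pushing gradients through the whole stack. Below is a minimal sketch of the standard contrastive-divergence (CD-1) update for a binary RBM; this is the commonly taught approximation, not the more elaborate procedure in the Salakhutdinov-Hinton deep Boltzmann machine paper itself, and the function name is mine:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, a, b, v0, lr=0.05):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    W: (n_visible, n_hidden) float weights; a, b: visible/hidden biases;
    v0: a batch of binary visible vectors, shape (batch, n_visible).
    """
    ph0 = sigmoid(v0 @ W + b)                  # hidden probs, data phase
    h0 = (rng.random(ph0.shape) < ph0) * 1.0   # sample hidden states
    v1 = sigmoid(h0 @ W.T + a)                 # one-step reconstruction
    ph1 = sigmoid(v1 @ W + b)                  # hidden probs, model phase
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return W, a, b

# Usage (tiny random batch; shapes only, not a meaningful dataset):
W = rng.normal(0, 0.1, (6, 4)); a = np.zeros(6); b = np.zeros(4)
v0 = (rng.random((8, 6)) < 0.5) * 1.0
W, a, b = cd1_step(W, a, b, v0)
```

Stacking layers pre-trained this way, then fine-tuning, is what let Hinton's group go deep where pure backpropagation had stalled.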

Read More

A “Hidden Layer” Guiding Principle – What We Minimally Need

Putting It Into Practice: If we’re going to move our neural network-type architectures into a new, more powerful realm of AI capability, we need to bust out of the “sausage-making” mentality that has governed them thus far, as we discussed last week. To do this, we need to give our hidden layer(s) something to do besides respond to input stimuli. A natural candidate for this “something” is free energy minimization, because that’s one of the strongest principles in the…
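For reference, the free energy being minimized here is the statistical-mechanics quantity (a generic sketch, not this post's own derivation):

```latex
F[q] \;=\; \underbrace{\langle E \rangle_q}_{\text{average energy}}
      \;-\; T\,\underbrace{\Big(-\sum_x q(x)\ln q(x)\Big)}_{\text{entropy } S[q]}
```

Minimizing F over the distribution q trades off low energy against high entropy, and the minimizer is the Boltzmann distribution q(x) ∝ e^{-E(x)/T} – which is why this principle fits energy-based networks so naturally.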

Read More