Author: AJMaren

Book Chapter: Draft Chapter 7 – The Boltzmann Machine

Chapter 7: Energy-Based Neural Networks. This is the full chapter draft from the book-in-progress, Statistical Mechanics, Neural Networks, and Artificial Intelligence. This chapter draft covers not only the Hopfield neural network (released as an excerpt last week) but also the Boltzmann machine, in both its general and restricted forms. It develops the form-equals-function connection, based on the energy equation. (However, we postpone the full-fledged learning method to a later chapter.) Get the pdf using the pdf link in the citation…
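The energy equation that ties form to function is easiest to see in the Hopfield case. The sketch below is my own minimal illustration (not code from the chapter): it stores one bipolar pattern with a Hebbian weight matrix and shows that asynchronous updates walk downhill in energy toward the stored pattern.

```python
import numpy as np

def hopfield_weights(patterns):
    """Hebbian weights for a Hopfield network; zero self-connections."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, s):
    """Hopfield energy E = -1/2 s^T W s (thresholds omitted for simplicity)."""
    return -0.5 * s @ w @ s

def recall(w, s, n_sweeps=5):
    """Asynchronous sign updates; each accepted flip never increases the energy."""
    s = s.copy()
    for _ in range(n_sweeps):
        for i in np.random.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
w = hopfield_weights(pattern[None, :])
noisy = pattern.copy()
noisy[0] *= -1                                   # corrupt one unit
print(energy(w, noisy) > energy(w, pattern))     # stored pattern lies lower: True
print(np.array_equal(recall(w, noisy), pattern)) # corrupted input is repaired: True
```

The energy function, not the update rule, is what carries the network's function here: the stored pattern sits at a local energy minimum, and the dynamics simply roll downhill into it.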

Read More…

Book Excerpt: Chapter 7

Chapter 7: Energy-Based Neural Networks. This is the first time that I’m sharing an excerpt from the book-in-progress, Statistical Mechanics, Neural Networks, and Artificial Intelligence. This excerpt covers the Hopfield neural network only; I’m still revising / editing / adding to the remaining sections on the (general and restricted) Boltzmann machine. Get the pdf using the pdf link in the citation below: Maren, A.J. (In progress). Chapter 7: Introduction to Energy-Based Neural Networks: The Hopfield Network and the (Restricted) Boltzmann Machine…

Read More…

Book Progress

The Uber-Important Chapter 7 – Introduction to Energy-Based Neural Networks: I tell students that it’s like being on the Oregon Trail. All of the stochastic gradient-descent networks (up to and including Convolutional Neural Networks, or ConvNets, and Long Short-Term Memory networks, or LSTM networks) can be understood using backpropagation. This requires only a first-semester calculus background. Sure, grunting through the chain rule (many, many times) gets tedious. But it’s doable. In contrast, the energy-based networks are the heart and…

Read More…

The Yin and Yang of Learning Deep Learning

Sometimes Leaning Into It Is Not Enough: You folks tend to be hyper-focused, hugely on-your-game types. (My TA, and one of my favorite people, described himself as “alpha-squared.” So true for a lot of you!) So given your alpha-ness (or your alpha-squared-ness), your dominant approach to mastering a new topic is to work like crazy. Read a whole lot of stuff, from original papers down to tech blogs and forums. You install code environments, teach yourselves all the latest and…

Read More…

Start Here: Statistical Mechanics for Neural Networks and AI

Your Pathway through the Blog-Maze: What to read, and what order to read things in, if you’re trying to teach yourself the rudiments of statistical mechanics – just enough to get a sense of what’s going on in the REAL deep learning papers. As we all know, there are two basic realms of deep learning neural networks. There’s the kind that only requires (some, limited) knowledge of backpropagation. That’s first-semester undergraduate calculus, and almost everyone coming into this field can…

Read More…

New Experimental Results for the 2D CVM (Pretty!)

2D CVM Free Energy Minimization Applied to a Large Scale-Free-Like Pattern: New experimental results show that a scale-free-like pattern undergoing 2D CVM-type free energy minimization exhibits substantial topology changes; this is the first time that free energy has been used to influence a 2D topography. While specific topographies would differ, depending on both the starting pattern and the (random) selection of flipped nodes (for free energy reduction), the resulting topographies will likely have similar patterns for given…
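The flipped-nodes idea can be illustrated with a deliberately simplified stand-in: a greedy single-node-flip pass over a small binary grid, using an Ising-style neighbor-mismatch count in place of the actual 2D CVM free energy. The grid size, energy function, and acceptance rule here are all my assumptions for illustration, not the experiment's.

```python
import random

def neighbors(i, j, n):
    """4-neighborhood with wraparound on an n x n grid."""
    return [((i + 1) % n, j), ((i - 1) % n, j), (i, (j + 1) % n), (i, (j - 1) % n)]

def local_energy(grid, i, j):
    """Number of neighbors disagreeing with node (i, j): an Ising-style mismatch count."""
    n = len(grid)
    return sum(1 for a, b in neighbors(i, j, n) if grid[a][b] != grid[i][j])

def greedy_flips(grid, sweeps=20, seed=0):
    """Flip randomly chosen nodes, keeping a flip only if it does not raise the local energy."""
    rng = random.Random(seed)
    n = len(grid)
    for _ in range(sweeps * n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        before = local_energy(grid, i, j)
        grid[i][j] = 1 - grid[i][j]
        if local_energy(grid, i, j) > before:  # flip raised the energy: undo it
            grid[i][j] = 1 - grid[i][j]
    return grid

rng = random.Random(1)
grid = [[rng.randrange(2) for _ in range(16)] for _ in range(16)]
total_before = sum(local_energy(grid, i, j) for i in range(16) for j in range(16))
greedy_flips(grid)
total_after = sum(local_energy(grid, i, j) for i in range(16) for j in range(16))
print(total_after <= total_before)  # the greedy rule never raises the total mismatch count
```

Even this crude rule reshapes the topography: random speckle coalesces into like-valued domains, which is the qualitative effect the post describes for the (much richer) CVM free energy.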

Read More…

Entropy Trumps All (First Computational for the 2-D CVM)

Computational vs. Analytic Results for the 2-D Cluster Variation Method: Three lessons learned from the first computational results for the 2-D Cluster Variation Method, or CVM. The comparisons between analytic predictions and the actual computational results tell us three things: (1) the analytic results are a suggestion, not a prediction of actual values, and the further we go from zero values for the two enthalpy parameters, the more the two diverge; (2) topography is important (VERY important); and (3) entropy rules the…

Read More…

Filling Out the Phase Space Boundaries – 2-D CVM

Configuration Variables Along the Phase Space Boundaries for a 2-D CVM: Last week’s blog showed how we could get x1 for a specific value of epsilon0 by taking the derivative of the free energy and setting it equal to zero. (This works for the special case where epsilon1 is zero, meaning that there is no interaction enthalpy.) There, we looked at one case, where epsilon0 = 1.0. This week, we take a range of epsilon0 values and find…
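The derivative-set-to-zero step can be sketched numerically. The free energy below is a deliberately simplified two-state form, F(x1) = epsilon0*x1 + x1 ln x1 + (1 - x1) ln(1 - x1), chosen only for illustration; it is not necessarily the post's exact 2-D CVM expression, and the function names are mine.

```python
import math

def dF_dx(x, eps0):
    """Derivative of the illustrative F(x) = eps0*x + x*ln(x) + (1-x)*ln(1-x)."""
    return eps0 + math.log(x / (1.0 - x))

def x1_at_minimum(eps0):
    """Bisect dF/dx = 0 on (0, 1); this F is convex, so the root is the minimum."""
    lo, hi = 1e-12, 1.0 - 1e-12
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if dF_dx(lo, eps0) * dF_dx(mid, eps0) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# For this simplified F, setting the derivative to zero gives x1 = 1 / (1 + e^eps0),
# so the bisection result can be checked against the closed form.
for eps0 in (0.0, 0.5, 1.0):
    assert abs(x1_at_minimum(eps0) - 1.0 / (1.0 + math.exp(eps0))) < 1e-9
```

Sweeping eps0 over a range and collecting x1_at_minimum(eps0) is exactly the "range of epsilon0 values" exercise the post describes, just with a stand-in free energy.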

Read More…

Obvious, But Useful (Getting the Epsilon-0 Value when the Interaction Enthalpy Is Zero)

This Really Is Kind of Obvious, But … There’s something very interesting that we can do to obtain values for the epsilon0 parameter. Let’s stay with the case where there is no interaction enthalpy. In that case, we want to find the epsilon0 value that corresponds to the x1 value at a given free energy minimum. Or conversely, given an epsilon0 value, can we identify the x1 where the free energy minimum occurs? Turns out that, for this…
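With a simplified two-state free energy, F(x1) = epsilon0*x1 + x1 ln x1 + (1 - x1) ln(1 - x1) (my illustrative stand-in, not necessarily the post's exact CVM expression), setting the derivative to zero gives both directions of the mapping in closed form:

```python
import math

def eps0_from_x1(x1):
    """Enthalpy parameter that puts the free energy minimum at a given x1.
    From eps0 + ln(x1/(1-x1)) = 0, i.e. eps0 = ln((1-x1)/x1)."""
    return math.log((1.0 - x1) / x1)

def x1_from_eps0(eps0):
    """Equilibrium x1 for a given eps0: the inverse map, x1 = 1/(1 + e^eps0)."""
    return 1.0 / (1.0 + math.exp(eps0))

# The two maps are inverses, so a round trip returns the starting value.
x1 = 0.3
print(round(x1_from_eps0(eps0_from_x1(x1)), 10))  # → 0.3
```

The "kind of obvious" part is that the no-interaction case collapses to a one-line logistic relation; once epsilon1 is nonzero, no such closed form is available and numerical minimization takes over.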

Read More…

An Interesting Little Thing about the CVM Entropy (with Code)

The 2-D CVM Entropy and Free Energy Minima when the Interaction Enthalpy Is Zero:   Today, we transition from deriving the equations for the Cluster Variation Method (CVM) entropies (both 1-D and 2-D) to looking at how these entropies fit into the overall context of a free energy equation. Let’s start with entropy. The truly important thing about entropy is that it gives shape and order to the universe. Now, this may seem odd to those of us who’ve grown…
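The shape-giving role of entropy is easy to see in the simplest two-state (Bernoulli) case. The real 2-D CVM entropy adds pair and cluster terms, so the sketch below captures only the zero-enthalpy intuition, not the chapter's actual equation:

```python
import math

def entropy(x):
    """Two-state (Bernoulli) entropy: -[x ln x + (1-x) ln(1-x)].
    The 2-D CVM entropy adds pair/cluster terms; this is the simplest case."""
    return -(x * math.log(x) + (1.0 - x) * math.log(1.0 - x))

# With both enthalpy parameters at zero, the (reduced) free energy is just -S,
# so the free energy minimum sits wherever the entropy peaks.
xs = [i / 1000 for i in range(1, 1000)]
x_star = max(xs, key=entropy)
print(x_star)  # → 0.5
```

That peak at x1 = 0.5 (equal populations of the two states) is the "shape and order" point in miniature: with no enthalpy to push against, entropy alone pins the equilibrium configuration.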

Read More…