Alianna J. Maren, Ph.D.
Brain-Based Computing: Beyond Deep Learning

Statistical thermodynamics, or statistical mechanics, is a remarkably esoteric topic; it’s full of equations and abstract concepts. And yet, statistical thermodynamics is essential to the next generation of neural networks and machine learning, so we need to understand at least the rudiments.
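To give a minimal taste of the connection: energy-based neural networks such as the Boltzmann machine assign each network state a probability via the Boltzmann distribution from statistical mechanics, p_i = exp(-E_i / T) / Z, where Z is the partition function. A small sketch (the function name and example energies are my own illustration, not from any particular library):

```python
import math

def boltzmann_probabilities(energies, temperature=1.0):
    """Probability of each state under the Boltzmann distribution:
    p_i = exp(-E_i / T) / Z, where Z (the partition function)
    normalizes the probabilities so they sum to one."""
    weights = [math.exp(-e / temperature) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

# Lower-energy states are exponentially more probable.
probs = boltzmann_probabilities([0.0, 1.0, 2.0], temperature=1.0)
```

This one formula is the bridge: a network's learning rule can be derived from it, which is why the rudiments of statistical mechanics keep showing up in machine learning.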

How you can learn with the least amount of pain



Yes, of course I’m writing a book. (“We teach best what we most need to learn,” as Richard Bach put it.) But before the book is written, or articles published, there are all kinds of crucial elements that I’m publishing as fast-turnaround posts, right here on my blog.

My blog posts address crucial topics, and if you’re building your deep learning and/or machine learning understanding, they’ll give you valuable insights.

Interested in getting on top of this? Join the conversation; use the Opt-In form in the sidebar to the right.



Hi, welcome in!

I’m Alianna Maren, and my students call me Dr. A.J.

Did you know that deep learning (DL) is already hitting a wall?

Yes. It really is. While numerous blogs and articles tout deep learning breakthroughs, and people are enrolling in deep learning courses in droves (my deep learning classes at Northwestern University max out within days of enrollment opening), the scary, shocking, awful truth is: deep learning is old news.

The core algorithms (backpropagation, convolutional neural networks (CNNs), autoencoders, etc.) date from the 1980s and 1990s. There are major problems that deep learning, for all its stacking of these building blocks, cannot solve.
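To make the point concrete, the workhorse of deep learning, backpropagation, fits in a few lines even today. Here is a minimal sketch of gradient descent on a single sigmoid neuron with squared error (the function names, learning rate, and toy data are my own illustration):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_neuron(samples, epochs=2000, lr=1.0):
    """Train one sigmoid neuron by gradient descent: the 'backward pass'
    applies the chain rule to the squared error, exactly the 1980s recipe."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = sigmoid(w * x + b)             # forward pass
            grad = (y - target) * y * (1 - y)  # backward pass (chain rule)
            w -= lr * grad * x
            b -= lr * grad
    return w, b

# Learns a simple threshold: output near 0 for x=0, near 1 for x=1.
w, b = train_neuron([(0.0, 0.0), (1.0, 1.0)])
```

The decades-old simplicity is the point: stacking many such layers gives deep learning its power, but it does not, by itself, give a system a sense of where it is in space and time.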

The next generation of computer intelligence will move beyond these algorithms, and into more brain-like processes. And that, my friend, will involve some statistical thermodynamics, because it describes one of the most important processes going on in the brain.

As a serial inventor (with four patents and over $15M in government and venture-capital investment), I live in a “decade-ahead” kind of world. My inventions have tackled the toughest problems, ranging from sensor fusion (for the U.S. Navy) to knowledge discovery (post-9/11, in support of various intelligence initiatives).

Now, though, I’ve focused on what I believe will be the next multi-decade challenge in artificial intelligence: creating brain-based machine learning methods that will help systems become much savvier about where they are in space and time. As part of the solution, I believe that the CORTECON II will help us move into a new AI era. Learn more about this temporal-persistence method.

For more on my background, please go to my biography.


The Latest: Opportunities to Study with Me

From time to time, I teach artificial intelligence (AI) and deep learning (DL) through Northwestern University. You can see my course schedule through a link on Classes and Courses.

If you can’t take the courses through Northwestern, there are useful options. (Reveal date: October 25, 2017.)


The Latest: Inventions

To meet the challenge of temporal persistence in neural networks, I’ve been developing the CORTECON II.