Third Stage Boost: Statistical Mechanics and Neuromorphic Computing – Part 1

Next-Generation Neural Network Architectures: More Brain-Like

 

Three generations of artificial intelligence. The third generation is emerging … right about … now.

That’s what is shown in the figure below, presented on a log-time scale.

Figure: Brief history of AI, on a log-time scale.

The first generation of AI, symbolic AI, began conceptually around 1954 and lasted until 1986; 32 years. On the log-time scale shown in the figure above, this entire era takes place under the first curve: the black bell-shaped curve on the far left.

The first generation was all classic, symbolic AI. It used first-order predicate logic, and languages such as Prolog and LISP. In this approach, everything had a unique, symbolic (declarative) representation. The “thinking” in this paradigm was typically done via rules, such as those encoded in expert-system (procedural) representations. Some of the kick-off inventions of this time were Allen Newell and Herbert Simon’s invention of GPS, the General Problem Solver, in 1958, and John McCarthy’s invention of LISP in 1959.
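To make the flavor of that era concrete, here is a minimal sketch, in Python rather than LISP or Prolog, of the declarative-facts-plus-rules pattern that expert systems used. The facts and rules below are invented purely for illustration; real systems of the day had thousands of hand-crafted rules.

```python
# Minimal sketch of first-generation, symbolic AI:
# facts are declarative symbols, and the "thinking" is forward-chaining over rules.
# (Illustrative only; the facts and rules are hypothetical.)

facts = {"has_fever", "has_cough"}

# Each rule: (set of premise symbols, conclusion symbol)
rules = [
    ({"has_fever", "has_cough"}, "has_flu"),
    ({"has_flu"}, "should_rest"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose premises are all already-known facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
# {'has_fever', 'has_cough', 'has_flu', 'should_rest'}
```

The appeal, and the eventual brittleness, both come from the same place: every piece of knowledge has to be written down explicitly as a symbol or a rule.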

During this era, cognitive science evolved hand-in-hand with AI, and the notion of knowledge representation emerged. (See some knowledge representation readings.)

This approach has been quietly resurfacing; witness Google’s Knowledge Graph and many other knowledge-based AI systems.

However, as those of us who are old enough will remember, this era hit a stone wall in the mid-1980s. This was largely due to the rigidity of expert systems, and the fact that they were so difficult to maintain and upgrade.

 

 

Enter Connectionist (Neural Network) Computing

 

Neural networks emerged in 1986, with the publication of Parallel Distributed Processing, quickly followed by the first International Conference on Neural Networks in 1987.

As we all know, the realm of neural networks hit its own substantial lull from the mid-1990s through the mid-2000s, and re-emerged beginning around 2005, thanks to the tenacity of Geoff Hinton and his teams of graduate students. With what Hinton and others termed deep learning, the field of connectionist computing has taken on rip-roaring life over the past decade, bursting through many of the decades-old barriers in practical AI applications.

Figure: Winners and key players in the ImageNet competition over the last several years.

As an illustration, the figure above expands on the first figure, with a focus on connectionist image-understanding methods as developed over the past decade. We see that substantial breakthroughs are happening faster and faster; they show up approximately linearly on the collapsing log-time scale.

All in all, the current connectionist wave has also lasted about 32 years.

It’s worth noting that most of what we’re working with now is not radically different from the algorithms developed and introduced in the 1980s. We’ve mostly improved their performance, not changed the fundamental approach.
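As a concrete illustration of that point, here is a minimal sketch, in Python with NumPy, of a two-layer network trained by backpropagation on XOR; essentially the 1986-era recipe. The network size, learning rate, and task are arbitrary choices of mine; today’s deep networks mostly scale this same loop up with more layers, more data, and faster hardware.

```python
import numpy as np

# A two-layer perceptron trained with backpropagation -- the core
# 1986-era algorithm that deep learning still rests on.
# (Illustrative sketch only; sizes and learning rate are arbitrary.)

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # inputs
y = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)          # hidden layer
    out = sigmoid(h @ W2 + b2)        # output layer

    # Backward pass (chain rule, squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # approaches [[0], [1], [1], [0]]; some seeds need more steps
```

Swap in more layers, ReLUs, GPUs, and millions of images, and you have, in broad strokes, the deep learning systems of the past decade.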

In 2018, it will be thirty-two years since Parallel Distributed Processing was published in 1986.

Generative adversarial networks (GANs), which are the latest focus of attention, are more of a fusion or hybrid approach than they are something very new and different. We can see from both of these figures that they mark a culmination of advances, all within these last thirty-two years.
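To make the “fusion” point concrete, here is a deliberately tiny, hypothetical sketch in Python: a two-parameter generator and a logistic-regression discriminator, trained in alternation with ordinary gradient descent. Everything in it (sigmoid units, gradient descent, alternating updates) is standard, decades-old machinery; the adversarial pairing of the two pieces is the newer idea. Real GANs, of course, use deep networks on images; the 1-D Gaussian data here is just for illustration.

```python
import numpy as np

# A toy "GAN" on 1-D data: linear generator G(z) = c*z + e,
# logistic discriminator D(x) = sigmoid(a*x + b), updated in alternation.
# (Illustrative sketch only; convergence on toy problems can be finicky.)

rng = np.random.default_rng(1)

a, b = 0.1, 0.0      # discriminator parameters
c, e = 1.0, 0.0      # generator parameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, batch = 0.01, 64
for step in range(20000):
    x_real = rng.normal(4.0, 1.0, batch)     # real samples ~ N(4, 1)
    z = rng.normal(0.0, 1.0, batch)
    x_fake = c * z + e                       # generated samples

    s_r, s_f = sigmoid(a * x_real + b), sigmoid(a * x_fake + b)

    # Discriminator step: maximize log D(real) + log(1 - D(fake))
    a -= lr * np.mean((s_r - 1) * x_real + s_f * x_fake)
    b -= lr * np.mean((s_r - 1) + s_f)

    # Generator step: maximize log D(fake)  (non-saturating loss)
    s_f = sigmoid(a * x_fake + b)            # re-evaluate with updated D
    c -= lr * np.mean(-(1 - s_f) * a * z)
    e -= lr * np.mean(-(1 - s_f) * a)

print(f"Generated samples ~ N({e:.2f}, {abs(c):.2f})")  # should drift toward N(4, 1)
```

The adversarial pairing is clever and important, but each half of the loop is a familiar gradient-trained network.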

 

 

Something Else Will Happen Next

 

Figure: The Great Wave off Kanagawa, by Katsushika Hokusai (1829–1833). Color woodblock. Metropolitan Museum of Art (JP1847). Public domain image.

I believe that we’re on the crest of the deep learning wave.

And that means that we’re almost ready for something new.

The big question is: what will this something new look like?

Let me invite you to read the article by von Bubnoff on A brain built from atomic switches [that] can learn.

We’ll pick up with this topic again, next week.

We’ll look in more detail at the bread-crumb trail that can lead us to this realization.

We’ll also look at what it means for those of us who are transitioning to AI-based careers. What does it mean, in practical terms, if the fundamental nature of what we’re thinking of as AI computing shifts radically within the next decade?

More importantly, how do we prepare?

Right now, many of us feel as though we’re running to catch the moving train.

Let’s see if we can shift our analogy.

Let’s look at what it would take to think of ourselves as a rocket that is using multiple stages to join up with a target that is, itself, already moving.

We just need to figure out where it is likely to be by the time that we can get there.

 

 

Live free or die, my friend –

AJ Maren

Live free or die: Death is not the worst of evils.
Attr. to Gen. John Stark, American Revolutionary War

 
 

References – Neuromorphic Computing

 

  • von Bubnoff, A. A Brain Built From Atomic Switches Can Learn, Quanta (Sept. 20, 2017). Online article.
  • Gomes, L. Neuromorphic Chips Are Destined for Deep Learning—or Obscurity, IEEE Spectrum (May 29, 2017). Online article.
  • Li, W.Y., Ovchinnikov, I.V., Chen, H.L., Wang, Z., Lee, A., Lee, H.C., Cepeda, C., Schwartz, R.N., Meier, K., and Wang, K.L. A neuronal dynamics study on a neuromorphic chip, arXiv:1703.03560 (2017). pdf.

 
 

Some Useful Background Reading on Statistical Mechanics

 

  • Hermann, C. Statistical Physics – Including Applications to Condensed Matter (New York: Springer Science+Business Media), 2005. pdf. Very well-written; however, for someone who is NOT a physicist or physical chemist, the approach may be too obscure.
  • Maren, A.J. Statistical Thermodynamics: Basic Theory and Equations, THM TR2013-001(ajm) (Dec. 2013).
  • Salzman, R. Notes on Statistical Thermodynamics – Partition Functions, in Course Materials for Chemistry 480B – Physical Chemistry, 2004. Statistical Mechanics (online book chapter). This is one of the best online resources for statistical mechanics; I’ve found it to be very useful and lucid.
  • Tong, D. Chapter 1: Fundamentals of Statistical Mechanics, in Lectures on Statistical Physics (University of Cambridge Part II Mathematical Tripos), Preprint (2011). pdf.

 
 

