Browsed by Tag: backpropagation

New! The YouTube Vid Series: Backpropagation and More

If you are branding yourself as an AI/neural networks/deep learning person, how well do you really know the backpropagation derivation? That is, could you work through that derivation, on your own, without having to find a tutorial on it? If not – you’re in good company. MOST people haven’t worked through that derivation – and for good reason. MOST people don’t remember their chain rule methods from first semester calculus. (It probably doesn’t help if I say that the backprop…
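As a quick refresher on the kind of chain-rule bookkeeping involved, here is the standard decomposition of the error gradient for a single output-layer weight. The notation below is a generic sketch (squared error, sigmoid output unit), not necessarily the notation used in the video series:

```latex
% Squared-error loss for one output unit: E = \tfrac{1}{2}(t - y)^2,
% with y = \sigma(z) and z = \sum_j w_j h_j over hidden activations h_j.
\frac{\partial E}{\partial w_j}
  = \frac{\partial E}{\partial y}\,\frac{\partial y}{\partial z}\,\frac{\partial z}{\partial w_j}
  = -(t - y)\,\sigma'(z)\,h_j,
\qquad \sigma'(z) = \sigma(z)\bigl(1 - \sigma(z)\bigr)
```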

Read More

Generative vs. Discriminative – Where It All Began

Working Through Salakhutdinov and Hinton’s “An Efficient Learning Procedure for Deep Boltzmann Machines”

We can accomplish a lot, using multiple layers trained with backpropagation. However (as we all know), there are limits to how many layers we can train at once, if we’re relying strictly on backpropagation (or any other gradient-descent learning rule). This is what stalled out the neural networks community from the mid-1990s to the mid-2000s. The breakthrough came from Hinton and his group, with a…
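To see why depth is a problem for plain gradient descent, here is a toy sketch (mine, not from the post) that pushes an error signal backward through a stack of sigmoid layers and prints how quickly its norm shrinks; the layer count, width, and weight scale are arbitrary choices:

```python
import numpy as np

# Toy illustration: backpropagate an error signal through many sigmoid
# layers and watch its norm decay (the sigmoid derivative is at most 0.25).
rng = np.random.default_rng(0)
n_layers, width = 10, 20

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: a_0 = x, a_l = sigmoid(W_l a_{l-1}).
x = rng.standard_normal(width)
weights, activations = [], [x]
for _ in range(n_layers):
    W = rng.standard_normal((width, width)) / np.sqrt(width)
    weights.append(W)
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: scale by sigmoid'(z) = a(1 - a), then push through W^T.
delta = np.ones(width)                         # dE/da at the top layer
for W, a in zip(reversed(weights), reversed(activations[1:])):
    delta = W.T @ (delta * a * (1.0 - a))      # dE/da one layer down
    print(f"|error signal| = {np.linalg.norm(delta):.3e}")
```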

Read More

Backpropagation: Not Dead, Not Yet

Backpropagation: Why It Still Matters

Thirty years ago, at the dawn of the neural networks era, backpropagation was all the rage. In the minds of most people, it was infinitely preferable to the simulated annealing algorithm that Hinton et al. had proposed for their Boltzmann machine. Now, it seems as though the see-saw of algorithm popularity has shifted; we’re focused on energy-based methods. We might be asking: is backpropagation old hat? Good question! Even more than that, someone coming…

Read More

Neural Networks and Python Code: Be Careful with the Array Indices!

Our Special Topics class on Deep Learning (Northwestern University, Master of Science in Predictive Analytics program, Winter 2017) starts off with very basic neural networks: the backpropagation learning method applied to the classic X-OR problem. I’m writing Python code to go with this class, and the result by the end of the quarter should be five to six solid pieces of code, involving either the backpropagation or the Boltzmann machine learning algorithm, with various network configurations. The following figure shows the dependence of…
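For anyone who wants a feel for the flavor of that code, here is a minimal sketch of backprop on X-OR with one hidden layer of sigmoid units. It is illustrative only (not the course code), and the layer sizes, learning rate, and epoch count are arbitrary choices of mine; the bias-row bookkeeping is exactly where the array indices demand care:

```python
import numpy as np

# Minimal sketch: one hidden layer of sigmoid units, squared-error loss,
# plain batch gradient descent on X-OR. Illustrative only, not the course code.
rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(A):
    # Append a column of ones so the bias rides along in the weight matrix.
    return np.hstack([A, np.ones((A.shape[0], 1))])

# Weight matrices include a bias row -- watch the indices!
W1 = rng.uniform(-1, 1, size=(3, 4))   # (2 inputs + bias) -> 4 hidden
W2 = rng.uniform(-1, 1, size=(5, 1))   # (4 hidden + bias) -> 1 output
eta = 0.5

for epoch in range(20000):
    # Forward pass
    H = sigmoid(add_bias(X) @ W1)                  # hidden activations, (4, 4)
    Y = sigmoid(add_bias(H) @ W2)                  # outputs, (4, 1)

    # Backward pass (deltas for squared error with sigmoid units)
    dY = (Y - T) * Y * (1 - Y)                     # output-layer delta
    dH = (dY @ W2[:-1].T) * H * (1 - H)            # hidden delta (drop the bias row of W2)

    # Gradient-descent weight updates
    W2 -= eta * add_bias(H).T @ dY
    W1 -= eta * add_bias(X).T @ dH

print(np.round(Y, 3))   # should approach [[0], [1], [1], [0]]
```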

Read More