Calculus on Computational Graphs: Backpropagation -- colah's blog
Posted on August 31, 2015
Introduction
Backpropagation is the key algorithm that makes training deep models computationally tractable. For modern neural networks, it can make training with gradient descent as much as ten million times faster, relative to a naive implementation. That’s the difference between a model taking a week to train and taking 200,000 years.
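The speedup comes from reverse-mode differentiation: one backward sweep through the computational graph yields the derivative of the output with respect to every input at once, instead of one forward pass per input. Here is a minimal sketch of that idea; the names (`Node`, `backprop`, the `add`/`mul` helpers) are illustrative, not from the post:

```python
# A minimal sketch of reverse-mode differentiation on a computational graph.
# Each Node records its parents and the local derivative along each edge.

class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_node, local_derivative)
        self.grad = 0.0

def add(a, b):
    # d(a+b)/da = 1, d(a+b)/db = 1
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    # d(a*b)/da = b, d(a*b)/db = a
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def backprop(output):
    # Topologically order the graph so each node's gradient is fully
    # accumulated before it is propagated on to its parents.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)

    output.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += node.grad * local

# Example graph: e = (a + b) * b, with a = 2, b = 3
a, b = Node(2.0), Node(3.0)
e = mul(add(a, b), b)
backprop(e)
print(a.grad, b.grad)  # de/da = b = 3, de/db = a + 2b = 8
```

Note that `b` feeds into two operations, so its gradient is the sum of contributions along both paths; that path-summing is exactly what a single backward sweep performs, and it is why the cost stays proportional to the size of the graph rather than to the number of inputs.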
Beyond its use in deep learning, backpropagation is a powerful computational tool in many other areas, ranging from weather forecasting...