"But what is new in recent months is the growing speed and accuracy of
deep-learning programs, often called artificial neural networks or just
“neural nets” for their resemblance to the neural connections in the
brain
'There has been a number of stunning new results with deep-learning
methods,” said Yann LeCun, a computer scientist at New York University
who did pioneering research in handwriting recognition at Bell
Laboratories. “The kind of jump we are seeing in the accuracy of these
systems is very rare indeed.' "
"...we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time..."
Folks, this is an astounding result! One would imagine that training a neural net with N layers would require adjusting the weights of all N layers simultaneously. If instead training can be done one layer at a time, the computational complexity is reduced by roughly a factor of N.
Furthermore, it is natural to conjecture that all, or almost all, evolutionary processes are in essence greedy algorithms. Something like this algorithm likely explains how evolution has any chance of creating brains!
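To make the "one layer at a time" idea concrete, here is a minimal sketch of greedy layer-wise pretraining. The paper stacks restricted Boltzmann machines trained with contrastive divergence; the NumPy code below is a simplified illustration of that scheme, not the paper's actual implementation, and the layer sizes, learning rate, and toy data are assumptions made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """A tiny restricted Boltzmann machine trained with 1-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden activations driven by the data
        h0 = self.hidden_probs(v0)
        # Negative phase: sample hidden units, reconstruct visibles, re-infer hiddens
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        # Update from the difference between data-driven and model-driven correlations
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / batch
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

def greedy_pretrain(data, layer_sizes, epochs=5):
    """Train a stack of RBMs greedily, one layer at a time."""
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_step(x)
        # Freeze this layer; its hidden activations become the next layer's "data"
        x = rbm.hidden_probs(x)
        rbms.append(rbm)
    return rbms

if __name__ == "__main__":
    toy = (rng.random((200, 30)) > 0.5).astype(float)  # toy binary data, illustrative only
    stack = greedy_pretrain(toy, layer_sizes=[20, 10])
    print("trained", len(stack), "layers greedily")
```

The essential point is in greedy_pretrain: each layer is trained and then frozen, and its hidden activations become the "data" for the next layer, so at no point are the weights of all N layers adjusted jointly.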