In these lecture slides, it is written:
The neuropsychologist Donald Hebb postulated in 1949 how biological neurons learn:
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place on one or both cells such that A's efficiency as one of the cells firing B, is increased."
In more familiar terminology, this can be stated as the Hebbian learning rule:
- If two neurons on either side of a synapse (connection) are activated simultaneously (i.e. synchronously), then the strength of that synapse is selectively increased.
Mathematically, we can describe Hebbian learning as:
$$w_{i j}[n+1]=w_{i j}[n]+\eta x_{i}[n] x_{j}[n]$$
Here, $\eta$ is a learning-rate coefficient, and $x_i$ and $x_j$ are the outputs of the $i$th and $j$th elements, respectively.
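For concreteness, here is how I read that update rule in code. This is my own sketch (the function name, array shapes, and the value of `eta` are my assumptions, not from the slides); it just applies $w_{ij}[n+1] = w_{ij}[n] + \eta\, x_i[n]\, x_j[n]$ to every pair of elements at once:

```python
import numpy as np

def hebbian_update(w, x, eta=0.5):
    """One Hebbian step: strengthen each w_ij in proportion to x_i * x_j.

    w   : (N, N) matrix of synaptic weights
    x   : (N,) vector of element outputs at step n
    eta : learning-rate coefficient
    """
    # np.outer(x, x)[i, j] == x[i] * x[j], so this is the rule applied
    # simultaneously to all pairs (i, j).
    return w + eta * np.outer(x, x)

x = np.array([1.0, 0.0, 1.0])  # outputs: elements 0 and 2 are active together
w = np.zeros((3, 3))           # start with no synaptic strength
w = hebbian_update(w, x)

print(w)
# Only the weights between co-active elements (0 and 2) grow;
# every weight involving the silent element 1 stays at zero.
```

If this reading is right, the rule only ever *increases* weights between units that fire together, which is part of what prompts my questions below.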
So, my main question is: what do all these descriptions mean? Here are a few sub-questions.
- Is Hebbian learning applicable for single-neuron networks?
- What does "two neurons on either side of a synapse" mean?
- Why/when would two neurons activate "simultaneously"?
- What are these "elements"?