A neural network can apparently be denoted as $N_{t,n,\sigma,L}$. What do these subscripts $t, n, \sigma$ and $L$ mean? Could you link me to a paper, article or webpage with an explanation for this?
1 Answer
Here is a paper with the mathematical definition of each term:
Let $N_{t,n,\sigma,L}$ be all target functions that can be implemented using a neural network of depth $t$, size $n$, activation function $\sigma$, and when we restrict the input weights of each neuron to $\lVert w \rVert_1 + |b| \le L$.

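To make the four parameters concrete, here is a minimal sketch (with hypothetical data-structure choices, not taken from the paper) of how $t$, $n$, and the weight bound $L$ could be computed for a toy network represented as a list of layers, where each neuron is a pair of its input weights and bias:

```python
# Toy network: a list of layers; each neuron is (weights, bias).
# This hypothetical example has depth t = 2 and size n = 3.
network = [
    [([0.5, -0.3], 0.1), ([0.2, 0.2], -0.4)],  # hidden layer: 2 neurons
    [([1.0, -1.0], 0.0)],                      # output layer: 1 neuron
]

t = len(network)                          # depth: number of layers
n = sum(len(layer) for layer in network)  # size: total number of neurons


def satisfies_weight_bound(net, L):
    """Check |w|_1 + |b| <= L for the input weights of every neuron."""
    return all(
        sum(abs(w) for w in weights) + abs(bias) <= L
        for layer in net
        for weights, bias in layer
    )


print(t, n)                                   # 2 3
print(satisfies_weight_bound(network, 2.5))   # True: largest per-neuron norm is 2.0
```

So $N_{t,n,\sigma,L}$ would contain the function this network computes (for the chosen activation $\sigma$) whenever `satisfies_weight_bound(network, L)` holds.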
serali
- Does "size $n$" mean that the network has $n$ input nodes? – J. Doe Nov 13 '19 at 09:08
- Check the introduction: "The depth of the network is the number of layers and the size of the network is the total number of neurons." – serali Nov 13 '19 at 09:13