
I have a neural network that takes as input a vector $x \in \mathbb{R}^{1\times d}$, with $s$ hidden layers, each with $d$ neurons (including the output layer).

If I understand correctly, the computational complexity of the forward pass for a single input vector would be $O(d^{2}(s-1))$: $d^{2}$ is the cost of multiplying the output of each layer by the next weight matrix, and this happens $(s-1)$ times, given that the network has $s$ layers. We can ignore the activation functions because their cost is only $O(d)$ per layer.
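For concreteness, here is a minimal NumPy sketch of the forward pass I have in mind (the sizes $d = 64$, $s = 5$ are just assumed examples, and the ReLU activation is only a placeholder); each step is one multiplication by a $d \times d$ matrix, which is where the $O(d^2)$ per layer comes from:

```python
import numpy as np

# Assumed example sizes, not part of the original question.
d, s = 64, 5

x = np.random.randn(1, d)                                 # input, shape (1, d)
weights = [np.random.randn(d, d) for _ in range(s - 1)]   # one d x d matrix per step

h = x
for W in weights:
    h = np.maximum(h @ W, 0)   # O(d^2) for the matrix product, O(d) for the ReLU

# Total: (s - 1) products of O(d^2) each, i.e. O(d^2 (s - 1)).
```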

So, if I am correct so far, and the computational complexity of the forward pass is $O(d^{2}(s-1))$, is the following correct?

$$O(d^{2}(s-1)) = O(d^{2}s - d^{2}) = O(d^{2}s)$$

Would the computational complexity of the forward pass for this NN be $O(d^{2}s)$?
