I just have a quick question; maybe I am being too nitpicky here.
We recently had an introductory lecture on AI at university, and the professor talked about McCulloch-Pitts neurons, i.e. the neuron activates as soon as the sum of its inputs reaches a certain threshold. My problem is that the professor said this threshold is also called the "bias". Is that correct?
I thought biases are analogous to the y-intercept of a linear equation and are added to each weighted sum the network computes, in order to balance out systematic error in predictions.
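To make it concrete, here is a tiny sketch of how I currently picture the two formulations (the function names and numbers are just made up by me, not from the lecture):

```python
def fires_with_threshold(inputs, weights, theta):
    """Fire (output 1) if the weighted sum of inputs reaches the threshold theta."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= theta else 0

def fires_with_bias(inputs, weights, bias):
    """The same rule rewritten with a bias added to the sum and compared to 0."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s >= 0 else 0

# With bias = -theta the two versions give the same output, which is exactly
# why I am unsure whether calling the threshold a "bias" is standard or not.
print(fires_with_threshold([1, 0, 1], [0.5, 0.5, 0.5], 1.0))  # -> 1
print(fires_with_bias([1, 0, 1], [0.5, 0.5, 0.5], -1.0))      # -> 1
```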
To be specific, my question is whether the threshold in a McCulloch-Pitts neuron is actually called a bias or not.
Any hints appreciated! :)
Best regards, Sam!