
I just have a quick question; maybe I am being too nitpicky here.

We recently had an introductory lecture on AI at university, and the professor talked about McCulloch-Pitts neurons, i.e., activation as soon as the sum of inputs reaches a certain threshold. My problem is that the professor said that the threshold is also called the "bias". Is that correct?

I thought biases are analogous to the y-intercept of a linear equation and are added to every computation made by the neural network, in order to balance out systematic error in predictions.

To be specific, my question is whether the threshold in a McCulloch-Pitts neuron is actually called the bias or not.

Any hints appreciated! :)

Best regards, Sam!

DerOeko

1 Answer


Both uses of the word bias are appropriate, and they are consistent.

In a threshold perceptron, the output $a$ (from activation) is given by

$$a = \begin{cases} 1 & \text{if } wx + b > 0 \\ 0 & \text{otherwise,} \end{cases}$$

so the output indicates whether $wx$ is larger than $-b$ or not.
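A minimal sketch in Python may make this concrete (the function name and sample values are my own, chosen purely for illustration):

```python
import numpy as np

def threshold_neuron(x, w, b):
    """Fire (output 1) exactly when w·x + b > 0, i.e. when w·x > -b."""
    return 1 if np.dot(w, x) + b > 0 else 0

w = np.array([0.5, -0.25])
b = -0.125  # bias; the equivalent firing threshold on w·x is -b = 0.125

print(threshold_neuron(np.array([1.0, 1.0]), w, b))  # w·x = 0.25 > 0.125, prints 1
print(threshold_neuron(np.array([0.1, 0.2]), w, b))  # w·x = 0.0 < 0.125, prints 0
```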

Recall the linear equation before the threshold, where $w$ is a vector of weights $w_i$ for the inputs $x_i$. If you have only one input, $x_1$, this single weight $w=w_1$ corresponds to the slope of the straight line $y=w_1 x_1+b$.

If your goal is prediction by linear regression, instead of comparison, you may use a linear neuron, with the identity function as an activation function instead of the threshold. Here the output is simply the linear combination of the inputs

$$a = wx + b.$$

In this case the function of the bias is to shift the prediction obtained from multiplying the inputs by the weights, as you mention.
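For contrast, here is the same kind of sketch for a linear neuron, again with names and values I chose for illustration; $b$ acts as the intercept:

```python
import numpy as np

def linear_neuron(x, w, b):
    # Identity activation: the output is the linear combination itself,
    # and b shifts every prediction by a constant, like a y-intercept.
    return np.dot(w, x) + b

w = np.array([0.5, -0.25])
print(linear_neuron(np.array([1.0, 1.0]), w, b=0.0))     # 0.25
print(linear_neuron(np.array([1.0, 1.0]), w, b=-0.125))  # 0.125: the bias shifts the output
```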

  • So, the bias is called the threshold because wx has to be greater than -b, because otherwise a = 0, i.e. the threshold is not met? Wouldn't we then have to call -b the threshold and not b itself? – DerOeko Sep 01 '22 at 14:14
  • Mmm, I would not link threshold and bias. The bias is b and the threshold is 0, or the hyperplane wx+b=0. – Jaume Oliver Lafont Sep 01 '22 at 15:14