Questions tagged [perceptron]

For questions about the perceptron learning algorithm in machine learning.

The perceptron is a machine learning algorithm that produces classified outcomes from input data. It dates back to the 1950s and is a fundamental example of how machine learning algorithms learn from data.

The perceptron is an algorithm for supervised learning of binary classifiers (functions that can decide whether an input, represented by a vector of numbers, belongs to some specific class or not). It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector. The algorithm allows for online learning, in that it processes elements in the training set one at a time.

Perceptron - Wikipedia

Perceptron - Techopedia
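The description above (a set of weights combined linearly with the feature vector, updated one training example at a time) can be illustrated with a short sketch. This is a generic illustration, not code from any of the questions below; the names train_perceptron, X, y, lr and the AND example are placeholders chosen for this sketch.

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=100):
    """Minimal online perceptron.

    X: (n_samples, n_features) array of inputs.
    y: labels in {-1, +1}.
    Returns the learned weight vector and bias.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):                 # online: one example at a time
            if yi * (np.dot(w, xi) + b) <= 0:    # misclassified (or on the boundary)
                w += lr * yi * xi                # nudge the hyperplane toward xi
                b += lr * yi
                errors += 1
        if errors == 0:                          # converged; requires linearly separable data
            break
    return w, b

# Tiny usage example: the linearly separable AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(w, b, [int(np.sign(np.dot(w, xi) + b)) for xi in X])
```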

51 questions
14 votes · 4 answers

Did Minsky and Papert know that multi-layer perceptrons could solve XOR?

In their famous book entitled Perceptrons: An Introduction to Computational Geometry, Minsky and Papert show that a perceptron can't solve the XOR problem. This contributed to the first AI winter, resulting in funding cuts for neural networks.…
rcpinto • 2,089
9 votes · 2 answers

What are the main differences between a perceptron and a naive Bayes classifier?

What are the main differences between a perceptron and a naive Bayes classifier?
user3642
8 votes · 2 answers

Why is the perceptron criterion function differentiable?

I'm reading chapter one of the book Neural Networks and Deep Learning by Aggarwal. In section 1.2.1.1 of the book, I'm learning about the perceptron. One thing the book says is, if we use the sign function for the following loss function:…
7 votes · 1 answer

Which Rosenblatt's paper describes Rosenblatt's perceptron training algorithm?

I struggle to find Rosenblatt's perceptron training algorithm in any of his publications from 1957–1961, namely: Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms; The perceptron: A probabilistic model for information…
7 votes · 2 answers

How should we interpret this figure that relates the perceptron criterion and the hinge loss?

I am currently studying the textbook Neural Networks and Deep Learning by Charu C. Aggarwal. Chapter 1.2.1.2 Relationship with Support Vector Machines says the following: The perceptron criterion is a shifted version of the hinge-loss used in…
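For context, the relationship the excerpt refers to can be summarised in one line (a summary from memory, not a quotation from the book, so the notation and sign conventions may differ from Aggarwal's): for a training pair $(\bar{X}, y)$ with label $y \in \{-1,+1\}$ and weight vector $\bar{W}$, the two losses are
\begin{align}
L_{\text{perceptron}} &= \max\{0,\; -y\,(\bar{W}\cdot\bar{X})\},\\
L_{\text{hinge}} &= \max\{0,\; 1 - y\,(\bar{W}\cdot\bar{X})\},
\end{align}
so the hinge loss is the perceptron criterion shifted so that a positive margin of at least $1$ is required before the loss vanishes.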
6 votes · 5 answers

Why can't the XOR linear inseparability problem be solved with one perceptron like this?

Consider a perceptron where $w_0=1$ and $w_1=1$: Now, suppose that we use the following activation function \begin{align} f(x)= \begin{cases} 1, \text{ if }x =1\\ 0, \text{ otherwise} \end{cases} \end{align} The output is then summarised…
6 votes · 2 answers

Is there a proof to explain why XOR cannot be linearly separable?

Can someone explain to me with a proof or example why you can't linearly separate XOR (and therefore need a neural network, the context I'm looking at it in)? I understand why it's not linearly separable if you draw it graphically (e.g. here), but I…
Slowat_Kela • 287
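The linear inseparability asked about in the question above has a short standard argument, sketched here for convenience (the threshold convention, strict versus non-strict, is one possible choice). Suppose a single linear threshold unit with weights $w_1, w_2$ and bias $b$ outputs $1$ exactly when $w_1 x_1 + w_2 x_2 + b > 0$. The four XOR cases would require
\begin{align}
(0,0)\mapsto 0 &\implies b \le 0,\\
(1,0)\mapsto 1 &\implies w_1 + b > 0,\\
(0,1)\mapsto 1 &\implies w_2 + b > 0,\\
(1,1)\mapsto 0 &\implies w_1 + w_2 + b \le 0.
\end{align}
Adding the two middle inequalities gives $w_1 + w_2 + 2b > 0$, i.e. $w_1 + w_2 + b > -b \ge 0$, which contradicts the last line; hence no such weights exist.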
5 votes · 1 answer

What's the difference between a "perceptron" and a GLM?

In a comment to this question user nbro comments: As a side note, "perceptrons" and "neural networks" may not be the same thing. People usually use the term perceptron to refer to a very simple neural network that has no hidden layer. Maybe you…
R.M. • 298
5 votes · 1 answer

Why did the development of neural networks stop between the 50s and the 80s?

In a video lecture on the development of neural networks and the history of deep learning (you can start from minute 13), the lecturer (Yann LeCun) said that the development of neural networks stopped until the 80s because people were using the…
5 votes · 1 answer

Which part of "Perceptrons: An Introduction to Computational Geometry" shows that a perceptron cannot solve the XOR problem?

In the book "Perceptrons: An Introduction to Computational Geometry" by Minsky and Papert (1969), which part shows that a single-layer perceptron could not solve the XOR problem? I have already scanned it, but I did not find the…
rimbaerl • 51
3 votes · 0 answers

If we use a perceptron with a non-monotonic activation function, can it solve the XOR problem?

I found several papers about how to build a perceptron able to solve the XOR problem. The papers describe a solution where the Heaviside step function is replaced by a non-monotonic activation function. Here are the papers: Single Layer Neural…
3 votes · 1 answer

What is the significance of weights in a feedforward neural network?

In a feedforward neural network, the inputs are fed directly to the outputs via a series of weights. What purpose do the weights serve, and how are they significant in this neural network?
kenorb • 10,423
3 votes · 2 answers

Perceptron learning algorithm: different accuracies for different training methods

So, my question is a bit theoretical. I have been trying to implement a perceptron-based classifier with outputs 1 and 0 depending on the category. I have used 2 methods: the example-by-example learning method and the batch learning method. I also have…
user9947
3 votes · 1 answer

Is there any variant of perceptron convergence algorithm that ensures uniqueness?

The perceptron convergence algorithm given below ensures the convergence of weights of the perceptron provided enough data points and iterations. Although it ensures convergence by finally getting a decision hyperplane that can separate positive…
hanugm • 3,571
3 votes · 1 answer

What is the simplest classification problem which cannot be solved by a perceptron?

What is the simplest classification problem which cannot be solved by a perceptron (that is, a single-layer feed-forward neural network with no hidden layers and a step activation function), but can be solved by the same network if the activation…