Questions tagged [hidden-layers]

For questions about the functioning, applications, structure, and performance of hidden layers in a neural network.

Hidden layers are the layers that lie between the input and output layers of a neural network. The parameters and implementation details of a network's hidden layers are generally hidden from the user. Opinions vary about the true functioning, the purpose, and the power that hidden layers lend to a neural network.

What does the hidden layer in a neural network compute?
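As a rough illustration of what a hidden layer computes (a minimal NumPy sketch with hypothetical layer sizes, not tied to any particular question below): each hidden layer applies an affine transform to the previous layer's outputs, followed by an elementwise nonlinearity.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes: 4 inputs, 3 hidden units, 1 output.
    x = rng.normal(size=4)           # input vector
    W1 = rng.normal(size=(3, 4))     # input -> hidden weights
    b1 = np.zeros(3)
    W2 = rng.normal(size=(1, 3))     # hidden -> output weights
    b2 = np.zeros(1)

    h = np.tanh(W1 @ x + b1)         # hidden layer: affine transform + nonlinearity
    y = W2 @ h + b2                  # output layer
    print(h, y)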

52 questions
20
votes
5 answers

Why does Batch Normalization work?

Adding BatchNorm layers improves training time and makes the whole deep model more stable. That's an empirical observation widely relied on in machine-learning practice. My question is: why does it work? The original (2015) paper motivated the…
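For context, a minimal NumPy sketch of the batch-normalization transform from the 2015 paper (the gamma, beta, and eps values here are illustrative, not taken from the question):

    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        # x: (batch, features). Normalize each feature over the mini-batch,
        # then apply the learned scale (gamma) and shift (beta).
        mu = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mu) / np.sqrt(var + eps)
        return gamma * x_hat + beta

    x = np.random.randn(32, 8)                             # hypothetical mini-batch
    y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))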
18
votes
2 answers

How do I decide the optimal number of layers for a neural network?

How do I decide the optimal number of layers for a neural network (feedforward or recurrent)?
13
votes
4 answers

What exactly is a hidden state in an LSTM and RNN?

I'm working on a project where we use an encoder-decoder architecture. We decided to use an LSTM for both the encoder and the decoder due to its hidden states. In my specific case, the hidden state of the encoder is passed to the decoder, and this…
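A minimal PyTorch sketch of the pattern described here (hypothetical sizes, not the asker's code): the encoder's final hidden and cell states become the decoder's initial states.

    import torch
    import torch.nn as nn

    enc = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
    dec = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)

    src = torch.randn(4, 10, 16)     # (batch, source length, features)
    tgt = torch.randn(4, 7, 8)       # (batch, target length, features)

    _, (h, c) = enc(src)             # h, c: (num_layers, batch, hidden_size)
    dec_out, _ = dec(tgt, (h, c))    # decoder starts from the encoder's state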
11
votes
1 answer

What kind of problems require more than 2 hidden layers?

I've read that most problems can be solved with 1-2 hidden layers. How do you know you need more than 2? For what kind of problems would you need them (give me an example)?
kenorb • 10,423 • 3 • 43 • 91
10
votes
2 answers

What's the difference between hyperbolic tangent and sigmoid neurons?

Two common activation functions used in deep learning are the hyperbolic tangent function and the sigmoid activation function. I understand that the hyperbolic tangent is just a rescaling and translation of the sigmoid function: $\tanh(z) =…
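For reference, the standard identity relating the two activations (a fact about the functions themselves, not quoted from the truncated excerpt) shows that the hyperbolic tangent is indeed a rescaled and shifted sigmoid:

    \sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
    \tanh(z) = \frac{e^{z} - e^{-z}}{e^{z} + e^{-z}} = 2\,\sigma(2z) - 1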
9
votes
1 answer

Why aren't there neural networks that connect the output of each layer to all subsequent layers?

Why aren't there neural networks that connect the output of each layer to all subsequent layers? For example, the output of layer 1 would be fed to the input of layers 2, 3, 4, and so on. Beyond computational power considerations, wouldn't this be better than…
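A minimal NumPy sketch of the connectivity the question describes (hypothetical sizes; this layout does exist in DenseNet-style architectures, where each layer receives the concatenation of all earlier outputs):

    import numpy as np

    def dense_forward(x, weights):
        # Each layer sees the concatenation of the input and all previous layer outputs.
        features = [x]
        for W in weights:
            inp = np.concatenate(features)
            features.append(np.tanh(W @ inp))
        return features[-1]

    rng = np.random.default_rng(0)
    x = rng.normal(size=4)
    # Layer input sizes grow: 4, then 4+3, then 4+3+3.
    weights = [rng.normal(size=(3, 4)),
               rng.normal(size=(3, 7)),
               rng.normal(size=(3, 10))]
    out = dense_forward(x, weights)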
8
votes
2 answers

Why should the number of neurons in a hidden layer be a power of 2?

I have read somewhere on the web (I lost the reference) that the number of units (or neurons) in a hidden layer should be a power of 2 because it helps the learning algorithm to converge faster. Is this a fact? If it is, why is this true? Does it…
7
votes
4 answers

What is the purpose of the hidden layers?

Why would anybody want to use "hidden layers"? How do they enhance the learning ability of the network compared to a network that doesn't have them (a linear model)?
kenorb • 10,423 • 3 • 43 • 91
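One way to see what the question is after (a minimal NumPy sketch with hypothetical weights): without a nonlinearity in between, stacked linear layers collapse into a single linear map, so hidden layers only add expressive power when they include a nonlinear activation.

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(3, 4))
    W2 = rng.normal(size=(2, 3))
    x = rng.normal(size=4)

    # Two stacked *linear* layers are equivalent to one linear layer (W2 @ W1).
    stacked = W2 @ (W1 @ x)
    collapsed = (W2 @ W1) @ x
    assert np.allclose(stacked, collapsed)

    # A nonlinearity between the layers breaks this equivalence,
    # which is what lets a hidden layer model non-linear functions.
    with_hidden = W2 @ np.tanh(W1 @ x)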
7
votes
1 answer

Do all neurons in a layer have the same activation function?

I'm new to machine learning (so excuse my nomenclature), and not being a Python developer, I decided to jump in at the deep (no pun intended) end and write my own framework in C++. In my current design, I have given each neuron/cell the possibility to…
7
votes
3 answers

Does each filter in each convolution layer create a new image?

Say I have a CNN with this structure: input = 1 image (say, 30x30 RGB pixels); first convolution layer = 10 5x5 convolution filters; second convolution layer = 5 3x3 convolution filters; one dense layer with 1 output. So a graph of the network will…
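Roughly, each filter in a convolution layer produces one feature map (one "image") from that layer's input, so 10 filters yield 10 maps. A minimal PyTorch shape check under the architecture sketched in the question (assuming stride 1 and no padding):

    import torch
    import torch.nn as nn

    x = torch.randn(1, 3, 30, 30)            # one 30x30 RGB image
    conv1 = nn.Conv2d(3, 10, kernel_size=5)  # 10 filters -> 10 feature maps
    conv2 = nn.Conv2d(10, 5, kernel_size=3)  # 5 filters  -> 5 feature maps

    h1 = conv1(x)
    h2 = conv2(h1)
    print(h1.shape)  # torch.Size([1, 10, 26, 26])
    print(h2.shape)  # torch.Size([1, 5, 24, 24])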
7
votes
3 answers

To what does the number of hidden layers in a neural network correspond?

In a neural network, the number of neurons in the hidden layer corresponds to the complexity of the model generated to map the inputs to the output(s). More neurons create a more complex function (and thus the ability to model more nuanced decision…
SeeDerekEngineer • 521 • 4 • 11
6
votes
1 answer

How many nodes/hidden layers are required to solve a classification problem where the boundary is a sinusoidal function?

A single neuron is capable of forming a decision boundary between linearly separable data. Is there any intuition as to how many neurons, and in what configuration, would be necessary to correctly approximate a sinusoidal decision boundary? Thanks
6
votes
1 answer

Is this idea to calculate the required number of hidden neurons for a single hidden layer neural network correct?

I have an idea to find the optimal number of hidden neurons required in a neural network, but I'm not sure how accurate it is. Assuming that it has only 1 hidden layer, it is a classification problem with 1 output node (so it's a binary…
5
votes
2 answers

What type of neural network would be most feasible for playing a realtime game?

For implementing a neural network algorithm that can play air hockey, I had two ideas for input, and I'm trying to figure out which design would be most viable. The output must be two analog values that dictate the best position on half of the table…
4
votes
1 answer

How does a single hidden layer affect output?

I'm learning about multilayer perceptrons, and I have a quick theory question regarding hidden layer neurons. I know we can use two hidden layers to solve a non-linearly separable problem by allowing for a representation with two linear…