Questions tagged [prelu]

For questions about the Parametric Rectified Linear Unit (PReLU) activation function, proposed in "Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification" by Kaiming He et al.
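In the paper's notation, PReLU computes f(x) = x for x > 0 and f(x) = a·x otherwise, where the slope a is learned during training. A minimal NumPy sketch, assuming a single shared slope initialized at 0.25 as in the paper (He et al. also describe a channel-wise variant with one slope per channel):

```python
import numpy as np

def prelu(x, a=0.25):
    """PReLU: f(x) = x for x > 0, f(x) = a * x otherwise; `a` is a learned slope."""
    return np.where(x > 0, x, a * x)

print(prelu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))
# [-0.5   -0.125  0.     1.     3.   ]
```

With a fixed at 0 this reduces to ReLU, and with a fixed at a small constant it reduces to Leaky ReLU; PReLU instead treats a as a trainable parameter.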

1 question
4 votes
1 answer

Why should one ever use ReLU instead of PReLU?

To me, it seems that PReLU is strictly better than ReLU: it does not suffer from the dying-ReLU problem, it allows negative values, and its trainable parameters are computationally negligible to adjust. Only if we want the network to output…
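To make the "computationally negligible" point concrete, here is a small sketch (assuming PyTorch, whose `nn.PReLU` defaults to the paper's 0.25 initialization) comparing the parameter count of a convolution with that of the channel-wise PReLU following it:

```python
import torch.nn as nn

# A 3x3 convolution over 64 channels versus its channel-wise PReLU activation.
conv = nn.Conv2d(64, 64, kernel_size=3)       # 64*64*3*3 weights + 64 biases
act = nn.PReLU(num_parameters=64, init=0.25)  # one learnable slope per channel

conv_params = sum(p.numel() for p in conv.parameters())
act_params = sum(p.numel() for p in act.parameters())
print(conv_params, act_params)  # 36928 64
```

The activation adds 64 parameters next to the convolution's roughly 37,000, so the extra cost of learning the slopes is tiny compared with the rest of the layer.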