
The only algorithm I know for updating the weights of a neural network is based on gradients. The update rule can be roughly written as

$$w \leftarrow w - \eta \, \nabla_{w}L$$

where $\eta$ is the learning rate and $\nabla_{w}L$ is the gradient of the loss function with respect to the weights.
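For concreteness, here is a minimal Python sketch of one step of this update on a toy one-dimensional loss (the loss $L(w) = (w-2)^2$ and the learning rate are illustrative choices, not part of the question):

```python
# One step of the update above on the toy loss L(w) = (w - 2)^2
w = 0.0
eta = 0.1                 # learning rate
grad = 2 * (w - 2.0)      # dL/dw, computed analytically for this toy loss
w = w - eta * grad
print(w)                  # 0.4: w has moved toward the minimum at w = 2
```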

Are there any learning algorithms for updating weights in neural networks that do not use gradients?

hanugm

1 Answer


Yes.

A prominent class of gradient-free algorithms in the ML world is Evolution Strategies (ES). Evolutionary algorithms have existed for a long time, but only a few variants have been shown to scale well to modern neural networks.

Recently, the research group OpenAI managed to train deep RL models with a specific variant of ES (and careful engineering); see their paper "Evolution Strategies as a Scalable Alternative to Reinforcement Learning" (Salimans et al., 2017). The blog post "A Visual Guide to Evolution Strategies" by David Ha provides a good starting point if you want to learn about ES and its modern derivatives.
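To make the idea concrete, here is a minimal NumPy sketch of the ES estimator that the OpenAI variant builds on (the objective, population size, and step sizes below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def evolution_strategy(f, theta, iterations=200, population=50, sigma=0.1, alpha=0.03):
    """Maximize f(theta) without computing any gradient of f."""
    for _ in range(iterations):
        eps = np.random.randn(population, theta.size)           # Gaussian perturbations
        rewards = np.array([f(theta + sigma * e) for e in eps])
        # Normalize rewards (a simple alternative to the paper's rank shaping)
        rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
        # Monte Carlo estimate of the search-distribution gradient: no backprop needed;
        # f only has to be evaluable (it can be non-differentiable or discontinuous).
        theta = theta + alpha / (population * sigma) * eps.T @ rewards
    return theta

# Toy usage: find a weight vector maximizing -(w - 3)^2, i.e. all weights -> 3
w = evolution_strategy(lambda w: -np.sum((w - 3.0) ** 2), np.zeros(5))
print(w)  # close to [3, 3, 3, 3, 3]
```

Since only forward evaluations of `f` are needed, each perturbation can be evaluated independently, which is what makes this approach easy to parallelize.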

ayandas