As in https://en.wikipedia.org/wiki/Calculus_of_variations:

> The calculus of variations is a field of mathematical analysis that uses variations, which are small changes in functions and functionals, to find maxima and minima of functionals.
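To make "minima of functionals" concrete, here is the standard textbook setup (added as an illustration; the symbols $J$, $L$, and $y$ are generic placeholders, not part of the quoted definition): a functional maps a whole function to a number, and a necessary condition for a function to be an extremum of the functional is the Euler–Lagrange equation.

```latex
% A functional J maps a whole function y(x) to a single number:
J[y] = \int_a^b L\bigl(x,\, y(x),\, y'(x)\bigr)\, dx
% A necessary condition for y to be an extremum of J
% is the Euler--Lagrange equation:
\frac{\partial L}{\partial y} - \frac{d}{dx}\,\frac{\partial L}{\partial y'} = 0
```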

The Gradient Descent algorithm is also a method for finding minima of a function. Is it a part of the Calculus of Variations?
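For reference, here is a minimal sketch of the kind of gradient descent I mean (the quadratic objective, learning rate, and iteration count are placeholder choices for illustration, not any particular library's API):

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2.
# Objective, step size, and iteration count are illustrative choices.

def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x - 3.0)  # analytic derivative of f

x = 0.0   # starting point
lr = 0.1  # learning rate (step size)
for _ in range(100):
    x -= lr * grad_f(x)  # step against the gradient

print(x)  # converges toward the minimizer x = 3
```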

Dee
  • We can explain calculus (especially differentiation) to a significant extent using the calculus of variations (as far as the definition in the block quote is concerned). But I don't think it's of much use beyond that, since we use $\nabla$, which is an operator and as such can't be treated like small variations as far as formal maths is concerned. Informally, by varying the weights by a small amount you can still train the network and produce a good approximation of the output of the $\nabla$ operator, albeit slowly and noisily (which may or may not be good); see the sketch after these comments. –  Feb 04 '20 at 08:07
  • How is this question related to AI? You should at least mention where variational calculus can be used in AI. I know where it can be used, but you should edit your post to make this question more on-topic. – nbro Feb 04 '20 at 11:34
  • Gradient Descent is AI, and the confusing term here is 'finding minima' – Dee Feb 04 '20 at 17:32
  • @datdinhquoc GD alone is not AI. GD is an iterative optimization algorithm, which can also be used in other contexts that someone would not necessarily call AI. I will help you. You could say something like "GD is used to train neural networks and variational calculus is used in computer vision" as a premise, then ask your question. – nbro Feb 04 '20 at 22:59
  • it's maths, yes, but it's used in AI as the basic optimiser – Dee Feb 05 '20 at 03:15
  • @datdinhquoc OK, but is this related to directional derivatives and the gradient vector, i.e. max and min values from calculus? Can anyone confirm? – rubengavidia0x Sep 25 '21 at 01:59
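A minimal sketch of the finite-difference idea from the first comment above (the toy loss, weights, and step size are illustrative assumptions, not anyone's posted code): perturbing each weight by a small epsilon approximates the $\nabla$ operator numerically, which is slow (one extra loss evaluation per weight) and noisy, but needs no analytic derivative.

```python
import random

def loss(w):
    # Toy loss: squared distance of the weights from an arbitrary target.
    target = [1.0, -2.0, 0.5]
    return sum((wi - ti) ** 2 for wi, ti in zip(w, target))

def finite_difference_grad(f, w, eps=1e-5):
    # Approximate each partial derivative with a forward difference.
    grad = []
    for i in range(len(w)):
        w_plus = list(w)
        w_plus[i] += eps
        grad.append((f(w_plus) - f(w)) / eps)
    return grad

w = [random.uniform(-1.0, 1.0) for _ in range(3)]
lr = 0.1
for _ in range(200):
    g = finite_difference_grad(loss, w)
    w = [wi - lr * gi for wi, gi in zip(w, g)]

print(w)  # approaches the target [1.0, -2.0, 0.5]
```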

0 Answers