
What is the consensus regarding NN "capacity" or expressive power? I remember reading somewhere that expressive power grows exponentially with depth, but I cannot seem to find that exact paper.

If I have some dataset and some neural network, is there some heuristic, theorem, or result that may help me to decide whether this particular neural network I've chosen has enough expressive power (or capacity) to learn that data?

Something similar to the VC dimension and the VC inequality, but for more complex models such as NNs.
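(For concreteness, the kind of result I mean, stated in one standard form of the VC inequality: with probability at least $1-\delta$ over an i.i.d. sample of size $N$,

$$E_{\text{out}}(h) \;\le\; E_{\text{in}}(h) + \sqrt{\frac{8}{N}\,\ln\frac{4\,m_{\mathcal{H}}(2N)}{\delta}} \quad \text{for all } h \in \mathcal{H}, \qquad \text{with } m_{\mathcal{H}}(N) \le N^{d_{\mathrm{VC}}} + 1,$$

so the bound becomes meaningful once $N$ is large relative to $d_{\mathrm{VC}}$. What I'm after is an analogous, usable statement for deep networks.)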

I suspect there is no simple answer, but, broadly speaking, what would the answer be?

Overfitting on some small subset might be a start (a minimal sketch of that sanity check is below), but that doesn't really tell me how the model behaves when there's more data; it only tells me that the model is not fundamentally broken and can learn something.
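For reference, this is the kind of sanity check I mean: a minimal PyTorch sketch with placeholder random data (the architecture, learning rate, and step count are arbitrary choices, not a recommendation), which only verifies that the model can drive training loss near zero on a tiny subset.

```python
import torch
import torch.nn as nn

# Placeholder "dataset": swap X, y for a small subset of real data.
torch.manual_seed(0)
X = torch.randn(32, 10)           # 32 samples, 10 features
y = torch.randint(0, 2, (32,))    # binary class labels

# An arbitrary small MLP; capacity to memorize 32 points is the only goal here.
model = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Try to overfit the tiny subset completely.
for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")  # expect ~0 if capacity suffices
```

If even this fails, the model (or its optimization) is clearly broken; if it succeeds, it says nothing about behavior on more data, which is exactly my point.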

I know it's a complex matter, but I'll be grateful for any help, be it practical advice or references, papers, etc. Of course, I've googled some papers, but if you have something particularly interesting, please share.

What is your main **specific** question? Is it 1. _What is the consensus regarding NN "capacity" or expressive power?_ (consensus?) or 2. _what if I have some dataset and some model. Is there some heuristic, theorem or result that may help me to decide whether this particular model I've chosen has enough expressive power to learn that data?_ Moreover, note that the VC dimension has also been calculated for neural networks. Also, a very similar question has already been asked in the past: https://ai.stackexchange.com/q/18220/2444. Please tell us how your question is different. – nbro Nov 13 '20 at 19:51

0 Answers