
I have tried, unsuccessfully, to find out the "depth" (defined below) of large neural networks such as GPT-3, AlphaFold 2, and DALL-E 2.

Formally, my question is about their computational graph: consider a path from one node (a.k.a. neuron) to another, where the length of a path is the number of its edges.

What is the longest path from an input node to an output node that visits each node at most once?
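To make the definition concrete: a feed-forward computational graph is acyclic, so every path already visits each node at most once, and the quantity asked for is the standard longest path in a DAG, computable in linear time by dynamic programming. The following sketch uses a hypothetical toy graph (the node names and topology are made up for illustration, not taken from any of the models above):

```python
# Longest input-to-output path in a DAG. Since the graph is acyclic,
# simple paths come for free and memoized recursion suffices.
from functools import lru_cache

# Hypothetical toy graph: node -> list of successors.
edges = {
    "in1": ["h1", "h2"],
    "in2": ["h2"],
    "h1": ["out"],
    "h2": ["h1", "out"],  # skip-connection-like edge makes path lengths uneven
    "out": [],
}

def longest_path_length(graph, sources, sinks):
    """Edge count of the longest path from any source to any sink."""
    sinks = set(sinks)

    @lru_cache(maxsize=None)
    def longest_from(node):
        if node in sinks:
            return 0
        # One edge to each successor, then the best continuation from there.
        return max(1 + longest_from(nxt) for nxt in graph[node])

    return max(longest_from(s) for s in sources)

print(longest_path_length(edges, ["in1", "in2"], ["out"]))  # → 3 (in1 → h2 → h1 → out)
```

With residual/skip connections (as in Transformers), different input-to-output paths have very different lengths, which is why "depth" in this sense is not simply the layer count.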

I would appreciate any answer or reference regarding large networks like those mentioned above.

  • Please, **focus on 1 architecture per post.** So, edit your post to ask just about one architecture. If you have questions about other models, ask them in a separate post. If you're interested in a reference that summarises results about multiple models, then I would change the title not to be so specific and to actually emphasize that you're looking for some reference. – nbro Apr 08 '22 at 16:55
  • Having said that, it seems that you're asking 2 distinct questions here: 1. "What is the longest path from an input node to an output node that visits unique nodes at most once?" and 2. "How many layers do GPT-3, AlphaFold 2, and DALL-E 2 have?". Please, **only 1 question per post.** If they are very related or the same question, explain why. – nbro Apr 12 '22 at 14:23

0 Answers