
Training neural networks takes a while. My question is: how efficient is a neural network once it is completely trained (assuming it's not a model that is constantly learning)?

I understand that this is a vague and simply difficult question to answer, so let me be more specific: imagine we have a trained deep neural network, and to be even more specific, say it's a GPT-3 model.

Now, we put the whole thing on a Raspberry Pi. No internet access. The whole process takes place locally.

  • Will it run at all? Will it have enough RAM?

  • Now let's say we give it some text to analyze. Then we ask it a question. Will it take milliseconds to answer? Or is it going to be in the seconds? Minutes?

What I'm trying to understand is this: once a model is trained, is it fairly performant because it's essentially just a bunch of very simple function calls stacked on top of each other, or is it very heavy to execute (perhaps due to the sheer number of those simple function calls)?
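That intuition can be made concrete. A forward pass really is a fixed stack of simple operations, mostly matrix multiplies plus elementwise nonlinearities. Here is a minimal sketch using a made-up tiny MLP (the layer sizes are arbitrary, and this is nothing like GPT-3's actual transformer architecture) to show what "executing" a trained model amounts to, and why the cost scales with the number of weights:

```python
import numpy as np

# Hypothetical tiny 3-layer MLP with made-up dimensions.
rng = np.random.default_rng(0)
layer_sizes = [512, 1024, 1024, 256]
weights = [rng.standard_normal((m, n)) * 0.01
           for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(x):
    # Inference is just a loop of matmul + nonlinearity (ReLU here).
    for w in weights:
        x = np.maximum(x @ w, 0.0)
    return x

x = rng.standard_normal(512)
y = forward(x)
print(y.shape)  # (256,)

# Cost is dominated by multiply-adds: two FLOPs per weight per pass.
flops = sum(2 * m * n for m, n in zip(layer_sizes, layer_sizes[1:]))
print(f"{flops:,} FLOPs per forward pass")
```

So the "function calls" are individually trivial; the question is purely how many of them there are, which is set by the parameter count.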

Please correct any misunderstanding about how the whole process works if you spot any. Thank you.

  • There's this article detailing why some models aren't suitable for practical applications: https://towardsdatascience.com/too-big-to-deploy-how-gpt-2-is-breaking-production-63ab29f0897c – Anton Sep 04 '20 at 14:26
  • It seems to me that you're asking how [**efficient**](https://en.wikipedia.org/wiki/Algorithmic_efficiency) using a trained model is. Performance refers to how good the answers are in terms of correctness. So, I suggest that you edit your post to clarify this. – nbro Sep 04 '20 at 15:03
  • The GPT-3 model specifically is a monster: it requires a powerful machine with over 350 GB of RAM just to run inferences (and probably multi-core processors too). There are plenty of others which are smaller, and likely there will be some that could run on a Raspberry Pi. So to prevent the answer being "it depends", could you clarify your use case? Sadly, there is no general answer here, and also if you want to run GPT-3 on any consumer hardware you are out of luck for a few years yet – Neil Slater Sep 04 '20 at 15:43
  • @NeilSlater Your comment answers my question. We wrote a custom albeit simple NLP processor and were wondering if we made the wrong decision instead of using GPT-3. Your answer clarifies that we didn't, because we would not be able to run such a model with our resources. – Anton Sep 04 '20 at 16:41
  • @Anton: No problem. There are smaller versions of GPT-3, mind... I quoted stats for the largest one, which also has the best accuracy. I looked it up after the comment and saw there were options to use smaller variants that use the same approach but are less accurate. I don't know if the smallest one would fit on a Raspberry Pi, or would be useful for your project. – Neil Slater Sep 04 '20 at 17:31
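The ~350 GB figure quoted in the comments follows from back-of-envelope arithmetic on the parameter count. The largest GPT-3 variant has about 175 billion parameters; assuming 2 bytes per weight (16-bit floats; the actual storage format used in deployment isn't stated here), the weights alone need roughly 350 GB before counting any activations:

```python
# Back-of-envelope check of the ~350 GB figure for the largest GPT-3.
params = 175e9          # ~175 billion parameters
bytes_fp16 = params * 2  # assuming 2 bytes (float16) per weight
print(f"{bytes_fp16 / 1e9:.0f} GB")  # 350 GB

# A Raspberry Pi 4 tops out at 8 GB of RAM, so even aggressive
# quantization (1 byte/parameter -> ~175 GB) would not make it fit.
```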

0 Answers