I am trying to understand the concept of fine-tuning and few-shot learning.
I understand the need for fine-tuning: it essentially means continuing to train a pre-trained model on labelled data from a specific downstream task.
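To make concrete what I mean by fine-tuning, here is a toy sketch (numpy only; all names and data are made up for illustration) where "pre-trained" weights are updated by a few gradient steps on downstream task data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for weights learned on a large source task ("pre-training").
W_pretrained = rng.normal(size=(4, 2))

def fine_tune(W, X, y, lr=0.1, steps=200):
    """Continue gradient training from the given weights on new task data."""
    W = W.copy()
    for _ in range(steps):
        logits = X @ W
        probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        # Gradient of softmax cross-entropy w.r.t. W, averaged over the batch.
        W -= lr * X.T @ (probs - np.eye(2)[y]) / len(X)
    return W

# A labelled dataset for the downstream task.
X_task = rng.normal(size=(32, 4))
y_task = (X_task[:, 0] > 0).astype(int)

W_tuned = fine_tune(W_pretrained, X_task, y_task)
acc = ((X_task @ W_tuned).argmax(axis=1) == y_task).mean()
print(f"downstream accuracy after fine-tuning: {acc:.2f}")
```

So fine-tuning, as I understand it, always involves gradient updates to the model's weights.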
However, I have recently seen a plethora of blog posts about zero-shot, one-shot, and few-shot learning.
How are they different from fine-tuning? It appears to me that few-shot learning is a specialization of fine-tuning. What am I missing here?
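To show my current mental model concretely: I picture few-shot learning as the same fine-tuning loop, just run on a tiny "support set" of k labelled examples per class (again a numpy toy; the data and the choice of k are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
k = 5  # "few-shot": only k labelled examples per class for the new task

# Support set: k examples each of class 0 and class 1.
X_support = np.vstack([rng.normal(-1, 1, size=(k, 4)),
                       rng.normal(+1, 1, size=(k, 4))])
y_support = np.array([0] * k + [1] * k)

# My assumption: few-shot learning == fine-tuning on this tiny support set.
W = rng.normal(size=(4, 2))  # stand-in for pre-trained weights
for _ in range(100):
    logits = X_support @ W
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    W -= 0.1 * X_support.T @ (probs - np.eye(2)[y_support]) / (2 * k)

query = rng.normal(+1, 1, size=(1, 4))  # a query point drawn like class 1
print("predicted class:", (query @ W).argmax())
```

Is this equivalence correct, or does few-shot learning (as used in those blog posts) mean something that does not involve weight updates at all?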
Can anyone please help me?