
I have successfully trained a YOLO model to recognize $k$ classes. Now I want to add a class $k+1$ and train on top of the pre-trained weights without forgetting the previous $k$ classes. Ideally, I want to keep adding classes and training over the previous weights, i.e., train only on each new class. If I have to retrain on all $k+1$ classes every time a new class is added, it becomes too time-consuming: training $k$ classes takes $k \times 20000$ iterations, versus only $20000$ iterations per new class if I can add classes incrementally.

The dataset is balanced (5,000 training images per class).

I would appreciate any methods or techniques for doing this kind of continual training with YOLO.

1 Answer


There's a technique called Elastic Weight Consolidation (EWC) (Kirkpatrick et al., 2017) for preventing neural networks from forgetting previous tasks as they train on new ones. It might be helpful in your case too.

The main idea is to quantify how important each parameter is for task $t$ and to penalize the model in proportion to that importance when those parameters change while it trains on task $t+1$. This incentivizes the model to change the parameters that are less important for task $t$, which prevents it from forgetting that task.
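Concretely, EWC augments the new task's loss with a quadratic penalty weighted by the diagonal Fisher information $F_i$ of each parameter, anchored at the old task's solution $\theta^*_t$:

$$\mathcal{L}(\theta) = \mathcal{L}_{t+1}(\theta) + \sum_i \frac{\lambda}{2} F_i \left(\theta_i - \theta^*_{t,i}\right)^2$$

Below is a minimal PyTorch sketch of this penalty. It is framework-agnostic rather than YOLO-specific, and the helper names (`estimate_fisher`, `ewc_penalty`) and the default value of $\lambda$ are illustrative assumptions, not part of any YOLO codebase:

```python
import torch

def estimate_fisher(model, loader, loss_fn):
    # Diagonal Fisher estimate: average squared gradients of the
    # old task's loss over a sample of the old task's data.
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    model.eval()
    for x, y in loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / len(loader) for n, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, lam=1000.0):
    # lam/2 * sum_i F_i * (theta_i - theta*_i)^2, summed over parameters.
    # `old_params` is a snapshot of the weights after training on task t.
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * lam * penalty

# Usage when training on task t+1 (the new class):
#   fisher = estimate_fisher(model, old_task_loader, loss_fn)
#   old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   ...
#   loss = new_task_loss + ewc_penalty(model, fisher, old_params)
#   loss.backward(); optimizer.step()
```

The hyperparameter $\lambda$ trades off plasticity (learning the new class) against stability (retaining the old ones), so it typically needs tuning per setup.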
