Questions tagged [pretext-tasks]
For questions about pretext tasks (also known as auxiliary tasks) in the context of self-supervised learning.
4 questions
3 votes, 1 answer
Does self-supervised learning require auxiliary tasks?
Self-supervised learning algorithms provide labels automatically. But it is not clear what else is required for an algorithm to fall under the category "self-supervised":
Some say self-supervised learning algorithms learn on a set of auxiliary…

Make42
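A minimal sketch of the kind of auxiliary (pretext) task the question above refers to, assuming a PyTorch setup: rotation prediction, where the label is generated automatically from the image itself rather than by a human annotator.

# Minimal rotation-prediction pretext task (a sketch, assuming PyTorch):
# the "label" is simply which of four rotations was applied, so no human
# annotation is needed; that is what makes the task self-supervised.
import torch

def make_rotation_batch(images):
    """images: (N, C, H, W) tensor; returns rotated images and auto-generated labels."""
    rotated, labels = [], []
    for img in images:
        k = int(torch.randint(0, 4, (1,)))            # pick 0, 90, 180 or 270 degrees
        rotated.append(torch.rot90(img, k, dims=(1, 2)))
        labels.append(k)                              # the rotation index is the label
    return torch.stack(rotated), torch.tensor(labels)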
2 votes, 1 answer
Is it possible to use self-supervised learning on different images for the pretext and downstream tasks?
I have just come across the idea of self-supervised learning. It seems that it is possible to get higher accuracies on downstream tasks when the network is trained on pretext tasks.
Suppose that I want to do image classification on my own set of…

calveeen
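One common way the pretext/downstream split asked about above is realised in practice (a sketch assuming torchvision; the 4-way rotation head and the 10 downstream classes are illustrative placeholders, not the thread's answer): pre-train an encoder with a pretext head on the unlabeled images, then swap in a fresh head and fine-tune on the different, labeled downstream set.

# Sketch of pretext pre-training followed by a downstream classifier (torchvision assumed).
import torch.nn as nn
import torchvision

encoder = torchvision.models.resnet18(weights=None)
encoder.fc = nn.Linear(encoder.fc.in_features, 4)    # 4-way rotation head for the pretext task
# ... train `encoder` here on unlabeled images with automatically generated rotation labels ...

encoder.fc = nn.Linear(encoder.fc.in_features, 10)   # fresh head for, e.g., 10 downstream classes
# ... fine-tune `encoder` here on the different, labeled downstream dataset ...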
2 votes, 1 answer
Is it possible to pre-train a CNN in a self-supervised way so that it can later be used to solve an instance segmentation task?
I would like to use self-supervised learning (SSL) to learn features from images (the dataset consists of similar images with small differences), then use the resulting trained model to bootstrap an instance segmentation task.
I am thinking about…

Timco Vanco
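One possible way to bootstrap an instance-segmentation model from an SSL-pre-trained backbone (a sketch under assumed names: `ssl_resnet50.pth` is a hypothetical checkpoint produced by the pretext training; this is not the thread's accepted answer): load the pre-trained ResNet weights into a Mask R-CNN backbone and fine-tune on the segmentation annotations.

# Load SSL-pre-trained ResNet-50 weights into a Mask R-CNN backbone (torchvision sketch).
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

model = maskrcnn_resnet50_fpn(weights=None, num_classes=2)    # background + one object class
ssl_state = torch.load("ssl_resnet50.pth")                    # hypothetical SSL checkpoint
model.backbone.body.load_state_dict(ssl_state, strict=False)  # copy matching backbone weights only
# ... then train `model` on the labeled instance masks as usual ...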
0 votes, 1 answer
Self-Supervised Learning: application of the trained model... a bit confused
I am trying to apply a self-supervised task as described in this GitHub repo: Self-Supervised Sketch Recognition.
In this work, the authors use 345,000 image samples to train the model, and the dataset is constructed by rotating the images in…

T.Gulez
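As to what "applying" the pretext-trained model usually means in practice, a common pattern (a sketch with assumed names and shapes, not the repo's actual code) is to drop the rotation head and reuse the trained backbone inside the real sketch-recognition classifier.

# Reuse the rotation-pretext backbone for the downstream sketch classifier
# (assumes 512-dim features, as in a ResNet-18-style backbone).
import torch.nn as nn

class SketchClassifier(nn.Module):
    def __init__(self, pretrained_backbone, num_sketch_classes):
        super().__init__()
        self.backbone = pretrained_backbone                # weights learned on the rotation pretext task
        self.head = nn.Linear(512, num_sketch_classes)     # new head trained on the real sketch labels

    def forward(self, x):
        feats = self.backbone(x)                           # (N, 512) or (N, 512, 1, 1) features
        return self.head(feats.flatten(1))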