Feature request

After semi-supervised pretraining, can we do lightweight fine-tuning or few-shot learning instead of standard classification fine-tuning?
What is the expected behavior?
Instead of fine-tuning on a decent amount of labeled data, is it possible to do some lightweight fine-tuning (e.g., fine-tuning on fewer than 100 labeled examples) or few-shot learning, rather than the standard classification setup? A rough sketch of the lightweight option is below.
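To make the idea concrete, here is a minimal sketch of what lightweight fine-tuning could look like: freeze the pretrained encoder and train only a small classification head on the handful of labeled examples. This is not the library's API; `pretrained_encoder`, `emb_dim`, and the other names are placeholders for whatever the semi-supervised pretraining step actually produces.

```python
# Sketch: freeze the pretrained encoder, train only a small head on ~100 labels.
# All names here are illustrative assumptions, not the repo's actual API.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def finetune_head(pretrained_encoder, X_small, y_small, emb_dim, n_classes,
                  epochs=50, lr=1e-3):
    # Freeze every pretrained parameter so only the new head is updated.
    for p in pretrained_encoder.parameters():
        p.requires_grad = False
    pretrained_encoder.eval()

    head = nn.Linear(emb_dim, n_classes)
    optimizer = torch.optim.Adam(head.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    loader = DataLoader(TensorDataset(X_small, y_small),
                        batch_size=16, shuffle=True)
    for _ in range(epochs):
        for xb, yb in loader:
            with torch.no_grad():
                z = pretrained_encoder(xb)   # reuse pretrained representations
            loss = loss_fn(head(z), yb)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return head
```

Freezing the encoder keeps the number of trainable parameters tiny, which is usually what prevents ~100 labels from overfitting.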
What is the motivation or use case for adding/changing the behavior?
I only have a limited amount of labeled data to fine-tune the model with.
How should this be implemented in your opinion?
For few-shot learning, maybe change the loss function; one possible sketch is below.
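One concrete way to "change the loss function" would be a prototypical-network style loss: build one prototype per class from a small labeled support set and classify queries by distance to those prototypes. This is only a sketch under the assumption that the pretrained model exposes an `encoder` mapping inputs to fixed-size embeddings; none of these names come from the repo.

```python
# Sketch: prototypical-network style loss on top of a (pretrained) encoder.
# `encoder` and the tensor shapes are assumptions for illustration only.
import torch
import torch.nn.functional as F

def prototypical_loss(encoder, support_x, support_y, query_x, query_y, n_classes):
    z_support = encoder(support_x)   # (n_support, emb_dim)
    z_query = encoder(query_x)       # (n_query, emb_dim)

    # One prototype per class: the mean embedding of its support examples.
    prototypes = torch.stack([
        z_support[support_y == c].mean(dim=0) for c in range(n_classes)
    ])                               # (n_classes, emb_dim)

    # Classify queries by negative squared Euclidean distance to each prototype.
    dists = torch.cdist(z_query, prototypes) ** 2   # (n_query, n_classes)
    logits = -dists
    return F.cross_entropy(logits, query_y)
```

Because prediction is just distance to class prototypes, adding a new class only requires a few labeled support examples rather than retraining a full classification layer.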
Are you willing to work on this yourself?
yes