
Accuracy with or without finetune #35

Open
AdventureStory opened this issue Nov 23, 2022 · 2 comments

@AdventureStory

Hi, thanks for your great code. I am confused about the accuracy with and without finetuning on CIFAR-10-4K.
The results show that the accuracies without and with finetuning on CIFAR-10-4K are 96.01% and 96.08%, so finetuning gives no significant increase in accuracy. Does this mean we could train the student model using only unlabeled data (with pseudo labels generated by the teacher model) and obtain accuracy almost equal to training it with labeled data? And does it indirectly indicate that the pseudo labels generated by the teacher model are high quality?
I would appreciate it if you could give me a response. Thanks!

@wowotoushiwo

I am also puzzled by this question.

@kekmodel
Owner

kekmodel commented Dec 11, 2022

Does it indirectly indicate that the pseudo labels generated by the teacher model are high quality?

This is because all of the training data is used as unlabeled data. I agree with this opinion.
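
For anyone reading later, here is a minimal sketch of what "train the student only on teacher-generated pseudo labels" means. It is not the repository's actual training loop, and the names `teacher`, `student`, `unlabeled_loader`, `optimizer`, and `device` are hypothetical placeholders:

```python
import torch
import torch.nn.functional as F

def train_student_on_pseudo_labels(teacher, student, unlabeled_loader, optimizer, device):
    # Sketch: the teacher is frozen and only provides hard pseudo labels;
    # the student never sees the ground-truth labels.
    teacher.eval()
    student.train()
    for images, _ in unlabeled_loader:  # ground-truth labels are ignored
        images = images.to(device)
        with torch.no_grad():
            pseudo_labels = teacher(images).argmax(dim=1)
        logits = student(images)
        loss = F.cross_entropy(logits, pseudo_labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

If the teacher's pseudo labels are accurate enough, this loop alone can bring the student close to the accuracy obtained with labeled supervision, which matches the small gap between the finetuned and non-finetuned results above.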
