🌟 When Semi-Supervised Learning Meets Transfer Learning: Training Strategies, Models and Datasets
- Combines a semi-supervised scheme with fine-tuning from a pretrained model
- Incorporates SSL into the fine-tuning stage (sketch after this list)
- Comprehensive empirical analysis; three conclusions:
    - The gain of SSL over the fully-supervised baseline (trained on the same reduced labeled set) is smaller once transfer learning is used
    - When the target domain shifts away from the pretraining domain, SSL helps more
    - Some SSL methods can even outperform full-label training
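A minimal sketch of the core recipe: fine-tune an ImageNet-pretrained backbone with a joint supervised + consistency objective. The backbone choice, `num_classes`, `augment`, and `lambda_u` are hypothetical placeholders, not names or values from the paper:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

# Hypothetical setup: num_classes, augment and lambda_u are placeholders.
num_classes = 67                                  # e.g. Indoor67
model = models.resnet50(pretrained=True)          # transfer: start from ImageNet weights
model.fc = nn.Linear(model.fc.in_features, num_classes)
opt = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
lambda_u = 1.0                                    # weight of the SSL term (hyperparameter)

def train_step(x_l, y_l, x_u, augment):
    """One joint step: supervised CE on labeled data, consistency on unlabeled."""
    sup = F.cross_entropy(model(x_l), y_l)
    p1 = F.softmax(model(augment(x_u)), dim=1)            # student view
    p2 = F.softmax(model(augment(x_u)), dim=1).detach()   # target view, no gradient
    unsup = F.mse_loss(p1, p2)                            # Pi-Model-style consistency
    loss = sup + lambda_u * unsup
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```

The only change versus plain fine-tuning is the extra unsupervised term; setting `lambda_u = 0` recovers the supervised baseline the paper compares against.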
Related Work
- SSL
- Consistency-Regularization-based
- Ladder Network
- Pi-Model
        - Mean-Teacher (EMA teacher; sketch below)
- GANs & Adversarial Training
        - VAT - Virtual Adversarial Training (sketch below)
- Entropy-based SSL
        - Entropy Minimization (sketch below)
- Co-Training
- Tri-Learning - Relabelling
- Transfer Learning
    - Fine-tuning
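Mean Teacher keeps a second copy of the network whose weights are an exponential moving average (EMA) of the student's; consistency is enforced between student and teacher predictions. A sketch of the EMA machinery, with `alpha=0.99` as a common (not paper-specified) decay:

```python
import copy
import torch

def make_teacher(student):
    """Create the teacher as a frozen copy of the student."""
    teacher = copy.deepcopy(student)
    for p in teacher.parameters():
        p.requires_grad_(False)          # teacher is updated by EMA, never by SGD
    return teacher

@torch.no_grad()
def ema_update(teacher, student, alpha=0.99):
    """teacher = alpha * teacher + (1 - alpha) * student, parameter-wise."""
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(alpha).add_(s, alpha=1 - alpha)
```

After each optimizer step on the student, call `ema_update`; the unsupervised loss then compares student and teacher outputs on two perturbed views of the same unlabeled batch.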
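VAT replaces random perturbations with the adversarial direction: the small `r` that changes the prediction the most, found by power iteration. A sketch assuming NCHW image tensors; `xi`, `eps`, and `n_power` are typical defaults, not this paper's settings:

```python
import torch
import torch.nn.functional as F

def vat_loss(model, x, xi=1e-6, eps=2.0, n_power=1):
    """Virtual adversarial loss: KL between predictions on x and on x + r_adv."""
    with torch.no_grad():
        p = F.softmax(model(x), dim=1)            # clean prediction (fixed target)
    d = torch.randn_like(x)                       # random start direction
    for _ in range(n_power):                      # power iteration toward the
        d = d.detach().requires_grad_(True)       # most sensitive direction
        kl = F.kl_div(F.log_softmax(model(x + xi * d), dim=1), p,
                      reduction='batchmean')
        d = torch.autograd.grad(kl, d)[0]
        d = d / (d.flatten(1).norm(dim=1).view(-1, 1, 1, 1) + 1e-8)
    r_adv = eps * d.detach()                      # worst-case perturbation
    return F.kl_div(F.log_softmax(model(x + r_adv), dim=1), p,
                    reduction='batchmean')
```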
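Entropy minimization needs no second forward pass: it simply penalizes the entropy of the predictions on unlabeled data, pushing the model toward confident decision boundaries. A minimal sketch:

```python
import torch.nn.functional as F

def entropy_loss(logits):
    """Mean Shannon entropy of the softmax predictions (to be minimized)."""
    log_p = F.log_softmax(logits, dim=1)
    return -(log_p.exp() * log_p).sum(dim=1).mean()
```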
Datasets
- Transfer from ImageNet
    - Indoor (67 scene categories, 6,700 images)
    - CUB200 (200 fine-grained bird categories)
    - MURA (medical images: musculoskeletal radiographs, i.e. bone X-rays)
Experiments
- 20 epochs
- 1k labels
- Augmentation: F + T (perturbation sketch after this list)
- F - HorizontalFlip
- T - RandomTranslation
- N - Gaussian Noise
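The three perturbations abbreviated above, as a torchvision pipeline; the translation fraction and noise standard deviation are guesses, not values from the paper:

```python
import torch
import torchvision.transforms as T

class GaussianNoise:
    """'N': additive Gaussian noise on an image tensor (std is a guess)."""
    def __init__(self, std=0.05):
        self.std = std
    def __call__(self, x):
        return (x + torch.randn_like(x) * self.std).clamp(0.0, 1.0)

perturb = T.Compose([
    T.RandomHorizontalFlip(),                         # 'F'
    T.RandomAffine(degrees=0, translate=(0.1, 0.1)),  # 'T'
    T.ToTensor(),
    GaussianNoise(),                                  # 'N'
])
```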
Conclusion
- When you have enough labeled images, better pre-trained models seem to counteract the influence of SSL methods.