Enhanced ProtoNet With Self-Knowledge Distillation for Few-Shot Learning


Habib M. E. H., KÜÇÜKMANİSA A., URHAN O.

IEEE ACCESS, vol. 12, pp. 145331-145340, 2024 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 12
  • Publication Date: 2024
  • DOI: 10.1109/access.2024.3472530
  • Journal Name: IEEE ACCESS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, INSPEC, Directory of Open Access Journals
  • Pages: pp. 145331-145340
  • Kocaeli University Affiliated: Yes

Abstract

Few-Shot Learning (FSL) has recently gained increased attention for its effectiveness in addressing the problem of data scarcity. Many approaches based on the FSL idea have been proposed, including prototypical networks (ProtoNet), which address this issue effectively while keeping the architecture simple. Meanwhile, the self-knowledge distillation (SKD) technique has become a popular way to help FSL models achieve good performance by transferring knowledge gained from additional training data. In this work, we apply self-knowledge distillation to ProtoNet to boost its performance. For each task, we compute prototypes from the few examples (local prototypes) and from the many examples (global prototypes), and we use the global prototypes to distill knowledge into the few-shot learner model. We employ different distillation techniques based on prototypes, logits, and predictions (soft labels). We evaluate our method on three popular FSL image classification benchmark datasets: CIFAR-FS, CIFAR-FC100, and miniImageNet. Our approach outperforms the baseline and achieves results competitive with state-of-the-art methods, especially on the CIFAR-FC100 dataset.
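As a rough illustration of the idea described in the abstract, the PyTorch-style sketch below computes local prototypes from the few support examples, global prototypes from a larger pool, and a combined loss that distills the global-prototype predictions (soft labels) into the few-shot learner. This is a minimal sketch under stated assumptions, not the paper's implementation: all names (prototypes, proto_logits, skd_loss, encoder) and hyperparameters (tau, alpha) are hypothetical, and only the prediction-based variant of the three distillation options named in the abstract is shown.

```python
import torch
import torch.nn.functional as F

def prototypes(embeddings, labels, n_classes):
    # Standard ProtoNet prototype: mean embedding per class.
    return torch.stack([embeddings[labels == c].mean(dim=0)
                        for c in range(n_classes)])

def proto_logits(queries, protos):
    # Classify by negative squared Euclidean distance to each prototype.
    return -torch.cdist(queries, protos).pow(2)

def skd_loss(encoder, support_x, support_y, extra_x, extra_y,
             query_x, query_y, n_classes, tau=4.0, alpha=0.5):
    # Embed all inputs with the shared backbone.
    z_s, z_e, z_q = encoder(support_x), encoder(extra_x), encoder(query_x)

    # Local prototypes from the few support examples; global prototypes
    # from the larger pool (support plus additional examples).
    local_p = prototypes(z_s, support_y, n_classes)
    global_p = prototypes(torch.cat([z_s, z_e]),
                          torch.cat([support_y, extra_y]), n_classes)

    student = proto_logits(z_q, local_p)        # few-shot learner
    with torch.no_grad():
        teacher = proto_logits(z_q, global_p)   # distillation target

    # Task loss plus temperature-scaled soft-label distillation (KL).
    ce = F.cross_entropy(student, query_y)
    kd = F.kl_div(F.log_softmax(student / tau, dim=1),
                  F.softmax(teacher / tau, dim=1),
                  reduction="batchmean") * tau ** 2
    return (1 - alpha) * ce + alpha * kd
```

In this sketch, alpha balances the standard prototypical cross-entropy against the KL-based soft-label term; per the abstract, the paper also distills via prototypes and logits directly, which would replace the KL term with a loss defined on those quantities.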