Abstract
Knowledge distillation enables fast and effective transfer of features learned by a larger model to a smaller one. However, distillation objectives are susceptible to sub-population shift, a common scenario in medical image analysis in which some groups/domains of data are underrepresented in the training set. For instance, training models on health data acquired from multiple scanners or hospitals can yield subpar performance for minority groups. In this paper, inspired by distributionally robust optimization (DRO) techniques, we address this shortcoming by proposing a group-aware distillation loss. During optimization, a set of group weights is updated based on the per-group losses at a given iteration. This way, our method can dynamically focus on groups that have low performance during training. We empirically validate our method, GroupDistil, on two benchmark datasets (natural images and cardiac MRIs) and show consistent improvements in worst-group accuracy.
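The abstract describes a group-DRO-style reweighting of the distillation objective. The following is a minimal sketch of such a loss in PyTorch, assuming the standard exponentiated-gradient update over per-group losses and a temperature-scaled KL distillation term; names such as `GroupDistillLoss`, `temperature`, and `eta` are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


class GroupDistillLoss(torch.nn.Module):
    """Sketch of a group-aware distillation loss (group-DRO over per-group KL terms)."""

    def __init__(self, num_groups: int, temperature: float = 4.0, eta: float = 0.1):
        super().__init__()
        self.temperature = temperature  # softens teacher/student logits
        self.eta = eta                  # step size for the group-weight update
        # One weight per group, kept normalized to a probability vector.
        self.register_buffer(
            "group_weights", torch.full((num_groups,), 1.0 / num_groups)
        )

    def forward(self, student_logits, teacher_logits, group_ids):
        T = self.temperature
        # Per-sample KL divergence between softened teacher and student predictions.
        per_sample_kl = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="none",
        ).sum(dim=1) * (T * T)

        # Average the distillation loss within each group present in the batch.
        group_losses = torch.zeros_like(self.group_weights)
        for g in group_ids.unique():
            group_losses[g] = per_sample_kl[group_ids == g].mean()

        # Exponentiated-gradient update: groups with higher loss receive more
        # weight at the next iteration (group-DRO style), then renormalize.
        with torch.no_grad():
            self.group_weights *= torch.exp(self.eta * group_losses)
            self.group_weights /= self.group_weights.sum()

        # The student minimizes the weighted sum of per-group losses.
        return (self.group_weights * group_losses).sum()
```

In practice this term would typically be combined with a standard supervised loss (e.g. cross-entropy on the student's unscaled logits); the group weights persist across batches, so groups that remain hard keep accumulating weight over training.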
| Original language | English |
|---|---|
| Title of host publication | Machine Learning in Medical Imaging - 14th International Workshop, MLMI 2023, Held in Conjunction with MICCAI 2023, Proceedings |
| Editors | Xiaohuan Cao, Xi Ouyang, Xuanang Xu, Islem Rekik, Zhiming Cui |
| Publisher | Springer |
| Pages | 234-242 |
| Number of pages | 9 |
| ISBN (Print) | 9783031456756 |
| DOIs | |
| Publication status | Published - 15 Oct 2023 |
| Event | 14th International Workshop on Machine Learning in Medical Imaging, MLMI 2023 - Vancouver, Canada Duration: 8 Oct 2023 → 8 Oct 2023 |
Publication series
| Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
|---|---|
| Volume | 14349 LNCS |
| ISSN (Print) | 0302-9743 |
| ISSN (Electronic) | 1611-3349 |
Conference
| Conference | 14th International Workshop on Machine Learning in Medical Imaging, MLMI 2023 |
|---|---|
| Country/Territory | Canada |
| City | Vancouver |
| Period | 8/10/23 → 8/10/23 |
Keywords
- Classification
- Invariance
- Knowledge Distillation
- Sub-population Shift
Projects
- Canon Medical / RAEng Senior Research Fellow in Healthcare AI
  Tsaftaris, S. (Principal Investigator)
  31/03/19 → 30/06/26
  Project: Research
- From trivial representations to learning concepts in AI by exploiting unique data
  Tsaftaris, S. (Principal Investigator)
  Engineering and Physical Sciences Research Council
  1/02/23 → 16/05/25
  Project: Research