Abstract
Multi-task learning (MTL) aims to learn a single model that performs multiple tasks, achieving good performance on all tasks at a lower computational cost. Learning such a model requires jointly optimizing the losses of a set of tasks that differ in difficulty, magnitude, and characteristics (e.g. cross-entropy vs. Euclidean loss), which leads to an imbalance problem in multi-task learning. To address this imbalance, we propose a knowledge distillation based method. We first learn a task-specific model for each task. We then train the multi-task model to minimize the task-specific losses while producing the same features as the task-specific models. Since each task-specific network encodes different features, we introduce small task-specific adaptors that project the multi-task features onto the task-specific features. In this way, the adaptors align the task-specific and multi-task features, enabling balanced parameter sharing across tasks. Extensive experimental results demonstrate that our method optimizes a multi-task learning model in a more balanced way and achieves better overall performance.
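The objective described above combines per-task supervised losses with a feature-alignment (distillation) term, where a small adaptor projects the shared multi-task feature into each task's feature space. The sketch below illustrates this idea only; the feature dimensions, the linear form of the adaptors, the placeholder task losses, and the weight `lam` are all illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): an 8-dim shared feature
# and two tasks with 8-dim task-specific features.
d = 8
num_tasks = 2

# Frozen "teacher" features: stand-ins for the outputs of the
# pre-trained single-task networks on one input.
teacher_feats = [rng.normal(size=d) for _ in range(num_tasks)]

# Shared "student" feature from the multi-task network on the same input.
mt_feat = rng.normal(size=d)

# Small task-specific adaptors, here a single linear map per task.
adaptors = [rng.normal(size=(d, d)) * 0.1 for _ in range(num_tasks)]

# Placeholder per-task supervised losses (e.g. cross-entropy,
# Euclidean loss); in training these come from the task heads.
task_losses = [0.7, 1.3]

lam = 0.5  # illustrative weight on the distillation term

def total_loss(mt_feat, adaptors, teacher_feats, task_losses, lam):
    """Sum of task losses plus adaptor-aligned feature distillation."""
    loss = 0.0
    for A, f_t, l_task in zip(adaptors, teacher_feats, task_losses):
        aligned = A @ mt_feat                   # adaptor projects shared feature
        distill = np.sum((aligned - f_t) ** 2)  # match the teacher feature
        loss += l_task + lam * distill
    return loss

print(total_loss(mt_feat, adaptors, teacher_feats, task_losses, lam))
```

Because each task gets its own adaptor, the shared feature is not forced to equal any single task's feature directly, which is what allows the balanced sharing the abstract describes.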
| Field | Value |
|---|---|
| Original language | English |
| Title of host publication | Computer Vision – ECCV 2020 Workshops |
| Editors | Adrien Bartoli, Andrea Fusiello |
| Publisher | Springer |
| Pages | 163-176 |
| Number of pages | 14 |
| ISBN (Electronic) | 978-3-030-65414-6 |
| ISBN (Print) | 978-3-030-65413-9 |
| DOIs | |
| Publication status | Published - 5 Jan 2021 |
| Event | Workshops held at the 16th European Conference on Computer Vision, Glasgow, United Kingdom. Duration: 23 Aug 2020 → 28 Aug 2020. https://eccv2020.eu |
Publication series
| Field | Value |
|---|---|
| Name | Lecture Notes in Computer Science |
| Publisher | Springer |
| Volume | 12540 |
| ISSN (Print) | 0302-9743 |
| ISSN (Electronic) | 1611-3349 |
Conference
| Field | Value |
|---|---|
| Conference | Workshops held at the 16th European Conference on Computer Vision |
| Abbreviated title | ECCV 2020 |
| Country/Territory | United Kingdom |
| City | Glasgow |
| Period | 23/08/20 → 28/08/20 |
| Internet address | https://eccv2020.eu |