Self-Supervised Knowledge Transfer via Loosely Supervised Auxiliary Tasks

Knowledge transfer using convolutional neural networks (CNNs) can help efficiently train a CNN with fewer parameters or maximize generalization performance under limited supervision. To enable a more efficient transfer of pretrained knowledge under relaxed conditions, we propose a simple yet powerful knowledge transfer methodology with no restrictions on the network structure or dataset used: self-supervised knowledge transfer (SSKT) via loosely supervised auxiliary tasks. To this end, we devise a training methodology that transfers previously learned knowledge into the current training process as an auxiliary task for the target task, through self-supervision using soft labels. SSKT is independent of the network structure and dataset, and is trained differently from existing knowledge transfer methods; hence, prior knowledge acquired from various tasks can be naturally transferred into the training of the target task. Furthermore, it can improve generalization performance on most datasets through the proposed knowledge transfer between different problem domains from multiple source networks. In experiments under various knowledge transfer settings, SSKT outperforms other transfer learning methods (KD, DML, and MAXL). The source code will be made available to the public. (Accepted at WACV 2022.)
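The abstract describes the mechanism only at a high level. Below is a minimal, illustrative sketch (not the authors' released implementation) of how a soft-label auxiliary task could be attached to a target network in PyTorch. The names (`TargetWithAuxiliary`, `sskt_style_loss`, `lambda_aux`, `temperature`) and the exact loss form (target-task cross entropy plus a temperature-scaled KL term against a frozen source network's predictions) are assumptions made for illustration.

```python
# Hypothetical sketch of a soft-label auxiliary task for knowledge transfer.
# A frozen, pretrained source network supplies soft labels that supervise an
# auxiliary head on the target network, alongside the ordinary target-task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetWithAuxiliary(nn.Module):
    """Target network with a main head and an auxiliary head (illustrative)."""
    def __init__(self, backbone, feat_dim, num_target_classes, num_source_classes):
        super().__init__()
        self.backbone = backbone                                  # shared feature extractor
        self.target_head = nn.Linear(feat_dim, num_target_classes)
        self.aux_head = nn.Linear(feat_dim, num_source_classes)   # predicts source-task soft labels

    def forward(self, x):
        feats = self.backbone(x)
        return self.target_head(feats), self.aux_head(feats)

def sskt_style_loss(target_logits, aux_logits, labels, source_logits,
                    lambda_aux=1.0, temperature=4.0):
    """Target-task cross entropy plus a soft-label auxiliary term (assumed form)."""
    ce = F.cross_entropy(target_logits, labels)
    # Soft labels come from a frozen source network's logits on the same batch,
    # so the auxiliary supervision needs no additional human annotation.
    soft_targets = F.softmax(source_logits / temperature, dim=1)
    log_probs = F.log_softmax(aux_logits / temperature, dim=1)
    aux = F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2
    return ce + lambda_aux * aux
```

In use, the pretrained source network would be run in evaluation mode (no gradients) on each training batch to produce `source_logits`, and `lambda_aux` would balance the target-task loss against the auxiliary soft-label loss.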

Bibliographic Details
Main Authors: Hong, Seungbum, Yoon, Jihun, Kim, Junmo, Choi, Min-Kook
Format: Preprint
Language: unknown
Published: arXiv 2021
Subjects: Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences
Online Access:https://dx.doi.org/10.48550/arxiv.2110.12696
https://arxiv.org/abs/2110.12696
Rights: Creative Commons Attribution 4.0 International (CC-BY-4.0), https://creativecommons.org/licenses/by/4.0/legalcode