DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations

Read paper: https://www.aclanthology.org/2021.acl-long.72

Full description

Abstract: Sentence embeddings are an important component of many natural language processing (NLP) systems. Like word embeddings, sentence embeddings are typically learned on large text corpora and then transferred to various downstream tasks, such as clustering and retrieval. Unlike word embeddings, the highest-performing solutions for learning sentence embeddings require labelled data, limiting their usefulness to languages and domains where labelled data is abundant. In this paper, we present DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations. Inspired by recent advances in deep metric learning (DML), we carefully design a self-supervised objective for learning universal sentence embeddings that does not require labelled training data. When used to extend the pretraining of transformer-based language models, our approach closes the performance gap between unsupervised and supervised pretraining for universal sentence encoders. Importantly, our experiments suggest that the quality of the learned embeddings scales with both the number of trainable parameters and the amount of unlabelled training data. Our code and pretrained models are publicly available and can be easily adapted to new domains or used to embed unseen text.
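To make the abstract's "self-supervised contrastive objective" concrete, the sketch below shows a generic InfoNCE / NT-Xent style loss over paired span embeddings, where spans from the same document serve as positives and other spans in the batch as in-batch negatives. This is a minimal illustration in the spirit of the paper, not the authors' exact span-sampling or loss implementation; all function and variable names are ours.

```python
# Minimal sketch (assumption: not the authors' exact implementation) of a
# contrastive objective over span embeddings in the spirit of DeCLUTR:
# paired spans from the same document are positives, all other spans in the
# batch act as negatives (an InfoNCE / NT-Xent style loss).
import torch
import torch.nn.functional as F


def contrastive_loss(anchor_emb: torch.Tensor,
                     positive_emb: torch.Tensor,
                     temperature: float = 0.05) -> torch.Tensor:
    """anchor_emb, positive_emb: (batch, dim) embeddings of paired spans."""
    # Normalize so dot products become cosine similarities.
    a = F.normalize(anchor_emb, dim=-1)
    p = F.normalize(positive_emb, dim=-1)
    # Similarity of every anchor to every candidate span in the batch.
    logits = a @ p.t() / temperature            # (batch, batch)
    # The true pair sits on the diagonal; off-diagonal entries are negatives.
    targets = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, targets)


# Toy usage: in practice the embeddings would come from mean-pooling the
# token representations of a transformer encoder over sampled text spans.
if __name__ == "__main__":
    anchors = torch.randn(8, 768)
    positives = anchors + 0.1 * torch.randn(8, 768)  # stand-in for nearby spans
    print(contrastive_loss(anchors, positives).item())
```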

Bibliographic Details
Main Authors: Giorgi, John; Nitski, Osvald; Wang, Bo; Bader, Gary
Conference: The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
Format: Conference talk (audiovisual article)
Language: unknown
Published: Underline Science Inc., 2021
Subjects: DML; Neural Network; Electromagnetism; Computational Linguistics; Condensed Matter Physics; FOS Physical sciences; Deep Learning; Semantics; Information and Knowledge Engineering
Online Access: https://dx.doi.org/10.48448/kfq5-k490
https://underline.io/lecture/25949-declutr-deep-contrastive-learning-for-unsupervised-textual-representations
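Since the abstract notes that the code and pretrained models are publicly available and can be used to embed unseen text, here is a hedged usage sketch with Hugging Face Transformers. The checkpoint id below is an assumption based on the authors' public release and may differ; the mean-pooling pattern applies to any transformer encoder.

```python
# Hedged sketch of embedding unseen text with a publicly released DeCLUTR
# checkpoint via Hugging Face Transformers. The checkpoint name is an
# assumption and may differ from the actual released model id.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "johngiorgi/declutr-base"  # assumed checkpoint id

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

texts = [
    "A smiling costumed woman is holding an umbrella.",
    "A happy woman in a fairy costume holds an umbrella.",
]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    token_states = model(**inputs).last_hidden_state      # (batch, seq, dim)

# Mean-pool token states over non-padding positions: one vector per text.
mask = inputs["attention_mask"].unsqueeze(-1)              # (batch, seq, 1)
embeddings = (token_states * mask).sum(1) / mask.sum(1)

# Cosine similarity between the two sentence embeddings.
sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {sim.item():.3f}")
```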