About: Transfer Learning or Self-supervised Learning? A Tale of Two Pretraining Paradigms
An Entity of Type schema:ScholarlyArticle, within Data Space covidontheweb.inria.fr, associated with source document(s).
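Records in this Data Space are served by a SPARQL endpoint and can be retrieved programmatically. Below is a minimal Python sketch using SPARQLWrapper; the endpoint URL (https://covidontheweb.inria.fr/sparql) and the expansion of the covid: prefix to http://ns.inria.fr/covid19/ are assumptions inferred from the Data Space name, not facts stated on this page. The query lists every property/value pair attached to this article's record.

    # Minimal sketch: list this record's properties. The endpoint URL and
    # the covid: prefix expansion are assumptions, not taken from this page.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://covidontheweb.inria.fr/sparql"  # assumed endpoint
    PAPER = ("http://ns.inria.fr/covid19/"              # assumed covid: prefix
             "807df832801763af55419262aadf7cc22d196901")

    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setReturnFormat(JSON)
    sparql.setQuery(f"SELECT ?p ?o WHERE {{ <{PAPER}> ?p ?o }}")

    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["p"]["value"], "->", row["o"]["value"])

If the assumptions hold, the output mirrors the attribute/value pairs listed below.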
Attributes and Values

type: Academic Article; research paper; schema:ScholarlyArticle
isDefinedBy: Covid-on-the-Web dataset
title: Transfer Learning or Self-supervised Learning? A Tale of Two Pretraining Paradigms
Creator: Yang, Yue; He, Xuehai; Xie, Pengtao; Yang, Xingyi; Liang, Yuxiao; »more»
source: ArXiv
abstract:
Pretraining has become a standard technique in computer vision and natural language processing, and usually improves performance substantially. Previously, the dominant pretraining method was transfer learning (TL), which uses labeled data to learn a good representation network. Recently, a new pretraining approach -- self-supervised learning (SSL) -- has demonstrated promising results on a wide range of applications. SSL does not require annotated labels; it is conducted purely on the input data, by solving auxiliary tasks defined on the input examples. Currently reported results show that SSL outperforms TL in certain applications, and vice versa in others. There has not been a clear understanding of which properties of data and tasks make one approach outperform the other. Without an informed guideline, ML researchers have to try both methods and determine empirically which one is better, which is usually time-consuming. In this work, we aim to address this problem. We perform a comprehensive comparative study of SSL and TL regarding which works better under different properties of data and tasks, including the domain difference between source and target tasks, the amount of pretraining data, class imbalance in the source data, and the use of target data for additional pretraining. The insights distilled from our comparative studies can help ML researchers decide which method to use based on the properties of their applications.
has issue date: 2020-06-19 (xsd:dateTime)
has license: arxiv
sha1sum (hex): 807df832801763af55419262aadf7cc22d196901
resource representing a document's title: Transfer Learning or Self-supervised Learning? A Tale of Two Pretraining Paradigms
resource representing a document's body: covid:807df832801763af55419262aadf7cc22d196901#body_text
is schema:about of: named entity 'target'; named entity 'Preprint'; named entity 'problem'; named entity 'wide range'; named entity 'Learning'; »more»
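To make the contrast in the abstract concrete, here is a minimal PyTorch sketch of the two pretraining paradigms. It is illustrative only, not the authors' code: the ResNet-50 backbone, the projector sizes, and the SimCLR-style contrastive objective are assumptions standing in for whichever networks and auxiliary tasks a given study uses.

    # Illustrative sketch of the two pretraining paradigms (assumed setup,
    # not the paper's exact configuration).
    import torch
    import torch.nn as nn
    import torchvision.models as models

    # Transfer learning (TL): reuse weights learned with *labeled* source
    # data (here ImageNet), then swap in a fresh head for the target task.
    tl_net = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    tl_net.fc = nn.Linear(tl_net.fc.in_features, 10)  # new target head

    # Self-supervised learning (SSL): no labels; learn representations by
    # solving an auxiliary task on the inputs themselves, e.g. a contrastive
    # objective that pulls two augmented views of the same image together.
    ssl_net = models.resnet50(weights=None)
    ssl_net.fc = nn.Identity()                        # expose 2048-d features
    projector = nn.Sequential(nn.Linear(2048, 256), nn.ReLU(),
                              nn.Linear(256, 128))

    def info_nce(z1, z2, temperature=0.5):
        # SimCLR-style loss: each view's positive is the other view of the
        # same image; every other sample in the batch acts as a negative.
        n = z1.shape[0]
        z = nn.functional.normalize(torch.cat([z1, z2]), dim=1)
        sim = z @ z.t() / temperature
        sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool),
                              float("-inf"))          # exclude self-pairs
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
        return nn.functional.cross_entropy(sim, targets)

    # One SSL step on unlabeled data (random tensors stand in for two
    # augmented views of the same batch of images).
    v1, v2 = torch.randn(8, 3, 224, 224), torch.randn(8, 3, 224, 224)
    loss = info_nce(projector(ssl_net(v1)), projector(ssl_net(v2)))
    loss.backward()

In both cases the pretrained backbone is subsequently fine-tuned on the target task; the paper's question is which initialization wins under which properties of the data and tasks.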