RoBERTa: A Robustly Optimized BERT Pretraining Approach (Q106497440)
scientific article
Statements
instance of: scholarly article
title: RoBERTa: A Robustly Optimized BERT Pretraining Approach (English)
main subject: RoBERTa
author:
    Naman Goyal (series ordinal: 3; object named as: Naman Goyal)
    Luke Zettlemoyer (series ordinal: 9)
    Veselin Stoyanov (series ordinal: 10; object named as: Veselin Stoyanov)
author name string:
    Yinhan Liu (series ordinal: 1)
    Myle Ott (series ordinal: 2)
    Jingfei Du (series ordinal: 4)
    Mandar Joshi (series ordinal: 5)
    Danqi Chen (series ordinal: 6)
    Omer Levy (series ordinal: 7)
    Mike Lewis (series ordinal: 8)
publication date: 26 July 2019
full work available at URL: https://arxiv.org/pdf/1907.11692.pdf
arXiv classification: cs.CL
cites work:
    BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
    Cloze-driven Pretraining of Self-attention Networks (series ordinal: 2)
    KERMIT: Generative Insertion-Based Modeling for Sequences (series ordinal: 6)
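The statements above are stored as machine-readable claims that can be retrieved through Wikidata's public wbgetentities API. A minimal Python sketch follows (assuming the third-party requests package is installed; P577 is Wikidata's property ID for "publication date", and the same claims dictionary carries the other properties shown above, e.g. P50 for "author"):

    # Minimal sketch: fetch this item's claims from the public Wikidata API.
    # Assumes the third-party `requests` package is installed.
    import requests

    API = "https://www.wikidata.org/w/api.php"
    params = {
        "action": "wbgetentities",
        "ids": "Q106497440",
        "props": "claims",
        "format": "json",
    }
    entity = requests.get(API, params=params).json()["entities"]["Q106497440"]

    # P577 is the "publication date" property; each claim wraps its value
    # in a mainsnak/datavalue structure.
    for claim in entity["claims"].get("P577", []):
        print(claim["mainsnak"]["datavalue"]["value"]["time"])
        # expected: +2019-07-26T00:00:00Z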
Identifiers
DOI: 10.48550/ARXIV.1907.11692
arXiv ID: 1907.11692
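Either identifier resolves to the same record: the DOI through doi.org, and the arXiv ID through arXiv's public Atom API. A sketch of the latter, again assuming the requests package:

    # Sketch: resolve the arXiv ID above through arXiv's public Atom API.
    # Assumes the third-party `requests` package; XML parsing is stdlib.
    import xml.etree.ElementTree as ET

    import requests

    resp = requests.get(
        "http://export.arxiv.org/api/query",
        params={"id_list": "1907.11692"},
    )
    root = ET.fromstring(resp.text)
    ns = {"atom": "http://www.w3.org/2005/Atom"}

    # The feed's first <entry> should carry the same title as the
    # title statement recorded on this item.
    entry = root.find("atom:entry", ns)
    print(entry.find("atom:title", ns).text)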
Sitelinks
No entries on any linked project (Wikipedia, Wikibooks, Wikinews, Wikiquote, Wikisource, Wikiversity, Wikivoyage, Wiktionary, or the multilingual sites).