Cloze-driven Pretraining of Self-attention Networks (Q62117856)
scientific article published on 19 March 2019
Statements
instance of: scholarly article
title: Cloze-driven Pretraining of Self-attention Networks (English)
author:
  Alexei Baevski (series ordinal 1; object named as "Alexei Baevski")
  Luke Zettlemoyer (series ordinal 4)
  Michael Auli (series ordinal 5; object named as "Michael Auli")
author name string:
  Sergey Edunov (series ordinal 2)
  Yinhan Liu (series ordinal 3)
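The "series ordinal" qualifiers on the author statements encode the paper's author order. A minimal sketch of recovering that order, using only the (name, ordinal) pairs stated above; the dict shape is illustrative, not Wikidata's actual entity JSON:

```python
# (name, series ordinal) pairs copied from this item's "author" and
# "author name string" statements; the record layout is an assumption,
# not the Wikibase data model.
authors = [
    {"name": "Luke Zettlemoyer", "series_ordinal": 4},
    {"name": "Michael Auli", "series_ordinal": 5},
    {"name": "Alexei Baevski", "series_ordinal": 1},
    {"name": "Sergey Edunov", "series_ordinal": 2},
    {"name": "Yinhan Liu", "series_ordinal": 3},
]

# Sorting by the qualifier yields the byline order.
ordered = [a["name"] for a in sorted(authors, key=lambda a: a["series_ordinal"])]
print(ordered)
# → ['Alexei Baevski', 'Sergey Edunov', 'Yinhan Liu', 'Luke Zettlemoyer', 'Michael Auli']
```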
publication date: 19 March 2019
full work available at URL: https://arxiv.org/pdf/1903.07785.pdf
describes a project that uses:
  English Wikipedia
  Corpus of Linguistic Acceptability
arXiv classification: cs.CL
cites work:
  BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Attention Is All You Need
Identifiers
arXiv ID: 1903.07785
Sitelinks
Wikipedia, Wikibooks, Wikinews, Wikiquote, Wikisource, Wikiversity, Wikivoyage, Wiktionary, multilingual sites: no entries