RoBERTa (Q85124095)
deep learning neural network for natural language processing
Statements
instance of: language model
  reference: reference URL https://ai.facebook.com/blog/roberta-an-optimized-method-for-pretraining-self-supervised-nlp-systems/, retrieved 30 May 2021
subclass of: Bidirectional Encoder Representations from Transformers (BERT)
described at URL: https://pytorch.org/hub/pytorch_fairseq_roberta/ (language of work or name: English)
described by source: RoBERTa: A Robustly Optimized BERT Pretraining Approach
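The statements above follow Wikidata's entity data model: each item carries labels, descriptions, and claims keyed by property IDs (P31 = instance of, P279 = subclass of, P973 = described at URL, P407 = language of work or name, P1343 = described by source). A minimal sketch of how this item's statements map onto that layout, using a simplified dict in place of the full Wikidata JSON:

```python
# Simplified stand-in for the Wikidata entity JSON of Q85124095.
# Real entity JSON nests each claim inside mainsnak/datavalue structures;
# here values are flattened to plain strings for illustration.
entity = {
    "id": "Q85124095",
    "labels": {"en": "RoBERTa"},
    "descriptions": {
        "en": "deep learning neural network for natural language processing"
    },
    "claims": {
        "P31": ["language model"],                # instance of
        "P279": ["Bidirectional Encoder Representations from Transformers"],
        "P973": ["https://pytorch.org/hub/pytorch_fairseq_roberta/"],
        "P407": ["English"],                      # language of work or name
        "P1343": ["RoBERTa: A Robustly Optimized BERT Pretraining Approach"],
    },
}

def claim_values(item: dict, prop: str) -> list:
    """Return the values recorded for a property, or [] if the item has none."""
    return item["claims"].get(prop, [])

print(claim_values(entity, "P31"))   # -> ['language model']
print(claim_values(entity, "P9999")) # absent property -> []
```

The live record can be retrieved in this JSON shape from `https://www.wikidata.org/wiki/Special:EntityData/Q85124095.json`, where the same property IDs appear under `claims`.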
Sitelinks
No sitelinks on any project (0 entries each for Wikipedia, Wikibooks, Wikinews, Wikiquote, Wikisource, Wikiversity, Wikivoyage, Wiktionary, and multilingual sites).