BERT

arXiv v2: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

BERT stands for Bidirectional Encoder Representations from Transformers. It is pre-trained on unlabeled text with a masked language modeling objective, which lets every token attend to both its left and right context.
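As an illustration, the masked language modeling corruption described in the BERT paper (15% of tokens selected; of those, 80% replaced with [MASK], 10% with a random token, 10% left unchanged) can be sketched in plain Python. The toy vocabulary and function name here are illustrative, not from the authors' code release:

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Apply BERT-style masked-LM corruption.

    Each token is selected with probability mask_prob; a selected token
    becomes [MASK] 80% of the time, a random vocabulary token 10% of the
    time, and stays unchanged 10% of the time. Returns the corrupted
    sequence and a dict mapping masked positions to their original tokens
    (the prediction targets).
    """
    rng = random.Random(seed)
    vocab = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary, illustrative only
    corrupted, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK_TOKEN)       # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))  # 10%: random token
            else:
                corrupted.append(tok)              # 10%: keep original
        else:
            corrupted.append(tok)
    return corrupted, targets
```

The model is then trained to predict the original tokens at the positions recorded in `targets`, using the full (bidirectional) context of the corrupted sequence.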


Submission history

From: Ming-Wei Chang

[v1] Thu, 11 Oct 2018 00:50:01 UTC (227 KB)

[v2] Fri, 24 May 2019 20:37:26 UTC (309 KB)