====== BERT ======

===== Introductions to BERT =====
  * Paper: [[https://arxiv.org/pdf/1810.04805.pdf|Devlin et al 2018 - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding]]
  * Blogs
  * Training from scratch
    * [[https://aclanthology.org/2021.emnlp-main.831.pdf|Izsak et al 2021 - How to Train BERT with an Academic Budget]]
  * Retrospective Analysis
    * [[https://arxiv.org/pdf/2306.02870.pdf|Nityasya et al 2023 - On “Scientific Debt” in NLP: A Case for More Rigour in Language Model Pre-Training Research]]
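The Devlin et al paper linked above pretrains BERT with a masked language modeling objective: roughly 15% of input positions are selected, and of those, 80% are replaced with ''[MASK]'', 10% with a random token, and 10% left unchanged. A minimal pure-Python sketch of that corruption scheme (the toy vocabulary, sentence, and ''mask_tokens'' helper are illustrative only; real BERT operates on WordPiece subwords):

```python
import random

# Hypothetical toy vocabulary; real BERT uses a ~30k WordPiece vocabulary.
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Apply BERT-style masked-LM corruption.

    Selects ~mask_prob of positions; of those, 80% become [MASK],
    10% a random vocabulary token, 10% stay unchanged. Returns
    (corrupted, labels), where labels hold the original token at
    selected positions and None elsewhere (loss is computed only
    at selected positions).
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # model must predict the original token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)               # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB))  # 10%: random token
            else:
                corrupted.append(tok)                # 10%: keep as-is
        else:
            labels.append(None)
            corrupted.append(tok)
    return corrupted, labels

sentence = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, labels = mask_tokens(sentence, seed=42)
print(corrupted)
print(labels)
```

The 10% random-token and 10% keep-as-is cases exist so the model cannot rely on ''[MASK]'' always marking the prediction targets, which would create a mismatch with fine-tuning inputs that never contain ''[MASK]''.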
  
===== Extensions =====
  * [[https://arxiv.org/pdf/1905.05950.pdf|Tenney et al 2019 - BERT Rediscovers the Classical NLP Pipeline]]
  * [[https://arxiv.org/pdf/2002.12327.pdf|Rogers et al 2020 - A Primer in BERTology: What we know about how BERT works]]
  * [[https://twitter.com/lvwerra/status/1485301457813487619?s=21|2022 - Visualization of position embeddings in BERT and GPT-2 (Twitter)]]
  * [[https://arxiv.org/pdf/2203.06204.pdf|Papadimitriou et al 2022 - When classifying grammatical role, BERT doesn’t care about word order... except when it matters]]
  
===== Applications =====
===== Other Variants =====
  * [[https://arxiv.org/pdf/1909.05840.pdf|Shen et al 2019 - Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT]]
  * [[https://arxiv.org/pdf/1910.01108.pdf|Sanh et al 2019 - DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter]]
  
===== Related Pages =====
nlp/bert_and_friends.1658961916.txt.gz · Last modified: 2023/06/15 07:36 (external edit)
