ml:meta-learning

Revisions by jmflanig: 2022/05/16 06:21 and 2023/11/09 19:44 (current)
===== Deep Learning Papers =====
  * [[https://arxiv.org/pdf/1606.04474.pdf|Andrychowicz et al 2016 - Learning to Learn by Gradient Descent by Gradient Descent]]
  * [[https://arxiv.org/pdf/1609.09106.pdf|Ha et al 2016 - HyperNetworks]]
  * [[https://openreview.net/pdf?id=rJY0-Kcll|Ravi & Larochelle 2016 - Optimization as a Model for Few-Shot Learning]] Proposes "an LSTM-based meta-learner model to learn the exact optimization algorithm used to train another learner neural network classifier in the few-shot regime."
  * [[http://proceedings.mlr.press/v70/chen17e/chen17e.pdf|Chen 2017 - Learning to Learn without Gradient Descent by Gradient Descent]] Learns a black-box (gradient-free) optimizer. Can be applied to hyperparameter tuning.
  * MAML: [[https://arxiv.org/pdf/1703.03400.pdf|Finn et al 2017 - Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks]]
  * [[http://proceedings.mlr.press/v97/finn19a/finn19a.pdf|Finn et al 2019 - Online Meta-Learning]] Introduces online meta-learning, where the learner sees a series of tasks. Proposes Follow The Meta-Leader (FTML), a follow-the-leader-type meta-learning algorithm, which extends MAML to this setting.
  * [[https://arxiv.org/pdf/2012.14905.pdf|Kirsch & Schmidhuber 2021 - Meta Learning Backpropagation And Improving It]]
  * [[https://arxiv.org/pdf/2201.04182.pdf|Zhmoginov et al 2022 - HyperTransformer: Model Generation for Supervised and Semi-Supervised Few-Shot Learning]] ([[https://www.youtube.com/watch?v=D6osiiEoV0w&ab_channel=YannicKilcher|interview]])
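The inner/outer-loop structure shared by MAML and its descendants (e.g. FTML above) can be sketched in a few lines. This is a minimal first-order MAML (FOMAML) sketch, not the algorithm from any single paper: the toy linear-regression task family, the single inner step, and the learning rates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    # Squared-error loss for a linear model y_hat = X @ w, plus its gradient.
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

def sample_task():
    # Toy task family: linear regression with random parameters.
    true_w = rng.normal(size=2)
    X = np.c_[rng.uniform(-1, 1, 10), np.ones(10)]
    y = X @ true_w + 0.01 * rng.normal(size=10)
    return X[:5], y[:5], X[5:], y[5:]  # support / query split

def fomaml(meta_steps=500, inner_lr=0.1, outer_lr=0.01):
    w = np.zeros(2)  # meta-parameters shared across tasks
    for _ in range(meta_steps):
        Xs, ys, Xq, yq = sample_task()
        _, g = loss_and_grad(w, Xs, ys)
        w_adapted = w - inner_lr * g       # inner loop: one task-specific step
        _, gq = loss_and_grad(w_adapted, Xq, yq)
        w = w - outer_lr * gq              # outer loop: first-order meta-update
    return w

w = fomaml()
```

Full MAML differentiates the query loss through the inner update (a second-order term); FOMAML drops that term and simply applies the query-set gradient at the adapted parameters, which is the shortcut taken here for brevity.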
===== Meta-Learning in NLP =====
  * [[https://arxiv.org/pdf/1911.03863.pdf|Bansal et al 2019 - Learning to Few-Shot Learn Across Diverse Natural Language Classification Tasks]]
  * [[https://aclanthology.org/2021.naacl-main.88.pdf|Murty et al 2021 - DReCa: A General Task Augmentation Strategy for Few-Shot Natural Language Inference]]
  * [[https://arxiv.org/pdf/2111.01322.pdf|Bansal et al 2021 - Diverse Distributions of Self-Supervised Tasks for Meta-Learning in NLP]]
  
===== Related Pages =====
  * [[application_optimization|Application: Optimization]]
  * [[Multi-Task Learning]]
  * [[Neural Architecture Search]]
  * [[nlp:prompting|Prompting and In-Context Learning]]
  
ml/meta-learning · Last modified: 2023/06/15 07:36 (external edit)
