Table of Contents
Meta-Learning
Overviews
AutoML
Deep Learning Papers
Non-Deep Learning Papers
Meta-Learning in NLP
Related Pages
Meta-Learning
Overviews
Hospedales et al 2020 - Meta-Learning in Neural Networks: A Survey
AutoML
2019 - AutoML: A Survey of the State-of-the-Art
2019 - Benchmark and Survey of Automated Machine Learning Frameworks
2020 - Automated Machine Learning: The New Wave of Machine Learning
Deep Learning Papers
Hochreiter & Younger 2001 - Learning to Learn Using Gradient Descent
Amazing paper from one of the inventors of LSTMs. Jürgen Schmidhuber talks about it here.
Andrychowicz et al 2016 - Learning to Learn by Gradient Descent by Gradient Descent
Ha et al 2016 - HyperNetworks
Ravi & Larochelle 2016 - Optimization as a Model for Few-Shot Learning
Proposes “an LSTM-based meta-learner model to learn the exact optimization algorithm used to train another learner neural network classifier in the few-shot regime.”
Chen 2017 - Learning to Learn without Gradient Descent by Gradient Descent
Learns a black-box (gradient-free) optimizer, which can be applied to hyperparameter tuning.
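To make the black-box setting concrete: the optimizer may only query objective values, never gradients. A minimal sketch below uses plain random search as a stand-in for the paper's learned RNN optimizer (the objective, bounds, and budget are illustrative assumptions, not from the paper):

```python
# Black-box (gradient-free) optimization sketch: the optimizer only sees
# f(x) evaluations, never gradients. Random search stands in here for the
# learned optimizer of the paper.
import random

random.seed(1)

def random_search(f, lo, hi, budget=200):
    """Return the best (x, f(x)) found by uniformly sampling the interval."""
    best_x, best_y = None, float("inf")
    for _ in range(budget):
        x = random.uniform(lo, hi)  # propose a query point
        y = f(x)                    # black-box evaluation only
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Toy "hyperparameter tuning": minimize a 1-D validation-loss surrogate.
x, y = random_search(lambda x: (x - 0.3) ** 2, 0.0, 1.0)
```

This is the interface a learned gradient-free optimizer would replace: instead of sampling uniformly, an RNN proposes each next query conditioned on the history of (x, f(x)) pairs.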
MAML:
Finn et al 2017 - Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Finn et al 2019 - Online Meta-Learning
Introduces online meta-learning, where the learner sees a sequence of tasks. Proposes Follow The Meta-Leader (FTML), a follow-the-leader-style meta-learning algorithm that extends MAML to this setting.
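The core MAML update the papers above build on is a two-level gradient step: adapt to a task with an inner gradient step, then update the meta-parameters against post-adaptation loss. A minimal sketch, using the first-order approximation (FOMAML) and a toy family of 1-D linear-regression tasks (the tasks, losses, and step sizes are illustrative assumptions):

```python
# First-order MAML (FOMAML) sketch. Each task t is "fit y = s_t * x";
# the model is y = w * x with squared-error loss. Full MAML would also
# differentiate through the inner step; FOMAML drops that second-order term.
import random

random.seed(0)

def grad(w, s, xs):
    # d/dw of mean squared error between w*x and s*x over sample points xs
    return sum(2.0 * (w - s) * x * x for x in xs) / len(xs)

def fomaml(task_slopes, steps=200, alpha=0.1, beta=0.05):
    w = 0.0  # meta-parameters
    for _ in range(steps):
        for s in task_slopes:
            xs = [random.uniform(-1.0, 1.0) for _ in range(10)]
            w_adapted = w - alpha * grad(w, s, xs)  # inner-loop adaptation
            w -= beta * grad(w_adapted, s, xs)      # first-order outer update
    return w

# With tasks s=1 and s=3, the meta-parameters settle between the task optima,
# a point from which one inner gradient step adapts quickly to either task.
w0 = fomaml([1.0, 3.0])
```

Online meta-learning (FTML) changes the outer loop: instead of sampling from a fixed task distribution, tasks arrive sequentially and the meta-update follows the leader over all tasks seen so far.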
Kirsch & Schmidhuber 2021 - Meta Learning Backpropagation And Improving It
Zhmoginov et al 2022 - HyperTransformer: Model Generation for Supervised and Semi-Supervised Few-Shot Learning (interview)
Non-Deep Learning Papers
Real et al 2020 - AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
Peng et al 2021 - PyGlove: Symbolic Programming for Automated Machine Learning
Meta-Learning in NLP
Bansal et al 2019 - Learning to Few-Shot Learn Across Diverse Natural Language Classification Tasks
Murty et al 2021 - DReCa: A General Task Augmentation Strategy for Few-Shot Natural Language Inference
Bansal et al 2021 - Diverse Distributions of Self-Supervised Tasks for Meta-Learning in NLP
Related Pages
Application: Optimization
Multi-Task Learning
Neural Architecture Search
Prompting and In-Context Learning