====== Multi-Task Learning ======

===== Papers =====

==== In NLP ====

  * [[https://arxiv.org/pdf/1706.05137.pdf|Kaiser et al. 2017 - One Model To Learn Them All]] Almost the same authors as the Transformer (submitted four days after the Transformer). Interestingly, they did not use the Transformer; perhaps the Transformer drew inspiration from this work.
  * DecaNLP: [[https://arxiv.org/pdf/1806.08730.pdf|McCann et al. 2018 - The Natural Language Decathlon: Multitask Learning as Question Answering]] [[https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/slides/cs224n-2019-lecture17-multitask.pdf|slides]] [[https://decanlp.com/|dataset]] Casts ten different NLP tasks as question answering over a context, so a single model can be trained on all of them.

===== Datasets =====

  * NLP
    * [[https://decanlp.com/|DecaNLP]] [[https://arxiv.org/pdf/1806.08730.pdf|paper]]

===== Related Pages =====

  * [[Meta-Learning]]
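The DecaNLP idea above can be sketched in a few lines: every task's example is recast as a (question, context, answer) triple, so one question-answering model covers all tasks. This is a minimal sketch of that framing, not code from the DecaNLP repository, and the question templates here are illustrative paraphrases rather than the paper's exact wording.

```python
def as_qa_example(task, context, label):
    """Cast a task-specific example into a DecaNLP-style QA triple.

    The task names and question templates are illustrative assumptions;
    the paper associates each task with a natural-language question.
    """
    questions = {
        "summarization": "What is the summary?",
        "sentiment": "Is this review negative or positive?",
        "translation_en_de": "What is the translation from English to German?",
    }
    return {"question": questions[task], "context": context, "answer": label}

# A sentiment example and a summarization example now share one format,
# so a single QA model can consume both.
ex1 = as_qa_example("sentiment", "A moving, beautifully shot film.", "positive")
ex2 = as_qa_example("summarization", "Long article text ...", "Short summary.")
```

With every task in this shape, multi-task training reduces to shuffling all the triples into one training stream.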