====== Supertasks ======

Richard Socher may have introduced this terminology; see this [[https://twitter.com/RichardSocher/status/1266500344055394304|tweet]] and these [[https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/slides/cs224n-2019-lecture17-multitask.pdf|slides]].

===== Papers =====

  * [[https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf|Radford et al. 2019 - Language Models are Unsupervised Multitask Learners]]
  * [[https://arxiv.org/pdf/1806.08730.pdf|McCann et al. 2018 - The Natural Language Decathlon: Multitask Learning as Question Answering]] ([[https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/slides/cs224n-2019-lecture17-multitask.pdf|slides]]; discusses supertasks)

===== Blogs, etc. =====

  * Richard Socher's [[https://twitter.com/RichardSocher/status/1266500344055394304|tweet]] regarding GPT-3: "There are 3 equivalent super tasks of NLP: Language models, dialogue systems and question answering. LMs have the most training data->win."

{{media:supertasks-2.png}}

===== Related Pages =====

  * [[Dialog]]
  * [[Language Model]]
  * [[Question Answering]]