

Open Problems

A partial list of open problems in NLP.

Machine Translation

Pre-training

  • Are there simpler, faster methods for learning contextualized representations than pre-training Transformers? Historically, complex methods have often been invented before researchers found simpler ways to achieve similar results. For example, early methods for pre-training word embeddings did not scale well until Tomas Mikolov asked “Is there a more efficient way to do this?” and invented the skip-gram model (Mikolov 2013, Efficient Estimation of Word Representations in Vector Space, and follow-up work). It is an open question whether we are in a similar situation with Transformer models and contextualized pre-training today.
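To make the skip-gram reference concrete, here is a minimal NumPy sketch of the skip-gram objective with negative sampling, the efficiency trick behind word2vec. The toy corpus, embedding size, window, learning rate, and negative-sample count below are illustrative assumptions, not values from this page.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative assumption, not from the page).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
ids = [word2id[w] for w in corpus]

V, D, window = len(vocab), 8, 2  # vocab size, embedding dim, context window
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # "input" (target) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # "output" (context) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pair(center, context, lr=0.05, k=3):
    """One SGD step on a (center, context) pair with k negative samples.

    Instead of a softmax over the whole vocabulary, each pair is scored
    against the true context word (label 1) and k random words (label 0).
    """
    negatives = rng.integers(0, V, size=k)
    v = W_in[center]
    for c, label in [(context, 1.0)] + [(int(n), 0.0) for n in negatives]:
        out = W_out[c].copy()
        grad = sigmoid(v @ out) - label  # d(loss)/d(score)
        W_out[c] -= lr * grad * v
        W_in[center] -= lr * grad * out

# Train: every (center, context) pair within the window.
for epoch in range(50):
    for i, center in enumerate(ids):
        for j in range(max(0, i - window), min(len(ids), i + window + 1)):
            if j != i:
                train_pair(center, ids[j])
```

The point of the sketch is the objective, not the scale: negative sampling turns a V-way softmax into a handful of binary classifications per pair, which is what made pre-training word embeddings cheap enough to run on very large corpora.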
nlp/open_problems.1610400353.txt.gz · Last modified: 2023/06/15 07:36 (external edit)
