23. Text models
• Transformer
• Universal-sentence-encoder-multilingual-qa (supports 16 languages)
• BERT (Bidirectional Encoder Representations from Transformers)
• DAN (Deep Averaging Network)
• Universal-sentence-encoder
• ELMo (Embeddings from Language Models), introduced in "Deep Contextualized Word Representations"
• NNLM (feed-forward neural-net language model, trained on Google News 30B)
• Word2vec (trained on the English Wikipedia corpus)
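All of the models above map text to fixed-length embedding vectors, and downstream tasks usually compare those vectors with cosine similarity. A minimal sketch with toy vectors (the arrays and the 4-dimensional size are invented for illustration; real encoders such as the Universal Sentence Encoder emit much larger vectors, e.g. 512 dimensions):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy "embeddings" standing in for real model output.
emb_cat = np.array([0.9, 0.1, 0.3, 0.0])
emb_kitten = np.array([0.8, 0.2, 0.4, 0.1])  # semantically close to emb_cat
emb_car = np.array([0.0, 0.9, 0.1, 0.8])     # semantically distant

print(cosine_similarity(emb_cat, emb_kitten))  # near 1.0: related
print(cosine_similarity(emb_cat, emb_car))     # much lower: unrelated
```

The same comparison works regardless of which encoder produced the vectors, which is why these models are largely interchangeable for retrieval and semantic-similarity tasks.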