BERT (Bidirectional Encoder Representations from Transformers) is a new language representation model. BERT is conceptually simple and empirically powerful, obtaining new state-of-the-art results on eleven natural language processing tasks.