How can computer systems attain knowledge to assist humans?
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A new language representation model called BERT: Bidirectional Encoder Representations from Transformers. BERT is conceptually simple and empirically powerful, obtaining new state-of-the-art results on eleven natural language processing tasks.
AI Superpowers
On how China caught AI fever and set government goals (with benchmarks) for 2020 and 2025 in an attempt to become the world center of AI innovation by 2030.