NLP Gets A Surprise Addition As XLNet Outperforms BERT

Bidirectional Encoder Representations from Transformers, or BERT, which was open sourced late last year, offered new ground for tackling the intricacies of language modelling. BERT uses WordPiece embeddings with a 30,000-token vocabulary and learned positional…

ERNIE Gets What BERT Doesn’t – Making AI Smarter With Knowledge Graphs

Natural Language Processing has garnered great attention of late for two reasons — there is so much room for improvement, and any success is immensely rewarding. Neural networks, which are widely tasked with NLU, usually process language by…

ML Ecosystem Gets Mature With The Release Of PyTorch Hub

The rate at which machine learning enhancements get published has increased over the past couple of years. Significant models like BERT for NLP tasks are difficult to reproduce. While many of these publications are accompanied by code…

Google’s Move To Open Source BERT May Change NLP Forever

In 1954, with the success of the Georgetown experiment, in which scientists used a machine to translate random sentences from Russian to English, the field of computational linguistics took giant strides towards building an intelligent machine capable of recognising…
