How Language Models Can Be Used In Real-Time Use Cases

Recent advancements in natural language processing (NLP) have scaled new heights over the past few years. Pre-trained high-capacity language models such as ELMo and BERT have gained popularity in NLP. Language modelling has been implemented in a number of applications…

Baidu’s ERNIE 2.0 Gets NLP Top Honours, Eclipses BERT & XLNet

Machine learning models deployed for vision and natural language processing (NLP) tasks usually have more than a billion parameters. This allows for better results, as the model generalises over a wide range of parameters. Pre-trained language…

How Good Is BERT For Filling The Gap Between Accuracy Scores & Language Comprehension?

BERT has set a new benchmark for NLP tasks, and this has been well documented over the past six months. Bidirectional Encoder Representations from Transformers, or BERT, which was open sourced last year, offered a new ground to embattle…

How Can Memory Augmentation Work Wonders For Large Scale NLP Tasks

Current machine learning models that are deployed for vision and natural language processing (NLP) tasks have more than a billion parameters. This allows for better results, as the model generalizes over a wide range of parameters. But there is…

NLP Gets A Surprise Addition As XLNet Outperforms BERT

Bidirectional Encoder Representations from Transformers, or BERT, which was open sourced late last year, offered a new ground to embattle the intricacies involved in understanding language models. BERT uses WordPiece embeddings with a 30,000 token vocabulary and learned positional…
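The WordPiece subword scheme mentioned above can be sketched with a greedy longest-match splitter. This is a minimal illustration with a tiny toy vocabulary, not BERT's actual 30,000-token tokenizer; the vocabulary and function name here are assumptions for demonstration.

```python
# Minimal sketch of WordPiece-style greedy longest-match tokenization.
# TOY_VOCAB is an illustrative stand-in for BERT's real 30,000-token vocabulary.
TOY_VOCAB = {"un", "##aff", "##able", "play", "##ing", "the", "[UNK]"}

def wordpiece_tokenize(word, vocab=TOY_VOCAB):
    """Greedily split a single word into the longest known subword pieces."""
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        match = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation pieces carry a ## prefix
            if piece in vocab:
                match = piece
                break
            end -= 1  # shrink the candidate until it is in the vocabulary
        if match is None:
            return ["[UNK]"]  # no known subword covers this position
        tokens.append(match)
        start = end
    return tokens

print(wordpiece_tokenize("unaffable"))  # ['un', '##aff', '##able']
print(wordpiece_tokenize("playing"))    # ['play', '##ing']
```

Rare words thus decompose into frequent subwords, which is how BERT keeps its vocabulary at 30,000 tokens while still covering open-ended text.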

ERNIE Gets What BERT Doesn’t – Making AI Smarter With Knowledge Graphs

Natural Language Processing has garnered great attention of late for two reasons: there is so much room for improvement, and any success is immensely rewarding. Neural networks, which are widely tasked with NLU, usually process language by…

ML Ecosystem Gets Mature With The Release Of PyTorch Hub

The rate at which machine learning enhancements get published has increased over the past couple of years. Significant models like BERT for NLP tasks are difficult to reproduce. While many of these publications are accompanied by code…

Google’s Move To Open Source BERT May Change NLP Forever

In 1954, with the success of the Georgetown experiment in which the scientists used a machine to translate random sentences from Russian to English, the field of computational linguistics took giant strides towards building an intelligent machine capable of recognising…
