
Why The Future Of NLP Is In The Hands Of Tech Giants Like Google, Microsoft & Amazon


In 1954, with the success of the Georgetown experiment, in which scientists used a machine to translate a set of sentences from Russian to English, the field of computational linguistics took giant strides towards building an intelligent machine capable of recognising and translating speech. IBM, which supplied the computer behind the experiment, had earlier provided the simultaneous-interpretation equipment used during the Nuremberg trials.

Nonetheless, machine translation progressed nowhere near as quickly as forecast, held back by sluggish computational devices and a scarcity of training data.

Today, after six decades, the field has transitioned from rule-based and statistical models to neural models that can perform complicated tasks like speech recognition and sentiment analysis with high accuracy.

Today, machines with NLP can analyse a wide variety of data (documents, text, voice) and have a range of practical uses, from sentiment analysis to text categorisation. NLP can scour documents and classify them by topic, even without a programmer defining in advance which topics to look for.
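To illustrate that last point, here is a minimal, hypothetical sketch of unsupervised topic discovery using scikit-learn's latent Dirichlet allocation; the library choice and toy documents are our own illustration, not tied to any vendor discussed below:

# Illustrative sketch: discovering topics with no topics defined in advance.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "The central bank raised interest rates to curb inflation.",
    "The striker scored twice as the team won the league title.",
    "Bond yields climbed after the latest inflation report.",
    "The goalkeeper saved a penalty in the final match.",
]

# Turn raw text into word counts, then fit a two-topic LDA model.
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Each row is a document's mixture over the two discovered topics.
print(lda.transform(counts).round(2))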

The field of NLP is innovating every other day thanks to the constant efforts of tech giants like Google, Microsoft, Facebook and Amazon. One thing common to these tech giants is their willingness to open-source their innovations. Their belief in accelerating innovation through transparency has started to bear fruit in the form of diverse real-world applications, from smart speakers to chatbots.

Here’s a look at the NLP roadmaps of these mega-firms:

Google

Open-domain question answering (QA) is a benchmark task in natural language understanding (NLU).

Google’s AI researchers recently released a paper introducing Natural Questions (NQ), a new dataset for QA research, along with methods for QA system evaluation.

In contrast to tasks where naturally occurring examples are easy to gather, defining a suitable QA task and developing a methodology for annotation and evaluation are challenging.

There is also Bidirectional Encoder Representations from Transformers, or BERT, open-sourced last year, which offers new ground for tackling the intricacies involved in understanding language.

Pre-training a deep bidirectional model on unlabelled text helps with common NLP tasks like question answering and natural language inference.
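As a hedged illustration of how such a pre-trained model is consumed downstream, here is a minimal sketch using the open-source Hugging Face transformers library (an independent package, not part of Google's BERT release):

# Load publicly released pre-trained BERT weights and encode a sentence.
from transformers import BertTokenizer, BertModel
import torch

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Fine-tuning for QA or inference starts from these contextual
# token representations.
inputs = tokenizer("Who founded Google?", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (1, number_of_tokens, 768)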

Along with innovation in its own backyard, Google is also backing NLP startups like Armorblox, a cybersecurity startup that aims to tackle data leaks caused by online attacks such as email spear phishing.

Armorblox developed an NLP engine that derives insights from enterprise communications and data. It offers policy recommendations by learning over time what’s mission-critical for a given organization, and in the event of a potential or attempted breach, it automatically sends alerts to the relevant people and teams.

By applying NLU, Armorblox is able to address a whole new layer of security that has been inaccessible to other security solutions: the content and context of communications. This has been the biggest challenge and attack vector because hackers know that they can exploit this weakness.

Facebook

Engineers developing NLP algorithms often turn to deep-learning systems to build their solutions, such as Facebook’s PyTorch platform.

Facebook AI Research is open-sourcing PyText, a natural-language-processing (NLP) modelling framework that is used in the Portal video-calling device and M Suggestions in Facebook Messenger.

PyText builds on top of PyTorch by providing a set of interfaces and models specifically tuned for NLP.

PyText addresses a common problem for NLP projects: the tradeoff between rapid experimentation and scalability in production. Researchers experiment with new ideas, rapidly tweaking models to achieve performance goals.

PyText can utilize multiple GPUs for distributed training and can train multiple models at once, reducing the overall training time. The PyText code also comes with pre-trained models for several common NLP tasks, including text classification, named-entity recognition, and joint intent-determination and slot-filling, which is a staple of chatbot development.
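To make the text-classification case concrete, here is a minimal sketch of the kind of PyTorch model PyText wraps behind its interfaces; this is plain PyTorch, not PyText's actual API:

# A tiny bag-of-embeddings text classifier in PyTorch.
import torch
import torch.nn as nn

class BagOfEmbeddingsClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim, num_classes):
        super().__init__()
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim)  # mean-pools tokens by default
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids, offsets):
        return self.fc(self.embedding(token_ids, offsets))

model = BagOfEmbeddingsClassifier(vocab_size=10000, embed_dim=64, num_classes=2)
tokens = torch.tensor([1, 42, 7, 3, 9])  # two examples, concatenated
offsets = torch.tensor([0, 3])           # where each example starts
print(model(tokens, offsets).shape)      # torch.Size([2, 2]): class scores per example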

Microsoft

Microsoft’s NLP group focuses on developing efficient algorithms to process text and make its information accessible to computer applications.

This group addresses natural language problems using a mix of knowledge-engineered and statistical/machine-learning techniques to disambiguate and respond to natural language input.

For example, the grammar checkers in Microsoft Office for English, French, German, and Spanish are byproducts of Microsoft’s NLP enhancements.

  • Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text and includes four main functions: sentiment analysis, key phrase extraction, language detection, and entity linking.
  • Language Understanding (LUIS): a machine learning-based service to build natural language into apps, bots, and IoT devices, letting developers quickly create enterprise-ready, custom models that continuously improve.

Designed to identify valuable information in conversations, LUIS interprets user goals (intents) and distils valuable information from sentences (entities) for a high-quality, nuanced language model. LUIS integrates seamlessly with the Azure Bot Service, making it easy to create a sophisticated bot.

Under the hood of LUIS:

{
  "query": "Book me a flight to Cairo",
  "topScoringIntent": {
    "intent": "BookFlight",
    "score": 0.9887482
  },
  "intents": [
    {
      "intent": "BookFlight",
      "score": 0.9887482
    }
  ]
}
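For context, here is a hedged sketch of how an application might fetch such a response over the LUIS REST endpoint; the app ID, key, and region are placeholders, and the v2 URL shape is our assumption about the then-current API:

# Query a LUIS app and read back the top-scoring intent.
import requests

APP_ID = "<luis-app-id>"        # placeholder
KEY = "<subscription-key>"      # placeholder
URL = f"https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/{APP_ID}"

resp = requests.get(URL, params={"subscription-key": KEY,
                                 "q": "Book me a flight to Cairo"})
result = resp.json()
print(result["topScoringIntent"])  # e.g. {"intent": "BookFlight", "score": 0.98...}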

Microsoft also presented a novel, fully data-driven, knowledge-grounded neural conversation model last year, aimed at producing more contentful responses.

Amazon

Amazon Comprehend is a natural language processing (NLP) service that uses machine learning to find insights and relationships in text; no machine learning experience is required to use it.

The service identifies the language of the text; extracts key phrases, places, people, brands, or events; understands how positive or negative the text is; analyzes text using tokenization and parts of speech; and automatically organizes a collection of text files by topic.
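As a minimal sketch of those capabilities (assuming configured AWS credentials and the boto3 SDK; the sample text is invented):

# Each call below mirrors one of the capabilities listed above.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")
text = "Amazon opened a new office in Seattle, and customers seem pleased."

print(comprehend.detect_dominant_language(Text=text)["Languages"])
print(comprehend.detect_sentiment(Text=text, LanguageCode="en")["Sentiment"])
print(comprehend.detect_key_phrases(Text=text, LanguageCode="en")["KeyPhrases"])
print(comprehend.detect_entities(Text=text, LanguageCode="en")["Entities"])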

Amazon Comprehend Medical, a variant of Comprehend, identifies relationships among extracted medical information such as medications, tests, treatments, and procedures for easier analysis. For example, the service identifies a particular dosage, strength, and frequency related to a specific medication from unstructured clinical notes.

Like its contemporaries, Amazon too has backed NLP startups. US-based mobile marketing firm Vibes has launched a tool called Conversational Analytics, which uses natural language processing (NLP) to extract consumer insights from unstructured messaging content.

Powered by Amazon Comprehend, this new tool uses machine learning to find insights and relationships in the messaging content exchanged between brands and consumers, and provide a more comprehensive understanding of the purchasing journey.

The Potency Of NLP

The potential for governments to employ these advancements in language processing defies limitation. Governments collect massive volumes of unstructured records. Fragmented information, presented in deeply non-mathematical formats at volumes far too large for human assessment, is now available for the kind of deep-pattern analysis originally reserved for numerical databases. NLP can open the public sector to new insights, better-tailored services, and faster responses to information.

According to an online financial portal, the artificial intelligence in supply chain market alone is expected to reach USD 10,110.2 million by 2025 from USD 730.6 million in 2018, at a CAGR of 45.55% during the forecast period.
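(As a quick sanity check on those figures: USD 730.6 million compounding at 45.55% a year for seven years gives 730.6 × 1.4555^7 ≈ USD 10,110 million, which matches the 2025 estimate.)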

Market growth can be attributed to the increasing adoption of deep learning and NLP technologies for automotive, retail, and manufacturing applications in APAC.

Some of the key players in this space are Intel (US), NVIDIA (US), Xilinx (US), Samsung Electronics (South Korea), Micron Technology (US), IBM (US), Microsoft (US), and Amazon (US).

The biggest challenge for NLP models, however, has been the lack of training data.

Small training sets restrict many NLP models from performing reliably, in real time, on tasks that require both contextual and context-free understanding.

The next big challenge for these NLP models is to reach a human-level understanding of language, a pursuit that dates back to the times of Leibniz and Descartes.

PS: The story was written using a keyboard.

Ram Sagar

I have a master's degree in Robotics and I write about machine learning advancements.