Facebook is at the forefront of general machine learning and deep learning research. The company has covered a lot of ground in tackling fake news and filtering offensive content through machine learning and artificial intelligence, and has been focusing on building products around recommendation engines and text and image analysis. With deep learning pioneer Yann LeCun at the helm of FAIR, the company has done substantial work in computer vision and core machine learning, including state-of-the-art text understanding algorithms. These algorithms are integrated into Facebook's ML platform to facilitate and scale machine learning from training to model deployment.
FBLearner Flow: FBLearner Flow has been dubbed the backbone of Facebook's AI. The platform can reuse algorithms across different products, scale to run thousands of simultaneous custom experiments, and manage them with ease. It also provides functionality such as automatic generation of user interface experiences from pipeline definitions and automatic parallelisation of Python code using futures. FBLearner Flow is used by more than 25 percent of Facebook's engineering team. Since it was first developed, more than a million models have been trained on it, and its prediction service has grown to make more than six million predictions per second. A key highlight of the platform is that it reduces time spent on feature engineering and allows every engineer to run a large number of experiments: Flow runs trials of some 300,000 machine learning models every month.
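To make the futures idea concrete, here is a minimal sketch of how a workflow platform can parallelise independent pipeline steps. This is a hypothetical illustration using Python's standard library, not FBLearner Flow's actual (internal) API; the step functions are placeholders.

```python
# Illustrative sketch (hypothetical API): parallelising independent
# pipeline steps with futures, the idea behind FBLearner Flow's
# automatic parallelisation of Python code.
from concurrent.futures import ThreadPoolExecutor

def extract_features(dataset):
    # Placeholder step: pretend each record yields one feature value.
    return [len(str(record)) for record in dataset]

def train_model(features):
    # Placeholder "training": compute a single summary statistic.
    return sum(features) / len(features)

def run_pipeline(datasets):
    # Independent datasets are processed concurrently; submit() returns
    # a future immediately, and downstream steps only block when the
    # result is actually needed.
    with ThreadPoolExecutor() as pool:
        feature_futures = [pool.submit(extract_features, d) for d in datasets]
        models = [train_model(f.result()) for f in feature_futures]
    return models

models = run_pipeline([["a", "bb"], ["ccc", "dddd"]])
print(models)  # one trained "model" per dataset
```

Because each step's output is a future, the scheduler can discover the dependency graph from the pipeline definition itself and run non-dependent steps in parallel without the engineer writing any threading code.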
Building Perception: Building on convolutional neural networks, an approach pioneered by LeCun, the neural networks behind Building Perception are trained to better understand a range of data: photos, videos and even voices. Facebook has demonstrated a system that can identify attributes in a photo scene and can segment and label the objects in it.
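The core operation of a convolutional network can be shown in a few lines: a small filter slides over an image and responds where a local pattern appears. This toy sketch (pure Python, no framework, made-up data) uses a classic vertical-edge kernel; real networks learn thousands of such filters from data.

```python
# Minimal sketch of the convolution operation at the heart of CNNs:
# slide a small kernel over an image, computing a weighted sum at
# each position. Here the kernel is a hand-made vertical-edge detector.

def convolve2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = 0.0
            for ki in range(kh):
                for kj in range(kw):
                    s += image[i + ki][j + kj] * kernel[ki][kj]
            row.append(s)
        out.append(row)
    return out

# A 4x4 "image" with a vertical edge between columns 1 and 2.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# Responds strongly where the left and right halves of the window differ.
kernel = [
    [-1, 1],
    [-1, 1],
]
response = convolve2d(image, kernel)
print(response)  # peaks along the edge, zero elsewhere
```

Stacking many such filter layers, with learned rather than hand-made kernels, is what lets these networks label objects and segment scenes.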
Facial Recognition with DeepFace: Facebook's DeepFace application can recognise people in photos with a startling 97 percent accuracy. According to Facebook research, it was trained on a labelled dataset of four million facial images belonging to more than 4,000 identities, one of the largest facial datasets assembled at the time. The application reached an accuracy of 97.35 percent on the Labelled Faces in the Wild (LFW) benchmark, reducing the error of the previous state of the art by more than 27 percent and closely approaching human-level performance. The method revisited the conventional pipeline of detect => align => represent => classify, leveraging explicit 3D face modelling to derive its face representation.
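The "represent" and "classify" stages reduce to a simple idea: each aligned face is mapped to a fixed-length descriptor, and two photos are judged to show the same person when their descriptors are close. The sketch below illustrates that final comparison step with made-up vectors; in a real DeepFace-style system the descriptors come from a deep network, and the threshold is tuned on validation data.

```python
# Hedged sketch: face verification as a distance test between face
# descriptors. The vectors and the 0.8 threshold are invented for
# illustration; a real system learns both from data.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def same_person(desc1, desc2, threshold=0.8):
    # Verification reduces to thresholding a similarity score.
    return cosine_similarity(desc1, desc2) >= threshold

alice_photo1 = [0.9, 0.1, 0.4]    # hypothetical face descriptors
alice_photo2 = [0.85, 0.15, 0.38]
bob_photo = [0.1, 0.9, 0.2]

print(same_person(alice_photo1, alice_photo2))  # True
print(same_person(alice_photo1, bob_photo))     # False
```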
Facebook’s Text Understanding Engine DeepText: From general classification to determining a post’s purpose and recognising entities, Facebook’s DeepText builds on deep learning research to tackle language challenges that cannot be solved with traditional NLP techniques. DeepText uses word embeddings, a mathematical representation that captures the semantic relationships between words and is useful for building language-agnostic models. One of DeepText’s biggest advantages is the automatic removal of spam. The technology is being used to better understand users’ sentiments, and it has improved text understanding across other Facebook experiences.
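Word embeddings can be illustrated in a few lines: words become vectors, and semantic relationships become geometric ones. The 3-dimensional vectors below are hand-made for the example; real embeddings are learned from large text corpora and typically have hundreds of dimensions.

```python
# Toy illustration of word embeddings, the representation DeepText
# relies on. The vectors are invented so that the classic analogy
# "king - man + woman ~ queen" holds geometrically.
import math

embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
}

def nearest(vec, vocab):
    # Return the vocabulary word whose embedding is closest in angle.
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) *
                      math.sqrt(sum(x * x for x in b)))
    return max(vocab, key=lambda w: cos(vec, vocab[w]))

query = [k - m + w for k, m, w in zip(embeddings["king"],
                                      embeddings["man"],
                                      embeddings["woman"])]
print(nearest(query, embeddings))  # "queen"
```

Because the geometry, not the spelling, carries the meaning, models built on such vectors can transfer across languages, which is what makes the approach language-agnostic.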
Open Sourced its AI Hardware Design: In keeping with open source principles, the company open sourced its AI hardware design in 2015, pegged in a Facebook post as among the best in the world. The Open Rack-compatible hardware for AI computing, known as Big Sur, was built in collaboration with partners and can train neural networks twice as large as its predecessors could. Facebook also open sourced its deep learning modules for Torch, the framework it used for developing neural networks.
Bernard Marr, a well-known AI influencer and big data expert, observed in a post that deep learning will continue to play a key role at Facebook because the company is keen to explore areas like generating audio descriptions of photos to aid the visually impaired. In the future, Facebook is planning several applications for underrepresented or poor countries. Last year, Facebook opened a new research lab in Montreal and committed $7 million in funding to the University of Montreal, McGill University and the Montreal Institute for Learning Algorithms. Facebook already runs labs in Paris, New York and Menlo Park. As part of its commitment to advancing research in artificial intelligence, Facebook joined hands with Microsoft to launch the Open Neural Network Exchange (ONNX), described as a step towards an open ecosystem that helps developers choose the right tool for their AI project. Essentially, the framework defines an extensible computation graph model, as well as definitions of built-in operators and standard data types.
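The phrase "extensible computation graph model" can be unpacked with a small sketch: a model is just a graph of named tensors connected by operator nodes drawn from a shared catalogue. The classes and operator names below are illustrative only; ONNX's real format is a protobuf schema with a standardised operator set.

```python
# Illustrative sketch of a computation graph model in the ONNX spirit:
# a graph is a list of operator nodes wired together by named values.
# Any framework that understands the same operator catalogue could,
# in principle, execute or export such a graph.
from dataclasses import dataclass, field

@dataclass
class Node:
    op_type: str   # a built-in operator name, e.g. "MatMul", "Relu"
    inputs: list   # names of input values
    outputs: list  # names of output values

@dataclass
class Graph:
    inputs: list
    outputs: list
    nodes: list = field(default_factory=list)

    def add(self, op_type, inputs, outputs):
        self.nodes.append(Node(op_type, inputs, outputs))
        return self

# y = Relu(x @ W): two standard operators wired into a tiny graph.
g = Graph(inputs=["x", "W"], outputs=["y"])
g.add("MatMul", ["x", "W"], ["h"]).add("Relu", ["h"], ["y"])
print([n.op_type for n in g.nodes])  # ['MatMul', 'Relu']
```

Separating the graph description from any one framework's runtime is what lets a model trained in one tool be deployed with another, which is the interoperability goal ONNX was launched to serve.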