After powering Google’s own machine learning and AI development, the company’s custom-built Tensor Processing Unit (TPU) will now be available to third-party users through its cloud computing service.
Google announced that its proprietary Cloud TPU is available in beta, in limited quantities, on the Google Cloud Platform (GCP). The chip, which Google unveiled in 2016, was designed specifically to accelerate the training of machine learning models.
Cloud TPUs accelerate machine learning workloads built with TensorFlow. Each Cloud TPU delivers around 180 teraflops of computing power along with 64 gigabytes of ultra-high-bandwidth memory. Google also provides reference models optimized for TPUs, including ResNet-50 and RetinaNet for image classification and object detection, and Transformer, Google’s neural network architecture for language understanding. Cloud TPUs are priced at $6.50 per TPU per hour.
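To give a sense of what running a TensorFlow workload on a Cloud TPU involves, here is a minimal sketch using TensorFlow’s distribution API. The `TPU_NAME` environment variable and the two-layer model are illustrative assumptions, not part of Google’s announcement; on a machine without a TPU the code simply falls back to the default strategy.

```python
import os
import tensorflow as tf

# Hypothetical setup: if a TPU name is provided via the environment,
# connect to it and build a TPU distribution strategy; otherwise fall
# back to the default (CPU/GPU) strategy so the sketch runs anywhere.
if "TPU_NAME" in os.environ:
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
        tpu=os.environ["TPU_NAME"])
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)
else:
    strategy = tf.distribute.get_strategy()

# Model variables created inside the strategy scope are placed on the
# accelerator the strategy targets (TPU cores, if available).
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(10, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
```

From here, an ordinary `model.fit(...)` call trains on whichever device the strategy resolved to; the model code itself does not change between CPU, GPU, and TPU.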
This may serve as an opportunity for Google to build a business around its cloud computing services and compete with the likes of Amazon Web Services (AWS). Google is one of the frontrunners in the trend of developing chips built exclusively for artificial intelligence, but it is not the only player in this space.
Apple already has its own chip, the A11 Bionic, used in the iPhone X. Amazon was recently reported to be developing an AI chip to boost the performance of its virtual assistant Alexa in devices such as the Echo. Samsung has also reportedly nearly finished developing its own neural processing units (NPUs), AI chips intended to improve the artificial intelligence capabilities of its mobile phones and servers.
This gradual trend is turning these companies from mere providers of internet and cloud services into developers of proprietary hardware. Their foray into hardware also turns them from customers into competitors of established chipmakers such as Intel and NVIDIA.