Facebook has released PyTorch Mobile for deploying machine learning models on Android and iOS devices.
PyTorch Mobile was released today alongside PyTorch 1.3, the latest version of Facebook’s open source deep learning library, which adds quantization, support for Google Cloud TPUs, and tools like Captum, which supplies explainability for machine learning models.
PyTorch Mobile launches in experimental mode: effectively a public beta whose APIs may still change or evolve before a general availability release.
“We fully expect this to be extended to support kind of generic Linux embedded platforms like Raspberry Pi that people are already deploying on today, but support [at launch] will be focused mainly on iOS and Android,” PyTorch product manager Joe Spisak told VentureBeat in a phone interview.
The release of PyTorch Mobile, which follows the earlier release of Caffe2go, will compete with offerings from other tech giants like Apple’s Core ML 3 and Google’s TensorFlow Lite 1.0.
A key difference between PyTorch Mobile and predecessors like Caffe2go is that it launches with its own runtime and was created with the assumption that anything you want to do on a mobile or edge device you may also want to do on a server, Spisak said.
PyTorch Mobile was one of several newsworthy upgrades announced today for Facebook’s popular deep learning library. Also new today: PyTorch version 1.3 with quantization, named tensors for cleaner code, and the ability to use Google Cloud TPUs.
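Of those 1.3 additions, named tensors are the most visible change to everyday model code: dimensions can carry names, so operations refer to `'N'` or `'C'` rather than fragile positional indices. A small sketch of the experimental API (dimension names here are illustrative):

```python
import torch

# A batch of 4 samples with 3 channels; each dimension gets a name.
imgs = torch.randn(4, 3, names=('N', 'C'))

# Reduce over the channel dimension by name instead of by index.
means = imgs.mean('C')

print(means.names)  # the remaining dimension keeps its 'N' name
```

Because the feature is experimental in 1.3, coverage of named-tensor operations is still partial and the API may change.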
The Captum tool for explaining machine learning models and CrypTen for privacy-focused machine learning were also introduced today.
PyTorch 1.3 launches with quantization support, a feature key to mobile deployments. Google also brought quantization to TensorFlow Lite earlier this year.
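The simplest entry point to that quantization support is dynamic quantization, which stores weights as 8-bit integers and quantizes activations on the fly, shrinking models for mobile deployment. A minimal sketch, using a throwaway linear model for illustration:

```python
import torch

# A toy float32 model standing in for a real trained network.
float_model = torch.nn.Sequential(torch.nn.Linear(16, 4)).eval()

# Convert Linear layers to dynamically quantized int8 equivalents.
quantized = torch.quantization.quantize_dynamic(
    float_model, {torch.nn.Linear}, dtype=torch.qint8
)

# The quantized model is a drop-in replacement at inference time.
out = quantized(torch.randn(2, 16))
```

Static and quantization-aware training workflows are also available for models where dynamic quantization leaves too much accuracy or speed on the table.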
Author: Khari Johnson
Source: VentureBeat