Huawei this week announced that MindSpore, a framework for AI app development that the company detailed in August 2019, is now open source and available on GitHub and Gitee. The lightweight suite is akin to Google’s TensorFlow and Facebook’s PyTorch, and it scales across device, edge, and cloud environments, ostensibly lowering the barrier to entry for developers looking to imbue apps with AI.
MindSpore, which has the backing of partners like the University of Edinburgh, Peking University, Imperial College London, and robotics startup Milvus, runs atop processors, graphics cards, and dedicated neural processing units like those in Huawei’s Ascend AI chips. Huawei says it requires 20% fewer lines of code than “leading” frameworks for typical natural language processing models, which the company claims translates into a 50% boost in development efficiency on average. Moreover, it supports parallel training across hardware and dynamic debugging, enabling developers to isolate bugs while cutting down on model training time.
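For a sense of what that looks like in practice, below is a minimal sketch of a network defined with MindSpore’s nn.Cell building blocks, with the context switched into the eager-style PyNative mode that underpins the dynamic debugging Huawei describes. It is illustrative only: the API shown reflects MindSpore’s early open-source releases, and exact names and arguments may differ in later versions.

```python
import mindspore.nn as nn
from mindspore import context

# Illustrative sketch based on MindSpore's early open-source API (nn.Cell,
# nn.Dense, context.set_context); exact names may vary between releases.
# PYNATIVE_MODE runs operations eagerly, which makes step-by-step debugging
# easier; GRAPH_MODE compiles the whole network for faster training.
context.set_context(mode=context.PYNATIVE_MODE, device_target="CPU")

class SimpleNet(nn.Cell):
    """A small feed-forward classifier expressed as an nn.Cell."""
    def __init__(self, in_features=784, hidden=128, num_classes=10):
        super(SimpleNet, self).__init__()
        self.fc1 = nn.Dense(in_features, hidden)
        self.relu = nn.ReLU()
        self.fc2 = nn.Dense(hidden, num_classes)

    def construct(self, x):
        # construct() plays the role of forward() in other frameworks
        return self.fc2(self.relu(self.fc1(x)))

net = SimpleNet()
```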
Somewhat unusually, MindSpore doesn’t process data directly; instead, it ingests only the gradient and model information that has already been processed locally. In this way, it protects sensitive data even in “cross-scenario” environments while ensuring that models remain robust.
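In the abstract, that works much like federated learning: each device computes updates against its own data and shares only those updates, never the records themselves. The plain-Python sketch below illustrates the idea; it is not MindSpore’s API, and the function names are hypothetical.

```python
import numpy as np

# Conceptual illustration only -- not MindSpore's API. Devices compute
# gradients on data that never leaves them; a coordinator sees only gradients.

def local_gradient(weights, features, labels):
    """Mean-squared-error gradient computed entirely on-device."""
    errors = features @ weights - labels
    return features.T @ errors / len(labels)

def aggregate(gradients):
    """The coordinator averages gradients; it never touches raw records."""
    return np.mean(gradients, axis=0)

rng = np.random.default_rng(0)
weights = np.zeros(3)
# Two devices, each holding private (features, labels) pairs.
device_data = [(rng.normal(size=(8, 3)), rng.normal(size=8)) for _ in range(2)]

grads = [local_gradient(weights, X, y) for X, y in device_data]
weights -= 0.1 * aggregate(grads)  # the shared model is updated from gradients alone
print("updated weights:", weights)
```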
Complementing MindSpore is MindInsight, a module that provides debugging and tuning capabilities by producing visualizations of the training process, including computation graphs, training progress metrics, and model parameter information like training data and accuracy. Another module, called MindArmour, is intended to enhance model security and trustworthiness with submodules for adversarial example generation and detection, model defense, and model evaluation.
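The adversarial example generation that MindArmour covers can be illustrated with the classic fast gradient sign method (FGSM), which nudges an input along the sign of the loss gradient until a model misbehaves. The NumPy sketch below demonstrates the generic technique only; it does not use MindArmour’s actual interfaces.

```python
import numpy as np

# Generic fast gradient sign method (FGSM) illustration -- not MindArmour's API.
# A small, loss-increasing perturbation is added to the input to fool a model.

def fgsm_perturb(x, loss_gradient_wrt_x, epsilon=0.05):
    """Return an adversarial version of x, shifted by epsilon along the gradient sign."""
    return x + epsilon * np.sign(loss_gradient_wrt_x)

# Example with a linear scorer: score = w . x, and the "loss" grows with the
# score, so the gradient of the loss with respect to x is simply w.
w = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 1.0, 1.0])
x_adv = fgsm_perturb(x, loss_gradient_wrt_x=w)

print("clean score:", w @ x)             # original prediction score
print("adversarial score:", w @ x_adv)   # nudged upward by the perturbation
```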
“MindSpore natively adapts to all scenarios,” said Huawei chief scientist Chen Lei in a statement. “We implement ‘AI algorithms as code’ through on-demand collaboration for easier model development, and [we] provide cutting-edge technologies and co-optimization with Huawei Ascend AI processors to improve runtime efficiency and computing performance.”
MindSpore requires Python 3.7+, and it’ll soon support languages like C++, Rust, and Julia, according to Huawei. It currently runs best on Linux distributions like Ubuntu and EulerOS.
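For anyone trying the release, a quick sanity check of those requirements might look like the snippet below. It assumes MindSpore has already been installed from the packages Huawei distributes; the version attribute is a standard Python packaging convention rather than anything MindSpore-specific.

```python
import sys

# Quick environment check for the requirements above. Assumes MindSpore has
# already been installed for this interpreter.
assert sys.version_info >= (3, 7), "MindSpore requires Python 3.7 or newer"

try:
    import mindspore
    print("MindSpore version:", mindspore.__version__)
except ImportError:
    print("MindSpore is not installed in this environment")
```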
Huawei also took the wraps off ModelArts Pro, an extension of its web-based ModelArts platform that provides full-pipeline services to customers, including data collection and model development.
ModelArts Pro — which supports applications like image classification, object detection, predictive analysis, and sound classification — automatically performs steps like model training, compression, and deployment based on the data it receives. It integrates a programming notebook (along with commonly used AI frameworks and software libraries) that lets users optionally create and debug models themselves, as well as services that streamline the process of bringing those models to the cloud or edge.
MindSpore’s debut comes after the launch of Huawei’s Ascend 910, a chipset in the company’s Ascend-Max family that’s optimized for AI model training, and the Ascend 310, an Ascend-Mini series inferencing chip designed to tackle tasks like image analysis, optical character recognition, and object recognition. The Ascend 910 is aimed principally at datacenter workloads, while the Ascend 310 targets internet-connected devices like smartphones, smartwatches, and other internet of things (IoT) products.
Author: Kyle Wiggers.
Source: VentureBeat