
Intel on why orgs are stalling in their AI efforts — and how to gun the engine

Right now, 60% of global GDP is digital, according to the World Economic Forum, and some 80 zettabytes of data will be generated in 2022 alone, a figure projected to grow to 180 zettabytes by 2025. Driven by digitization, industries around the world are on the cusp of an era of sustained growth, if they can unlock the secret of turning vast amounts of data into actionable insights.

The democratization of data is key, Kavitha Prasad, VP & GM of datacenter, AI and cloud execution and strategy at Intel, told VB CEO Matt Marshall at Transform on Tuesday.

“It’s not easy to get the insights with so much data by using traditional methods,” Prasad said. “AI is going to change that. For that to happen, we need to invest in AI today, both in people and in technology.”

While the rate of AI innovation is growing exponentially, AI is still in the early stages of deployment. Analyst reports find that while 80% of businesses are investing in AI, only 20% of them are actually reaping the benefits. And even where AI is deployed broadly, it tends to be in places where the consequences of failure are minimal. Traditional machine learning, probabilistic methods and other forms of analytics have existed for as long as data has, but with the rate at which data is growing, these traditional methods need to be augmented with advanced technologies like deep learning to reach the necessary business outcomes.

“A lot of the world today is focused on training these large-scale models, and less so on deploying it in production environments,” Prasad said. “If we’re really talking about democratizing AI, we need to see that 80% of the remaining use cases — the brick and mortar stores, the systems on the roads, the telco infrastructure — everything needs to be reaping the benefits of AI.”

There are several challenges to realizing the benefits of that investment. The first is that many companies forget AI is, first and foremost, a software problem: you need to cater to the demands of developers and the ecosystem, not just the raw performance per watt of your hardware. The second is that AI is itself a continuous, iterative process. Naturally, most organizations start with training, but when it comes to deploying these models, the whole process needs to be holistic: gathering the data, training the model, testing it, deploying it, and then maintaining and monitoring it.
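To make that lifecycle concrete, here is a minimal sketch of the gather-train-test-deploy-monitor loop in Python, using scikit-learn on a bundled toy dataset. The file name and workflow are illustrative, not part of any Intel tooling.

```python
# A minimal sketch of the gather/train/test/deploy/monitor loop,
# using scikit-learn on a bundled toy dataset. File names and the
# workflow itself are illustrative, not Intel tooling.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
import joblib

# 1. Gather data (a stand-in for a real ingestion pipeline).
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# 2. Train the model.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# 3. Test before deploying.
print(f"holdout accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")

# 4. Deploy: persist the artifact for whatever serves predictions.
joblib.dump(model, "model-v1.joblib")

# 5. Maintain and monitor: in production, log predictions, compare them
# against incoming ground truth, and retrain when performance degrades
# (see the drift sketch below).
```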

Cloud companies leverage MLOps, making it relatively easy to deploy, maintain and monitor models. Edge and hybrid companies, by contrast, have to develop in the cloud, deploy on the edge, monitor on the edge and maintain back in the cloud. The data a model was trained on and the ground truth it encounters in production can be completely different, and without that maintenance, models start to decay and performance deteriorates.
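One way to catch that gap between training data and production reality is to statistically compare the two distributions. The sketch below uses a two-sample Kolmogorov-Smirnov test from SciPy; the 0.05 significance threshold and the synthetic data are illustrative assumptions, not a standard.

```python
# A sketch of one way to detect drift: compare the distribution of
# each feature seen at training time against what the deployed model
# receives. The 0.05 significance threshold is an illustrative choice.
import numpy as np
from scipy.stats import ks_2samp

def drifted_features(X_train, X_prod, alpha=0.05):
    """Indices of features whose production distribution has drifted,
    per a two-sample Kolmogorov-Smirnov test."""
    drifted = []
    for j in range(X_train.shape[1]):
        _, p_value = ks_2samp(X_train[:, j], X_prod[:, j])
        if p_value < alpha:
            drifted.append(j)
    return drifted

# Synthetic example: feature 0 shifts in production, feature 1 does not.
rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(1000, 2))
X_prod = np.column_stack(
    [rng.normal(0.8, 1.0, 1000), rng.normal(0.0, 1.0, 1000)]
)
print(drifted_features(X_train, X_prod))  # feature 0 should be flagged
```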

The third piece of the puzzle is the data itself: accessibility, quality and sharing. From privacy issues to the garbage-in, garbage-out conundrum, where biased or low-quality data becomes a liability to security and explainability, managing data can become a tremendous undertaking.
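A minimal sketch of what automated data-quality gates can look like in practice, using pandas; the 5%, 1% and 10% thresholds, the column names and the toy dataframe are hypothetical, not an industry standard.

```python
# A sketch of automated data-quality gates with pandas. The 5%, 1%
# and 10% thresholds and the toy dataframe are hypothetical.
import pandas as pd

def data_quality_warnings(df: pd.DataFrame, label_col: str) -> list:
    warnings = []
    # Missing values: garbage in, garbage out.
    null_rates = df.isna().mean()
    for col, rate in null_rates[null_rates > 0.05].items():
        warnings.append(f"{col}: {rate:.1%} missing values")
    # Duplicate rows quietly inflate evaluation metrics.
    dup_rate = df.duplicated().mean()
    if dup_rate > 0.01:
        warnings.append(f"{dup_rate:.1%} duplicate rows")
    # Severe label imbalance is one common source of biased models.
    rarest = df[label_col].value_counts(normalize=True).min()
    if rarest < 0.10:
        warnings.append(f"label imbalance: rarest class is {rarest:.1%}")
    return warnings

df = pd.DataFrame({"age": [34, None, 41, 41], "label": [1, 0, 1, 1]})
for warning in data_quality_warnings(df, "label"):
    print(warning)
```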

“It’s all these processes that need to come together for us to actually go deploy AI meaningfully in the industry,” Prasad said. “That’s when we can say we’re closer to democratizing AI.”

It’s essential to start working through these challenges as soon as possible, in order to stay abreast of the inexorable changes bearing down on the world. But unless you want to run into issues down the road, you need to take a moment to think through the system and architecture you already have in place. You need to consider the cost and complexity that opening your doors to AI and data will bring, from moving workloads to scaling your projects.

More companies than ever offer point solutions for every permutation of AI use case. But from an enterprise perspective, the key is determining how to put all these disparate pieces together, how to manage your data at every step of the process, how to ensure it is high quality and secure, and how to deploy your model — or in other words, how to combine your predictive analytics with your business acumen to get meaningful results.

And while AI is a software problem, hardware is still an essential piece of the puzzle. Intel is addressing compute needs with AI capabilities embedded in its FPGAs, GPUs and CPUs. The company is also focused on integrating its hardware with software to offer customers a solid foundation for their AI efforts.

From a security perspective, Intel is also investing in technologies like federated learning and homomorphic encryption. Case in point: its partnership with the University of Pennsylvania, the largest federated learning use case deployed to date, which ran on 25,000 MRI scans across seven continents to detect a rare brain disease that occurs in one out of 100,000 people, all while preserving the privacy of the medical data used in the project.
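The core idea behind that project is simple to sketch: each hospital trains on its own data and shares only model weights, which a central server averages. Below is a toy federated-averaging (FedAvg) round in NumPy on synthetic data; it illustrates the concept only and is not the framework Intel and Penn actually used.

```python
# A toy federated-averaging (FedAvg) loop in NumPy: three "hospitals"
# each take a gradient step on private data, and a server averages the
# resulting weights. Entirely synthetic, and not the framework Intel
# and Penn actually used; just the core idea.
import numpy as np

def local_step(w, X, y, lr=0.1):
    """One logistic-regression gradient step on a site's private data."""
    preds = 1.0 / (1.0 + np.exp(-X @ w))
    return w - lr * (X.T @ (preds - y)) / len(y)

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0])

# Each site's data never leaves the site.
sites = []
for _ in range(3):
    X = rng.normal(size=(200, 2))
    y = (1.0 / (1.0 + np.exp(-X @ true_w)) > rng.random(200)).astype(float)
    sites.append((X, y))

w_global = np.zeros(2)
for _ in range(200):
    # Sites refine the current global model locally...
    local_ws = [local_step(w_global, X, y) for X, y in sites]
    # ...and the server averages weights, never seeing raw data.
    w_global = np.mean(local_ws, axis=0)

print(w_global)  # moves toward true_w without pooling any patient data
```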

“We work with partners and disruptor programs to make sure we bring the ecosystem along with us, to make sure there are continuous advancements in AI,” she said.

Get more insight into launching an AI strategy, including how companies are adding intelligence everywhere from the data center, through the networks, to the edge and out to consumer devices, by registering for a free virtual Transform pass right here.


Author: VB Staff
Source: VentureBeat
