
Cloud, edge or on-prem? Navigating the new AI infrastructure paradigm

No doubt, enterprise data infrastructure continues to transform with technological innovation, most notably today due to data- and resource-hungry generative AI.

As gen AI changes the enterprise itself, leaders continue to grapple with the cloud/edge/on-prem question. On the one hand, they need near-instant access to data; on the other, they need to know that that data is protected.

As they face this conundrum, more and more enterprises are seeing hybrid models as the way forward, as they can exploit the distinct advantages that cloud, edge and on-prem models each offer. Case in point: 85% of cloud buyers have either deployed or are in the process of deploying a hybrid cloud, according to IDC.

“The pendulum between the edge and the cloud and all the hybrid flavors in between has kept shifting over the past decade,” Priyanka Tembey, co-founder and CTO at runtime application security company Operant, told VentureBeat. “There are quite a few use cases coming up where compute can benefit from running closer to the edge, or as a combination of edge plus cloud in a hybrid manner.”


The shifting data infrastructure pendulum

For a long time, cloud was associated with hyperscale data centers — but that is no longer the case, explained Dave McCarthy, research VP and global research lead for IDC’s cloud and edge services. “Organizations are realizing that the cloud is an operating model that can be deployed anywhere,” he said.

“Cloud has been around long enough that it is time for customers to rethink their architectures,” he said. “This is opening the door for new ways of leveraging hybrid cloud and edge computing to maximize the value of AI.”

AI, notably, is driving the shift to hybrid cloud and edge because models need more and more computational power as well as access to large datasets, noted Miguel Leon, senior director at app modernization company WinWire.

“The combination of hybrid cloud, edge computing and AI is changing the tech landscape in a big way,” he told VentureBeat. “As AI continues to evolve and becomes a de facto embedded technology to all businesses, its ties with hybrid cloud and edge computing will only get deeper and deeper.”

Edge addresses issues cloud can’t solve alone

According to IDC research, spending on edge is expected to reach $232 billion this year. This growth can be attributed to several factors, McCarthy noted — each of which addresses a problem that cloud computing can’t solve alone.

One of the most significant is latency-sensitive applications. “Whether introduced by the network or the number of hops between the endpoint and server, latency represents a delay,” McCarthy explained. For instance, vision-based quality inspection systems used in manufacturing require real-time response to activity on a production line. “This is a situation where milliseconds matter, necessitating a local, edge-based system,” he said.

“Edge computing processes data closer to where it’s generated, reducing latency and making businesses more agile,” Leon agreed. It also supports AI apps that need fast data processing for tasks like image recognition and predictive maintenance.
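
To make the tradeoff concrete, the sketch below (with purely illustrative numbers, not figures from the article) checks whether a workload’s latency budget can absorb a cloud round trip or whether inference has to run locally at the edge.

```python
# Minimal sketch of a latency-budget check (illustrative numbers only): decide whether
# an inference workload can tolerate a cloud round trip or needs to run at the edge.

def requires_edge(app_latency_budget_ms: float,
                  network_rtt_ms: float,
                  cloud_inference_ms: float,
                  edge_inference_ms: float) -> bool:
    """Return True when only local processing fits inside the application's latency budget."""
    cloud_total = network_rtt_ms + cloud_inference_ms   # request travels to the cloud and back
    edge_total = edge_inference_ms                      # no wide-area network hop
    return cloud_total > app_latency_budget_ms >= edge_total

if __name__ == "__main__":
    # A vision-based quality inspection line that must react within roughly 30 ms.
    print(requires_edge(app_latency_budget_ms=30,
                        network_rtt_ms=40,
                        cloud_inference_ms=15,
                        edge_inference_ms=10))  # True: only an edge deployment fits the budget
```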

Edge is also beneficial in limited-connectivity environments, McCarthy noted, such as those where internet of things (IoT) devices may be mobile, move in and out of coverage areas or experience limited bandwidth. In certain cases (autonomous vehicles, for one), AI must be operational even if a network is unavailable.

Another issue that spans all computing environments is data — and lots of it. According to the latest estimates, approximately 328.77 million terabytes of data are generated every day. By 2025, the volume of data is expected to increase to more than 170 zettabytes, representing a more than 145-fold increase in 15 years.

As data in remote locations continues to increase, costs associated with transmitting it to a central data store also continue to grow, McCarthy pointed out. However, in the case of predictive AI, most inference data does not need to be stored long-term. “An edge computing system can determine what data is necessary to keep,” he said.
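
As an illustration of that idea, the following sketch (hypothetical record fields and thresholds, not a production policy) shows an edge-side filter that keeps only flagged or low-confidence inference results and drops the routine majority rather than shipping everything to a central data store.

```python
# Minimal sketch of an edge-side retention filter (hypothetical names and thresholds):
# keep only inference results that are flagged or low-confidence, and discard the
# routine majority instead of transmitting it all to a central data store.

from dataclasses import dataclass

@dataclass
class InferenceRecord:
    device_id: str
    label: str          # model prediction, e.g. "defect" / "ok"
    confidence: float   # model confidence in [0, 1]

def should_keep(record: InferenceRecord,
                interesting_labels=("defect",),
                low_confidence_threshold=0.6) -> bool:
    """Retain records worth central storage: flagged labels or uncertain predictions."""
    return record.label in interesting_labels or record.confidence < low_confidence_threshold

def filter_batch(records):
    """Split a batch into records to upload and a count of records dropped locally."""
    keep = [r for r in records if should_keep(r)]
    return keep, len(records) - len(keep)

if __name__ == "__main__":
    batch = [
        InferenceRecord("camera-01", "ok", 0.98),
        InferenceRecord("camera-01", "defect", 0.91),
        InferenceRecord("camera-02", "ok", 0.52),
    ]
    to_upload, dropped = filter_batch(batch)
    print(f"uploading {len(to_upload)} records, dropped {dropped} locally")
```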

Also, whether due to government regulation or corporate governance, there can be restrictions on where data can reside, McCarthy noted. As governments continue to pursue data sovereignty legislation, businesses are increasingly challenged with compliance. This can occur when cloud or data center infrastructure is located outside a local jurisdiction. Edge can come in handy here as well.

With AI initiatives quickly moving from proof-of-concept trials to production deployments, scalability has become another big issue.

“The influx of data can overwhelm core infrastructure,” said McCarthy. He explained that, in the early days of the internet, content delivery networks (CDNs) were created to cache content closer to users. “Edge computing will do the same for AI,” he said.

Benefits and uses of hybrid models

Different cloud environments have different benefits, of course. For example, McCarthy noted, auto-scaling to meet peak usage demands is “perfect” for public cloud. Meanwhile, on-premises data centers and private cloud environments can help secure and provide better control over proprietary data. The edge, for its part, provides resiliency and performance in the field. Each plays its part in an enterprise’s overall architecture.

“The benefit of a hybrid cloud is that it allows you to choose the right tool for the job,” said McCarthy.

He pointed to numerous use cases for hybrid models: For instance, in financial services, mainframe systems can be integrated with cloud environments so that institutions can maintain their own data centers for banking operations while leveraging the cloud for web and mobile-based customer access. Meanwhile, in retail, local in-store systems can continue to process point-of-sale transactions and inventory management independently of the cloud should an outage occur.

“This will become even more important as these retailers roll out AI systems to track customer behavior and prevent shrinkage,” said McCarthy.

Tembey also pointed out that a hybrid approach, combining AI that runs locally on a device or at the edge with larger private or public models, can preserve sensitive data when strict isolation techniques are used.
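
One way to picture such an approach is the routing sketch below; the keyword check and model endpoints are placeholders rather than anything described by Tembey, but they show the pattern of keeping sensitive prompts on a local or edge model while sending everything else to a larger cloud-hosted model.

```python
# Minimal sketch of hybrid routing (hypothetical classifier and model stand-ins):
# prompts touching sensitive data are answered by a local/edge model, while
# everything else is sent to a larger cloud-hosted model.

SENSITIVE_KEYWORDS = {"ssn", "account number", "patient", "diagnosis"}

def is_sensitive(prompt: str) -> bool:
    """Naive placeholder check; a real deployment would use proper policy or classification tooling."""
    text = prompt.lower()
    return any(keyword in text for keyword in SENSITIVE_KEYWORDS)

def run_local_model(prompt: str) -> str:
    # Stand-in for an on-device or edge-hosted model kept inside the trust boundary.
    return f"[local model] handled: {prompt[:40]}..."

def run_cloud_model(prompt: str) -> str:
    # Stand-in for a larger model hosted in a public or private cloud.
    return f"[cloud model] handled: {prompt[:40]}..."

def route(prompt: str) -> str:
    """Keep sensitive prompts local; send the rest to the larger cloud model."""
    return run_local_model(prompt) if is_sensitive(prompt) else run_cloud_model(prompt)

if __name__ == "__main__":
    print(route("Summarize this patient diagnosis report"))
    print(route("Draft a product announcement for our new feature"))
```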

That’s not to say there aren’t downsides: McCarthy pointed out, for instance, that hybrid can increase management complexity, especially in mixed-vendor environments.

“That is one reason why cloud providers have been extending their platforms to both on-prem and edge locations,” he said, adding that original equipment manufacturers (OEMs) and independent software vendors (ISVs) have also increasingly been integrating with cloud providers.

Interestingly, at the same time, 80% of respondents to an IDC survey indicated that they either have or plan to move some public cloud resources back on-prem.

“For a while, cloud providers tried to convince customers that on-premises data centers would go away and everything would run in the hyperscale cloud,” McCarthy noted. “That has proven not to be the case.”


Author: Taryn Plumb
Source: VentureBeat
Reviewed By: Editorial Team

