The success of AI lies in the infrastructure

Artificial intelligence (AI) is bringing many changes to the enterprise, and none is more vital to its success than the change to infrastructure. The shifting nature of workloads – not just how they are generated and processed, but how they apply to operational goals – requires changes in the way raw data is handled, and this extends right down to the physical layer of the data stack.

As VB pointed out earlier this year, AI is already changing the way infrastructure is being designed all the way out to the edge. On a more fundamental level, basic hardware is becoming optimized to support AI workloads, and not just on the processor level. But it will take a coordinated effort, and no small amount of vision, to configure hardware to handle AI properly – and indeed, there isn’t likely to be one right way of doing it anyway.

Foundational change for AI infrastructure

In a recent IDC survey of more than 2,000 business leaders, one of the lead findings was the growing realization that AI needs to reside on purpose-built infrastructure if it is to bring real value to the business model. In fact, a lack of proper infrastructure was cited as one of the primary drivers of failed AI projects, a problem that continues to stymie development in more than two-thirds of organizations. As with most technological initiatives, however, the key hurdles to more AI-centric infrastructure include cost, the lack of a clear strategy and the sheer complexity of legacy data environments and infrastructure.

All hardware in the enterprise is interrelated, whether it sits in the data center, the cloud or the edge, and this makes it difficult to simply deploy new platforms and put them to work. But as tech author Tirthajyoti Sarkar points out, there are plenty of ways to gain real value from AI without waiting for the latest generation of optimized chip-level solutions to enter the channel.

Cutting-edge GPUs, for example, may be the solution of choice for advanced deep learning and natural language processing models, but a number of AI applications – some of them quite advanced, such as game theory and large-scale reinforcement learning – are better suited to the CPU. And since much of the heavy lifting in AI development and use is typically performed by front-end data conditioning tools, choices over cores, acceleration technologies and cache may prove more consequential than the type of processor.
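As a rough illustration of that point, here is a minimal sketch – my own, not from the article – that times a simple CPU-bound data-conditioning step against a model forward pass on whichever device is available. PyTorch, the toy model and the tensor sizes are all assumptions chosen for brevity.

```python
# Minimal sketch (illustrative assumptions, not from the article): see where the
# time actually goes in a small pipeline – data conditioning vs. model inference.
import time

import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Front-end data conditioning: per-feature normalization, done on the CPU.
raw = torch.randn(10_000, 512)
start = time.perf_counter()
conditioned = (raw - raw.mean(dim=0)) / (raw.std(dim=0) + 1e-8)
prep_time = time.perf_counter() - start

# A small model's forward pass on whatever processor is present.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
batch = conditioned.to(device)
start = time.perf_counter()
with torch.no_grad():
    _ = model(batch)
infer_time = time.perf_counter() - start

print(f"device={device} | conditioning: {prep_time:.4f}s | inference: {infer_time:.4f}s")
```

When the first number dominates the second – which the article suggests is common – core count, cache and acceleration on the data-preparation side matter more than the choice of training processor.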

Memory architectures may also play a critical role in future AI platforms, says The Next Platform’s Jeffrey Burt. After all, even the fastest chip in the world is of little use if it can’t access data, and for that you need high capacity and high bandwidth in the memory module. To that end, research is turning toward AI-optimized memory solutions that could be shared by CPUs, GPUs and even custom ASICs alongside their own on-chip memory. A key aspect of this development is the open Compute Express Link (CXL) standard, which enables coherent memory sharing across processors and attached devices.

Application-specific AI

It also seems likely that infrastructure will not just be optimized around AI, but around all the different flavors of AI. For example, as NVIDIA notes, natural language processing (NLP) requires so much compute power to handle vast amounts of data that constructs like network fabrics, and the advanced software used to manage them, will prove vital. Again, the idea is not just to throw more raw power at AI but to streamline workflows and coordinate the activities of large, highly scaled and highly distributed data resources to ensure that projects can be completed on time and on budget.
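To make the coordination point concrete, the sketch below shows one common way highly distributed training is orchestrated in software: each process works on its own data shard and gradients are synchronized over the network fabric. PyTorch’s DistributedDataParallel and the toy model are assumptions for illustration; the article does not name a specific framework.

```python
# Minimal sketch (assumed PyTorch DDP setup; the article names no framework).
# Launch with: torchrun --nproc_per_node=4 train_sketch.py
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # torchrun sets RANK, LOCAL_RANK and WORLD_SIZE in the environment.
    dist.init_process_group(backend="nccl" if torch.cuda.is_available() else "gloo")
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    device = torch.device(f"cuda:{local_rank}" if torch.cuda.is_available() else "cpu")

    # A toy stand-in for a large NLP model.
    model = nn.Sequential(
        nn.Embedding(30_000, 256), nn.Flatten(), nn.Linear(256 * 128, 2)
    ).to(device)
    model = DDP(model, device_ids=[local_rank] if torch.cuda.is_available() else None)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    tokens = torch.randint(0, 30_000, (8, 128), device=device)  # fake token batch
    labels = torch.randint(0, 2, (8,), device=device)

    # Each rank computes on its own shard; DDP all-reduces gradients over the fabric.
    loss = nn.functional.cross_entropy(model(tokens), labels)
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

The interesting work here happens below the application code – in the collective-communication layer and the network fabric it runs over – which is exactly the infrastructure the article argues will have to be purpose-built.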

None of this will happen overnight, of course. It took decades to bring data infrastructure to the state it’s in today, and it will take more years to tailor it to the needs of AI. But the incentive to achieve this is strong, and now that most enterprise infrastructure is being positioned by cloud providers as a core, revenue-generating asset, the need to be at the forefront of this transition will likely drive all new deployments going forward.

Despite all the changes that AI is poised to bring to the enterprise, one thing remains the same: AI is all about data, and data lives in infrastructure. The only way to ensure that AI’s promises can be turned into reality is to create the right physical underpinnings to allow the intelligence to work its magic.

Author: Arthur Cole
Source: VentureBeat
