Intel has announced its Xeon 6 chip to ensure that data centers can handle the workloads generated as more companies deploy AI apps and models. The processor comes in two microarchitectures, an Efficient-core (E-core) and a Performance-core (P-core), and will be available in the 6700 and 6900 platform offerings. However, only the 6700 E-core version will launch on June 4. The 6900 P-core version will debut in Q3 2024, with the remaining SKUs arriving in Q1 2025.
“It’s all about enabling customers in the industry to deliver real business outcomes,” Matt Langman, Intel’s vice president and general manager, said in a news conference. “We see every company becoming an AI company, whether these companies are looking to be more efficient in their operations, more efficient in their product development, more efficient—or just more effective—with their customer engagements.”
In addition to the Xeon 6 processor, Intel is revealing pricing for its Gaudi 2 and Gaudi 3 accelerator chips for the first time. A standard AI kit featuring the Gaudi 2, introduced in 2022, will cost $65,000, while the Gaudi 3, which was unveiled in April, will list at $125,000.
Xeon 6: What you need to know
Intel calls the Xeon 6 a “robust computing platform” that “excels at both performance and efficiency,” two areas that it claims are crucial for “meeting the ever-growing demands of the data center.” The processor supports a wide array of use cases, from compute-intensive AI and high-performance computing to traditional enterprise applications and those that are power-efficient and high-density.
The plan is for companies to modernize their aging data center systems with Xeon 6 chips, which Intel says will increase cost savings, help meet sustainability goals, optimize physical floor and rack space, and unlock new digital capabilities.
The Xeon 6 P-core and E-core, formerly code-named Granite Rapids and Sierra Forest, respectively, share a hardware platform foundation and software stack. Both support increased core counts, greater memory bandwidth with DDR5, Multiplexed Combined Rank (MCR) DIMMs, increased inter-socket bandwidth with UPI 2.0, Compute Express Link (CXL) 2.0, a common OS and firmware, and more.
Citing the varied requirements of different data centers, Intel offers two microarchitectures for its Xeon 6; the choice comes down to workloads and capabilities. The P-core will likely be more appropriate for high-performance computing and compute-intensive AI, though it also serves the “broadest array of enterprise applications.” Langman explains, “It really is some of these latency-sensitive workloads that the benefits of a P-core will help meet or exceed these usages for high single-threaded and high per-core performance capabilities.” Workloads involving cloud-native web, microservices and digital services, on the other hand, will likely benefit from the Xeon 6 E-core, “taking advantage of the higher density and improved performance per watt.”
So, what are the differences between the Xeon 6 6700 Series and the 6900 Series?
Xeon 6 Processors (6700 Series)
- Up to 144 Efficient-cores / 86 Performance-cores
- Socket Support: 1S/2S and 4S/8S (P-core only)
- Max TDP: Up to 350W per CPU
- Memory: 8-channel DDR5 up to 6400 MT/s; MCR DIMM up to 8000 MT/s (P-core)
- PCIe/CXL: Up to 88 lanes PCIe 5.0/CXL 2.0
- UPI Links: 4 UPI 2.0 links, up to 24 GT/s
Xeon 6 Processors (6900 Series)
- Up to 288 Efficient-cores / 128 Performance-cores
- Socket Support: 1S/2S
- Max TDP: Up to 500W per CPU
- Memory: 12-channel DDR5 up to 6400 MT/s; MCR DIMM up to 8800 MT/s (P-core)
- PCIe/CXL: Up to 96 lanes PCIe 5.0/CXL 2.0
- UPI Links: 6 UPI 2.0 links, up to 24 GT/s
What’s with the staggered release?
Not every variation of the Xeon 6 will be available today. Intel says it is deliberately staggering the release of the different processor SKUs because it’s what customers apparently want. “The thing that made the most sense for the variety of customers we’re covering with Xeon 6 is the kind of staggered order you see. And what’s exciting is as we’re getting closer to launching the first one, it is in a great order, and we’re getting a lot of market excitement around it, and it’s fun watching it happen,” Ryan Tabrah, Intel’s vice president and general manager for its E-core product line, clarifies.
Xeon 6 E-core bragging rights
The Xeon 6 E-core is the first to debut, and the first Xeon generation to include E-cores, so Intel has provided some metrics: it says the chip enables 3-to-1 rack-level consolidation, rack-level performance gains of up to 4.2 times and performance-per-watt gains of up to 2.6 times compared with second-generation Intel Xeon processors on media transcode workloads.
“As we look at today’s data center, and let’s say you take approximately 200 racks, which would be a typical mid-sized data center deployment, looking at about 15-kilowatt racks and 22 u servers, and you look at how they’re delivering media streams per second on second-gen Intel Xeon. You look at the advancements we’re delivering on Intel Xeon 6 with the perf per watt and overall performance improvement, and you get that 3-to-1 rack consolidation down to 66 racks—huge, amazing savings in rack space and rack capability,” Langman says.
“But…it’s not only the savings in racks, but it’s the energy savings and the carbon footprint. And what we see is, from a fleet energy usage, a savings of upwards of 84,000 megawatt-hours over the period of four years—significant energy savings—combined with reducing carbon emissions of 34,000 metric tons…over that same period of time. So we get the benefits of rack consolidation and enabling the industry sustainability goals.”
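Those consolidation and energy figures are straightforward to sanity-check. The Python sketch below reruns the arithmetic using only the numbers quoted above; the assumption that racks run at full power around the clock is an illustrative simplification, not Intel's methodology.

```python
# Back-of-the-envelope check of the rack-consolidation math quoted above.
# The 200-rack baseline, 3-to-1 ratio and 15 kW per rack come from Langman's
# remarks; assuming racks draw full power continuously is a simplification.

BASELINE_RACKS = 200        # "typical mid-sized" deployment cited above
CONSOLIDATION_RATIO = 3     # claimed 3-to-1 rack consolidation
RACK_POWER_KW = 15          # per-rack power budget mentioned in the quote
YEARS = 4
HOURS_PER_YEAR = 8_760

consolidated_racks = BASELINE_RACKS // CONSOLIDATION_RATIO   # 66 racks
racks_saved = BASELINE_RACKS - consolidated_racks            # 134 racks

energy_saved_mwh = racks_saved * RACK_POWER_KW * HOURS_PER_YEAR * YEARS / 1_000

print(f"Racks after consolidation: {consolidated_racks}")
print(f"Naive 4-year energy saving: {energy_saved_mwh:,.0f} MWh")
# Prints roughly 70,000 MWh, in the same ballpark as Intel's ~84,000 MWh
# claim; the gap reflects utilization and fleet assumptions Intel did not
# spell out in the briefing.
```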
Gaudi 3: Pricing and new system providers
Besides the Xeon 6 news, Intel is releasing pricing for two of its latest Gaudi products. Designed to compete against Nvidia’s H100, these accelerator chips train and run inference on large language models at what Intel says is a lower total cost of operation.
Companies using a standard AI kit with eight Intel Gaudi 2 accelerators with a universal baseboard will pay $65,000, which Intel estimates to be one-third of the cost of comparable alternatives. A similar kit with eight Intel Gaudi 3 accelerators will be $125,000, or about two-thirds the price of Intel’s competitors.
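Taken at face value, those ratios imply rough price points for comparable systems, which the short calculation below backs out; the implied figures are inferences from Intel's own framing, not quoted competitor list prices.

```python
# Back-calculating the competitor pricing implied by Intel's cost ratios.
# These outputs are illustrative inferences, not quoted list prices.

GAUDI2_KIT_USD = 65_000    # 8x Gaudi 2 accelerators + universal baseboard
GAUDI3_KIT_USD = 125_000   # 8x Gaudi 3 accelerators + universal baseboard

implied_rival_vs_gaudi2 = GAUDI2_KIT_USD / (1 / 3)   # ~$195,000
implied_rival_vs_gaudi3 = GAUDI3_KIT_USD / (2 / 3)   # ~$187,500

print(f"Implied comparable system (vs. Gaudi 2 kit): ${implied_rival_vs_gaudi2:,.0f}")
print(f"Implied comparable system (vs. Gaudi 3 kit): ${implied_rival_vs_gaudi3:,.0f}")
```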
Although Intel disclosed pricing, it did not provide any more details about Gaudi 3’s availability beyond stating that the chip is on track for release in Q3 2024.
To help with its go-to-market strategy, the company is partnering with at least ten system providers, including Dell, Hewlett-Packard Enterprise, Lenovo, Supermicro and new additions Asus, Foxconn, Gigabyte, Inventec, Quanta and Wistron.
AI use cases for Xeon and Gaudi
When asked how Xeon and Gaudi will work together, Tabrah describes the relationship as complementary. “It’s unleashing customers’ data centers,” he states. “They’re running into these power bottlenecks. The data center itself is almost becoming the bottleneck. And if you can unleash that data center for yourself just by quickly and easily moving over to a very efficient general compute architecture and not have to touch anything, then you can go and just innovate with the rest of your infrastructure to do more AI; that’s just awesome.”
Anil Nanduri, another of Intel’s vice presidents, points out that it’s all about helping the enterprise unlock its data. With the introduction of generative AI, organizations are searching for the best ways to apply large language models to their workloads at a computational cost they’re comfortable investing in. “You’re going to see where accelerators will still run those [LLMs], but a RAG-like use case where I can keep my datasets more current, and I can get better outcomes for the customers who are doing knowledge discovery or other use cases, keep their vector embeddings on the Xeon and then connect it to a model that’s running on an accelerator,” he shares. “This is a great use case where we will expect to see a lot of good compatibility, performance and value Xeon and Gaudi can bring.”
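The split Nanduri describes, retrieval over vector embeddings on the Xeon host with generation handed off to an accelerator, might look roughly like the Python sketch below. The embed() and generate_on_accelerator() helpers are hypothetical placeholders standing in for a CPU-friendly embedding model and an accelerator-served LLM endpoint; neither is an Intel or Gaudi API.

```python
# Minimal sketch of the CPU/accelerator split Nanduri describes: retrieval
# over vector embeddings stays on the Xeon host, while LLM generation is
# handed off to an accelerator. embed() and generate_on_accelerator() are
# hypothetical placeholders, not Intel or Gaudi APIs.

import numpy as np


def embed(texts: list[str]) -> np.ndarray:
    """Placeholder embedder; a real deployment would run a CPU-friendly
    embedding model here. Hash-seeded noise keeps the sketch self-contained."""
    vecs = [np.random.default_rng(abs(hash(t)) % 2**32).normal(size=384)
            for t in texts]
    return np.asarray(vecs, dtype=np.float32)


def generate_on_accelerator(prompt: str) -> str:
    """Placeholder for an LLM call served from an accelerator (e.g. Gaudi)."""
    return f"[answer generated from a {len(prompt)}-character prompt]"


# --- Retrieval side: runs on the host CPU (Xeon) ---------------------------
documents = [
    "Xeon 6 ships E-core SKUs first, with P-core SKUs following later.",
    "A Gaudi 3 AI kit with eight accelerators lists at $125,000.",
]
doc_vecs = embed(documents)
doc_vecs /= np.linalg.norm(doc_vecs, axis=1, keepdims=True)


def retrieve(query: str, k: int = 1) -> list[str]:
    q = embed([query])[0]
    q /= np.linalg.norm(q)
    scores = doc_vecs @ q                       # cosine similarity
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]


# --- Generation side: runs on the accelerator -------------------------------
question = "How much does a Gaudi 3 kit cost?"
context = "\n".join(retrieve(question))
print(generate_on_accelerator(f"Context:\n{context}\n\nQuestion: {question}"))
```

Keeping the embedding store and similarity search on general-purpose CPUs, as Nanduri suggests, leaves the accelerator free for the part of the pipeline that actually needs it: token generation.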
“Intel is one of the only companies in the world innovating across the full spectrum of the AI market opportunity—from semiconductor manufacturing to PC, network, edge and data center systems,” Pat Gelsinger, Intel’s chief executive, remarks in a statement. “Our latest Xeon, Gaudi and Core Ultra platforms, combined with the power of our hardware and software ecosystem, is delivering the flexible, secure, sustainable and cost-effective solutions our customers need to maximize the immense opportunities ahead.”
Author: Ken Yeung
Source: Venturebeat
Reviewed By: Editorial Team