
AMD CEO: CPUs Equipped with Xilinx AI Engines in 2023

This article is part of TechXchange: Artificial intelligence on the edge.

AMD said it plans to integrate Xilinx’s differentiated IP into future generations of CPUs in an effort to better compete with Intel and NVIDIA in the data center and even in the embedded market.

During AMD’s second-quarter conference call with analysts, CEO Lisa Su said the Santa Clara, California-based company will integrate Xilinx artificial-intelligence (AI) engines across its lineup of CPUs to enhance their AI inference capabilities. The first chips in the lineup are scheduled to be released in 2023.

The move is the latest step in AMD’s ambition to become a more broad-based semiconductor giant, following a $35 billion deal to buy Xilinx, which became part of the company earlier this year. Su said Xilinx helps diversify its offerings with a wide range of computing engines, such as those suited to artificial intelligence or data-center networking, and gives it a long list of customers to sell to.

“We now have the best suite of high-performance adaptive computing engines in the industry, and we see opportunities to leverage our broad range of technologies to deliver even more powerful products,” she said.

AI Engines

The AI engines already ship in Xilinx’s Versal family of Adaptive Compute Acceleration Platforms (ACAPs), which take on the likes of Intel, Marvell, and NVIDIA in markets such as cloud and networking servers.

The AI engines are also built into chips designed specifically to power real-time workloads such as image recognition in embedded and edge devices, from automobiles and industrial and medical robots to aerospace and defense systems such as satellites.

The Versal series combines scalar engines (Arm CPU cores), intelligent engines (AI accelerators and DSP blocks), and the same type of programmable logic at the heart of Xilinx’s FPGAs. A programmable network on chip (NoC) ties everything together on a single system-on-chip (SoC), with various hard blocks for connectivity (PCIe and CCIX), networking (Ethernet), memory, security, and I/O.

Xilinx’s Vivado ML software stack is used to reconfigure the programmable logic in Versal chips, while the Vitis and Vitis AI development tools are used to build and deploy software that runs on the Versal platform.
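To give a sense of what that software flow looks like in practice, the sketch below shows how an application might dispatch an inference job to a compiled model through the Vitis AI runtime (VART) Python API. It is a minimal, illustrative example only, not an official Xilinx workflow: the model file name, the int8 buffer types, and the argmax post-processing are assumptions for the sake of the sketch.

```python
# Illustrative sketch: running a compiled model on a Xilinx DPU via the
# Vitis AI runtime (VART). The model file "resnet50.xmodel" and the int8
# buffer layout are assumptions for this example.
import numpy as np
import vart
import xir

# Load the compiled model graph and find the DPU subgraph.
graph = xir.Graph.deserialize("resnet50.xmodel")
subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
dpu_subgraph = next(
    s for s in subgraphs
    if s.has_attr("device") and s.get_attr("device").upper() == "DPU"
)

# Create a runner bound to the DPU subgraph.
runner = vart.Runner.create_runner(dpu_subgraph, "run")

# Allocate input/output buffers matching the model's tensor shapes.
input_tensor = runner.get_input_tensors()[0]
output_tensor = runner.get_output_tensors()[0]
input_data = [np.zeros(tuple(input_tensor.dims), dtype=np.int8)]
output_data = [np.zeros(tuple(output_tensor.dims), dtype=np.int8)]

# Submit the inference job asynchronously and wait for it to finish.
job_id = runner.execute_async(input_data, output_data)
runner.wait(job_id)
print("Top prediction index:", int(np.argmax(output_data[0])))
```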

“We have this AI engine already rolled out into production in a number of embedded applications and high-end devices,” said Victor Peng, former CEO of Xilinx and head of AMD’s Adaptive and Embedded Computing business. “The architecture itself can be scaled and fed into the CPU product family.”

He said the combined company is building a unified software suite to help customers leverage the AI prowess of Xilinx and AMD chips for inference and training in data centers and on the edge.

Su added that the Xilinx deal gives the semiconductor giant a “much broader range of offerings” in the market for AI hardware, specifically on the inference side, complementing the AI acceleration that its CPUs and GPUs provide for data centers. “Artificial intelligence is a great opportunity for us.”

Gaining Ground

Business is booming at AMD. It gained a performance advantage over Intel with newer generations of CPUs, in part by contracting production to TSMC, giving it access to process technologies years ahead of those in Intel’s fabs.

Overall, the company reported $5.9 billion in revenue in the second quarter, up 71% from the same quarter last year. Su noted that each of its businesses grew at a double-digit rate in the last quarter.

The company has regained some of its luster with investors in recent years as it captured market share in CPUs and GPUs for PCs as well as video game consoles, including Microsoft’s Xbox Series X, Sony’s PlayStation 5, and Valve’s Steam Deck.

AMD is also gaining ground in the data center market, where Intel has long dominated. Industry analysts say it has been taking market share from Intel as its biggest competitor works to regain its chip-making prowess.

According to AMD, sales of its server processors more than doubled in eight of the past 10 quarters, highlighting increased demand for its EPYC CPUs from customers in the enterprise, cloud, and high-performance computing (HPC) markets.

The company’s sales to the cloud computing market also rose as tech giants such as Amazon, Google and Microsoft in the US and Alibaba and Baidu in China ramped up their investments in server hardware.

As Intel struggles to transition to more advanced technology nodes, AMD is throwing its weight behind TSMC and other third-party foundries to boost production of its most advanced chips for PCs and the data center. The company is also preparing to roll out its next generation of server CPUs, code-named Genoa, later in the year, and it hopes the upgrade will help it gain more market share from Intel.

The prospect of strong demand for server processors plus the addition of Xilinx has prompted the company to raise its full-year revenue growth forecast to 61%. It had previously forecast sales to rise 31% in 2022.

Su said the company’s core business still lies in central processors and graphics chips used in laptops, desktops, and the data center. But Xilinx’s families of programmable chips give it “a lot of levers to grow” now and into the future.

Diversification Plan

One of its broader strategies recently has been to bundle a wide variety of chips that can be tightly linked together to improve performance and energy efficiency for customers with a wide range of computing needs.

She explained, “What we see in terms of future growth is that there will be more customization around solutions for big customers, whether it’s cloud companies, big telecom companies, or even edge opportunities.”

To pursue its diversification strategy, AMD is also buying network chip startup Pensando Systems in a $1.9 billion deal. The acquisition will add a family of data processing units (DPUs) and a software suite, giving AMD a wide range of storage, security, and networking technologies suited to the data center.

“The whole strategy behind AMD is to get the best computing engines and then put them together into solutions for specific end markets,” Su said. “I suspect [having] CPUs, GPUs, FPGAs, adaptive SoCs and then the DPUs that we add from Pensando give us a massive array of capabilities.”

Su added, “Having all of these computing engines will allow us to fundamentally improve these solutions together.”

The company still faces questions about whether it can continue to compete with Intel and implement its strategy given ongoing supply chain disruption and capacity constraints at TSMC and other foundries.

AMD executives said the company is working through supply challenges. “We’re working with AMD’s larger scale to try to bring in more supply and continue to increase overall capacity to support the next few very strong quarters,” said Su.

This article is part of TechXchange: Artificial intelligence on the edge.