Microsoft is rolling out custom chips that improve data center security and energy efficiency

At its Ignite developer conference today, Microsoft announced two new chips designed for data center infrastructure: the Azure Integrated HSM and the Azure Boost DPU.

Scheduled for release in the coming months, these specially designed chips are intended to fill security and performance gaps in existing data centers, further optimizing their servers for large-scale AI workloads. The announcement follows the launch of Microsoft’s Maia AI accelerators and Cobalt processors, marking another important step in the company’s comprehensive strategy to rethink and optimize every layer of the stack – from silicon to software – to support advanced AI.


The Satya Nadella-led company also detailed new approaches to managing energy consumption and heat emissions in data centers, as many continue to raise alarm over the environmental impact of AI-enabled data centers.

Recently, Goldman Sachs published research estimating that advanced AI workloads will increase data center energy demand by 160% by 2030, with data center facilities expected to consume 3-4% of global energy by the end of the decade.

New chips

While continuing to use industry-leading hardware from companies like Nvidia and AMD, Microsoft is pushing the envelope with its custom chips.

Last year at Ignite, the company made headlines with its Azure Maia AI accelerator, optimized for AI and generative AI workloads, and the Azure Cobalt CPU, an Arm-based processor geared toward general-purpose computing workloads in the Microsoft Cloud.

Now, as the next step in this journey, it has expanded its custom silicon offerings with a special focus on security and performance.

The new in-house security chip, Azure Integrated HSM, features a dedicated hardware security module designed to meet FIPS 140-3 Level 3 security standards.

According to Omar Khan, vice president of Azure infrastructure marketing, the module fundamentally improves key management, keeping encryption and signing keys secure within the boundaries of the chip without compromising performance or increasing latency.

To achieve this, Azure Integrated HSM uses specialized hardware cryptographic accelerators that enable secure and efficient cryptographic operations directly in a physically isolated on-chip environment. Unlike traditional HSM architectures that require network connections or key extraction, the chip performs encryption, decryption, signing, and verification operations entirely within its dedicated hardware.
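The core idea of that design can be sketched in a few lines. The model below is purely illustrative and is not Microsoft's API: it shows the general HSM pattern in which callers receive only opaque key handles and request operations, while the key material itself never crosses the module boundary. HMAC-SHA256 stands in here for the chip's actual hardware-accelerated algorithms.

```python
import hashlib
import hmac
import os


class HsmSketch:
    """Illustrative HSM model (hypothetical, not Azure Integrated HSM's API).

    Keys live only inside the module; callers hold opaque handles and ask
    the module to sign/verify, so key bytes never leave the boundary.
    """

    def __init__(self):
        self._keys = {}  # handle -> key bytes, private to the "chip"

    def generate_key(self) -> str:
        handle = os.urandom(8).hex()
        self._keys[handle] = os.urandom(32)
        return handle  # only the handle crosses the boundary

    def sign(self, handle: str, message: bytes) -> bytes:
        # Operation happens "inside" the module, using the held key.
        return hmac.new(self._keys[handle], message, hashlib.sha256).digest()

    def verify(self, handle: str, message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(handle, message), signature)


hsm = HsmSketch()
handle = hsm.generate_key()
sig = hsm.sign(handle, b"payload")
print(hsm.verify(handle, b"payload", sig))  # True
```

In a real deployment this interface would typically be exposed through a standard such as PKCS#11, with the hardware enforcing that keys are non-exportable.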

While the integrated HSM paves the way for better data protection, the Azure Boost DPU (data processing unit) optimizes data centers for highly multiplexed data streams, such as hundreds of thousands of network connections, with a focus on energy efficiency.

Azure Boost DPU, Microsoft’s new in-house data processing engine

Microsoft’s offering, a first in this category, complements CPUs and GPUs by combining many of the components of a traditional server on a single piece of silicon – from fast Ethernet and PCIe interfaces to networking and storage engines, data accelerators and security features.

It relies on sophisticated hardware-software co-design, in which a custom, lightweight data-flow operating system delivers higher performance, lower power consumption, and greater efficiency compared with traditional implementations.

Microsoft expects the chip to handle cloud storage workloads at three times less power and four times the performance compared with existing CPU-based servers.
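Taken together, those two stated figures imply roughly a twelvefold gain in performance per watt. A back-of-envelope check (using normalized baseline values, not measured numbers):

```python
# Normalize the CPU-based baseline to 1 unit of power and 1 unit of performance.
baseline_power = 1.0
baseline_perf = 1.0

# Microsoft's stated figures: "three times less power", "four times the performance".
dpu_power = baseline_power / 3
dpu_perf = baseline_perf * 4

# Performance-per-watt improvement relative to the baseline.
gain = (dpu_perf / dpu_power) / (baseline_perf / baseline_power)
print(gain)  # 12.0
```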

A new approach to cooling and power optimization

In addition to the new chips, Microsoft also shared progress on improving data center cooling and optimizing power consumption.

For cooling, the company announced an advanced version of its heat exchanger unit – a rack that supports liquid cooling. It didn’t share specific performance figures for the technology, but noted that it can be installed in Azure data centers to manage heat emissions from large-scale AI systems built on power-hungry AI accelerators and GPUs, such as those from Nvidia.

Liquid-cooled heat exchanger assembly for efficient cooling of large-scale AI systems

On the power management front, the company said it has collaborated with Meta on a new disaggregated power rack that aims to increase flexibility and scalability.

“Each disaggregated power rack will be powered by 400V DC, enabling up to 35% more AI accelerators in each server rack, enabling dynamic power adjustments to meet varying AI workload requirements,” Khan wrote in a blog post.

Microsoft is open-sourcing the cooling and power rack specifications to the industry through the Open Compute Project. As for the new chips, the company said it plans to install Azure Integrated HSMs in every new data center server starting next year. The DPU rollout timeline, however, remains unclear at this stage.

Microsoft Ignite runs from November 19 to 22, 2024.
