IBM unveils Power11 chips for faster AI inference with 30-second downtime

IBM has launched its new Power11 chips and servers, marking the company’s first major update to its Power processor line since 2020. The systems are designed to simplify AI deployment for businesses while offering enhanced power efficiency, security, and reliability compared to competitors like Intel and AMD.

What you should know: The Power11 systems target specialized sectors including financial services, manufacturing, and healthcare with integrated hardware and software packages.

  • Available starting July 25, the systems promise virtually no planned downtime for software updates and average just over 30 seconds of unplanned downtime annually.
  • The servers can detect and respond to ransomware attacks within one minute, addressing a critical security concern for enterprise customers.
  • IBM plans to integrate Power11 with its Spyre AI chip in Q4 2025, focusing specifically on AI inference rather than training capabilities.

The big picture: IBM is positioning itself as a simplified alternative to Nvidia’s AI infrastructure, emphasizing ease of deployment over raw computational power for AI training.

What they’re saying: Tom McPherson, general manager of Power systems at IBM, explained the company’s strategic focus on practical AI implementation.

  • “We can integrate AI capabilities seamlessly into this for inference acceleration and help their business process improvements,” McPherson said regarding work with early customers.
  • “It’s not going to have all the horsepower for training or anything, but it’s going to have really good inferencing capabilities that are simple to integrate.”

In plain English: AI inference is like using a calculator that’s already been programmed—you input data and get results for daily business tasks. AI training, by contrast, is like teaching that calculator how to do math in the first place, which requires massive computing power that IBM isn’t trying to compete with.

Why this matters: IBM’s approach targets the growing demand for AI inference workloads in enterprise environments, where businesses need to deploy trained AI models for daily operations rather than develop new AI systems from scratch.

Competitive landscape: Unlike Nvidia’s focus on AI training infrastructure, IBM’s Power systems compete directly with Intel and AMD in traditional data center markets while adding AI inference capabilities as a differentiator.

