Red Hat Summit 2025: Execs Tout Opportunities In Open-Source AI, Virtualization Migration

‘AI can unlock human and business potential the same way open source did,’ says Red Hat CEO Matt Hicks.

Executives at Red Hat, IBM’s open-source enterprise software subsidiary, said that openly developed, community-driven projects will help drive the adoption of artificial intelligence, while also touting the vendor’s standing as a virtualization alternative for VMware users, an opportunity in which Red Hat solution providers play a role.

The executives shared this vision during keynote speeches at Red Hat’s annual Summit conference, which runs through Thursday in Boston.

Matt Hicks (pictured), CEO of Raleigh, N.C.-based Red Hat, used his Red Hat Summit 2025 keynote to cast AI as another step in open source’s promise to democratize technology regardless of a user’s budget, skill level or location.

“Open source at its core removed barriers,” Hicks said. “It just unlocked human potential worldwide. … AI can unlock human and business potential the same way open source did. Because, at its core, AI removes barriers.”

[RELATED: Red Hat Launches RHEL 10 With New Capabilities For Hybrid Cloud And AI Systems]

Red Hat Summit 2025

About 80 percent of Red Hat’s revenue comes from channel and alliance partners, according to CRN’s 2025 Channel Chiefs. The vendor’s channel goals this year include increasing the overall percentage of company revenue that comes through the channel, enabling partners to develop an AI strategy and sell AI solutions, and improving partner profitability.

Hicks said the company’s portfolio can help users build a bridge to the AI era without abandoning existing applications, instead leveraging open-source AI tools to improve deployment and maintenance and free up employees for more complex work.

“While we might be in this moment of uncertainty between worlds, it’s up to us with that same bold spirit and principles that realized open source to start building these bridges,” he said.

Red Hat’s Virtualization Opportunity

Other Red Hat executives spoke of the opportunity in moving customers off legacy virtualization products and onto OpenShift, although the only mention of VMware by Broadcom by name during the keynote came from Phil Guido, executive vice president and chief commercial officer of chipmaker AMD, who said that moving from legacy VMware to modern Red Hat OpenShift on AMD can yield operational expenditure savings of up to 77 percent and cut energy consumption and power by more than 71 percent.

Ashesh Badani, Red Hat senior vice president and chief product officer, told the Summit 2025 crowd that demand for the company’s virtualization and hybrid cloud wares “has been overwhelming.” OpenShift Virtualization has nearly tripled its customer count, with the number of clusters deployed in production more than doubling and the number of virtual machines managed by the offering more than tripling.

“Last year, whether you liked it or not, most of you were given a virtualization price increase that you didn’t ask for,” Badani said. “You faced a virtualization future that became very uncertain. We told you that Red Hat wanted to be your virtualization and hybrid cloud provider for the future. … We’re reaching every corner of the world.”

Badani laid out the virtualization-related improvements in OpenShift 4.18 meant to meet the moment, including better networking, storage migration and VM management.

“The rate and pace of change of technology is absolutely amazing and unrelenting,” he said. “But one characteristic that’s remained consistent is that the open solution has ultimately prevailed. With operating systems in Linux, open won. With containers, Kubernetes and hybrid cloud, open won. And with virtualization and AI, we’re on track to have open win. When open wins, we all win. The open foundation we’ve laid for the hybrid cloud will be the underpinning of openness for AI.”

Red Hat’s ecosystem of partners is critical in helping customers deploy a full virtualization architecture that is sustainable for the future, Stefanie Chiras, Red Hat’s channel chief, whose full title is senior vice president for partner ecosystem success, told the crowd during her keynote.

“OpenShift Virtualization brings the flexibility of running both virtual machines and containers so you can modernize at your rate and pace without changing the platform,” Chiras said. “Our partner ecosystem brings flexibility to connect those VMs and containers to data, the edge, the rest of your data center, and to clouds in a very secure and resilient way.”

Some of Red Hat’s partner news from Summit 2025 included a public preview of Red Hat OpenShift Virtualization on Microsoft Azure Red Hat OpenShift.

“OpenShift Virtualization is now supported on-prem and other major cloud providers,” Chiras said. “We put you in control of your virtualization future.”

Open-Source AI Models, Inferencing

Open AI models, including the DeepSeek model that rocked markets by showing a potentially more cost-efficient way to bring AI into production, will remain competitive, Chris Wright, Red Hat CTO and senior vice president of global engineering, said during his speech.

He also pointed out the role of Anthropic’s open Model Context Protocol (MCP) standard in bringing the AI agent era to life—with Microsoft and other tech giants revealing MCP support in recent days.
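For readers unfamiliar with MCP, a minimal tool server built with the protocol’s open-source Python SDK looks roughly like the sketch below; the server name and the simple `add` tool are illustrative assumptions, not anything Red Hat or Wright demonstrated on stage.

```python
# A minimal MCP tool server sketch using the open-source Python SDK's FastMCP helper.
# The server name and tool are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers so an MCP-capable AI agent can call this as a tool."""
    return a + b

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio to an MCP host or agent
```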

Wright said that enterprises will want freedom to select model sizes, align models to business data, choose where to run workloads and which hardware accelerators to employ without fear of vendor lock-in.

“Openness leads to flexibility, and flexibility leads to choice,” he said. “And that’s what enterprises want: choice. … While it feels like everything around you is changing, it’s always good to have something stable to stand on. Red Hat AI is the stable foundation for your enterprise AI.”

With 2025 as potentially the year of AI inferencing, Red Hat is helping lead the way with contributions to the vLLM open-source project, which now sees about 500,000 downloads a week, said Brian Stevens, Red Hat senior vice president and AI CTO and former CEO of Neural Magic, which Red Hat acquired in 2024.

During Summit 2025, Stevens introduced the LLM-D open-source project, which is meant to take vLLM beyond its single-server limitations to distributed inference at scale for production. Users can rely on Kubernetes to integrate those inference capabilities into existing IT fabrics, he said.
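For illustration, the single-server vLLM baseline that LLM-D is described as scaling out can be exercised with a short offline-inference sketch like the one below; the model ID and prompt are placeholders, not anything shown at the keynote.

```python
# Minimal single-server vLLM inference, the baseline LLM-D is described as scaling out.
# The model ID is a placeholder, not one named at Summit 2025.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # any Hugging Face causal LM works here
params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["Summarize Red Hat Summit 2025 in one sentence."], params)
for request_output in outputs:
    print(request_output.outputs[0].text)
```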

Stevens also touted the vendor’s work on quantization of AI models, making them smaller while retaining accuracy. Those optimized models see about 1 million downloads per month.
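As a rough sketch of how such optimized models are typically consumed, a pre-quantized checkpoint can be loaded through vLLM as shown below; the model ID is hypothetical and the AWQ scheme is an assumption, since the article does not name specific formats.

```python
# Loading a pre-quantized (compressed) model with vLLM. The model ID is a stand-in,
# not a checkpoint confirmed by Red Hat; AWQ is one of several schemes vLLM supports.
from vllm import LLM, SamplingParams

llm = LLM(
    model="org/example-llama-awq",  # hypothetical AWQ-quantized checkpoint
    quantization="awq",             # tell vLLM which quantization scheme to expect
)
print(llm.generate(["Hello"], SamplingParams(max_tokens=16))[0].outputs[0].text)
```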