
SUSE, Nvidia Launch AI Infrastructure Stack for Enterprises





Germany-based open-source software provider SUSE has launched an AI infrastructure stack, in partnership with Nvidia, to help enterprises build, deploy, and scale AI workloads across data centers, edge, and cloud environments while maintaining control over data and infrastructure.

The two offerings are SUSE AI Factory and SUSE AI Factory with Nvidia. The former provides an automated software stack built on SUSE Rancher Prime for deploying and managing AI workloads. The Nvidia-enabled version integrates Nvidia AI Enterprise components, including NIM microservices, Nemotron models, and Nvidia Run:ai for GPU orchestration, as part of the overall software stack.

The company said the move was driven by growing enterprise demand for on-premises and hybrid AI infrastructure, as organizations move from proof of concept to production. Developers can build and test applications in sandbox environments, while platform teams manage deployment using a unified interface or GitOps workflows. The company said this reflects a broader shift toward controlled environments, as enterprises bring AI workloads closer to proprietary data.
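The GitOps workflow described above — desired state declared in a repository, with a controller reconciling the running environment toward it — can be sketched in miniature. This is a hypothetical illustration of the general pattern, not SUSE or Rancher code; the workload names and the dict-based "cluster state" are invented stand-ins for real manifests.

```python
# Hypothetical miniature of a GitOps reconcile loop. A plain dict stands in
# for manifests stored in Git (desired state) and for the live cluster
# (actual state); the controller computes the actions needed to converge.

def reconcile(desired: dict, actual: dict) -> dict:
    """Return the actions needed to move `actual` to `desired`."""
    actions = {}
    for name, spec in desired.items():
        if actual.get(name) != spec:
            actions[name] = ("apply", spec)   # create or update drifted workload
    for name in actual:
        if name not in desired:
            actions[name] = ("delete", None)  # prune workloads removed from Git
    return actions

# Invented example workloads, purely for illustration.
desired = {"inference-api": {"replicas": 3}, "vector-db": {"replicas": 1}}
actual = {"inference-api": {"replicas": 2}, "old-demo": {"replicas": 1}}

print(reconcile(desired, actual))
# → {'inference-api': ('apply', {'replicas': 3}),
#    'vector-db': ('apply', {'replicas': 1}),
#    'old-demo': ('delete', None)}
```

The appeal of the pattern for platform teams is that the repository, not the operator, is the source of truth: every change is a reviewable commit, and drift is corrected automatically.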

“We identified a missing link in creating an assembly line for organizations to onboard, provision, manage, observe, and secure artificial intelligence infrastructure,” said Rhys Oxenham, VP and general manager of AI at SUSE. He said SUSE AI Factory is designed as a “turnkey digital factory” to address gaps in innovation, deployment location, operations, and collaboration between platform and AI engineering teams.


For Nvidia, which is expanding its presence in the enterprise data center, the partnership supports deployment across on-premises, hybrid, and cloud environments as customers transition to production AI systems.

“Enterprise adoption of AI is accelerating, creating demand for infrastructure that ensures data control and governance for regulated workloads,” said John Fanelli, VP for Software Enterprise at Nvidia. He said many organizations begin AI development in the cloud but transition to hybrid or on-premises deployments as they scale, particularly to retain control over data.

John Fanelli, VP for Software Enterprise at Nvidia, and Thomas Di Giacomo, chief technology and product officer at SUSE, at SUSECON (Source: SUSE)

Simplifying deployment and operations

SUSE AI Factory includes pre-validated architectural blueprints for common workloads, allowing enterprises to build custom deployments using SUSE and Nvidia components. The platform incorporates zero-trust security and observability features and extends SUSE Rancher Prime and SUSE Linux Enterprise Server capabilities to AI workloads.

Fanelli said the companies have jointly developed blueprints to simplify deployment. “We have been building blueprints on top to simplify things for customers,” he said.

A unified interface supports deployment across environments, from local systems to air-gapped edge clusters, while maintaining lifecycle management consistency.

SUSE said it provides a single point of support across the full stack, including Nvidia AI Enterprise components, while continuing to build an ecosystem of partners. “Customers want a complete solution,” said Dirk-Peter van Leeuwen, CEO of SUSE, adding that partnerships across hardware, software, and services are central to the company’s approach.

Sovereignty, security, and open source

SUSE has repeatedly emphasized open source as a foundation for sovereignty. The company is introducing a sovereignty specialization under its partner program to support the deployment of sovereign-compliant infrastructure through local and global partners. This includes an ecosystem approach combining SUSE’s open-source stack, managed service providers, and global system integrators to operationalize sovereignty requirements.

Van Leeuwen described a move toward a third phase of AI adoption, in which companies integrate proprietary data into AI models and require sovereign infrastructure to support it.

“Customers need to maintain control over their data and decision-making,” van Leeuwen said. He said enterprises are increasingly moving toward private and hybrid infrastructure as they deploy AI using their own data.

This trend aligns with customer concerns around vendor lock-in and external dependencies. “If the ability to switch is not built into your architecture from the beginning, adapting later becomes difficult,” said Frank Feldman, chief strategy officer at SUSE.

SUSE positions its AI Factory platform around digital sovereignty, integrating Nvidia technologies into its open-source-based infrastructure stack. The company said organizations can keep sensitive data and models within private infrastructure while using Nvidia components as part of the overall system.

“AI developers, users, and operations teams are in a catch-22 with AI. They want to innovate quickly but must secure these workloads and processes to ensure full auditability before running them in production,” said Thomas Di Giacomo, chief technology and product officer at SUSE. He said SUSE focuses on delivering a supported, integrated stack while working with partners for specific technologies.

“Sovereignty has become non-negotiable for many customers,” Di Giacomo said, adding that customers require control, transparency, and auditability, particularly in regulated environments.

Addressing the mix of open-source and proprietary elements, Di Giacomo said Nvidia is increasingly embracing open source, and that SUSE relies on partners for components that are not open. He said the company focuses on integration, service-level agreements, and delivering a supported stack across the platform.

Oxenham said most components within Nvidia AI Enterprise are open source, including key elements such as GPU drivers and operators, while some parts remain proprietary.

SUSE said the platform integrates Nvidia technologies, including NIM microservices, Nemotron models, NeMo, Run:ai, Kubernetes Operators, OpenShell runtime, and NemoClaw, alongside SUSE’s K3s.

Broader availability of SUSE AI Factory and SUSE AI Factory with Nvidia is expected later this year. Executives said the focus will remain on enabling enterprises to deploy AI factories within their own environments, particularly for regulated and data-sensitive workloads.


See also:

SUSE Extends Single-Kernel Linux Strategy from Edge to Data Center

SUSE Launches Industrial Edge Platform Following Losant Acquisition
