Nvidia to open-source Run:ai, the software it acquired for $700M to help companies manage GPUs for AI


Nvidia is acquiring Run.AI.

Image Credit: Run.AI



Nvidia has completed its acquisition of Run:ai, a software company that makes it easier for customers to orchestrate GPU clouds for AI, and said that it would open-source the software.

The purchase price wasn’t disclosed, but reports pegged it at $700 million when Nvidia first announced its intent to acquire the company in April. Run:ai posted news of the completed deal on its website today and said that Nvidia plans to open-source the software, which remotely schedules Nvidia GPU resources for AI in the cloud.

Neither company explained why Run:ai’s platform will be open-sourced, but it’s not hard to figure out. Since Nvidia has grown to be the number one maker of AI chips, its market capitalization has soared to $3.56 trillion, making it the most valuable company in the world. That’s great for Nvidia, but it makes acquisitions harder because of antitrust oversight.

A spokesperson for Nvidia said in a statement only that “We’re delighted to welcome the Run:ai team to Nvidia.”

When Microsoft acquired Activision Blizzard for $68.7 billion, it appeased antitrust regulators by licensing Activision’s Call of Duty franchise to other platforms for a decade, addressing worries that the company would become too powerful in gaming. Something similar may be happening here.

Run:ai founders Omri Geller and Ronen Dar said in a press release that open-sourcing the software will help the community build better AI, faster.

“While Run:ai currently supports only Nvidia GPUs, open-sourcing the software will enable it to extend its availability to the entire AI ecosystem,” Geller and Dar said.

They said they will continue to help their customers get the most out of their AI infrastructure and to offer the ecosystem maximum flexibility, efficiency and utilization for GPU systems, wherever they are: on-premises, in the cloud through native solutions, or on Nvidia DGX Cloud, co-engineered with leading cloud service providers.

The founders also said, “True to our open-platform philosophy, as part of Nvidia, we will keep empowering AI teams with the freedom to choose the tools, platforms, and frameworks that best suit their needs. We will continue to strengthen our partnerships and work alongside the ecosystem to deliver a wide variety of AI solutions and platform choices.”

