26.05.2023

Nvidia joins with Dell to target on-prem generative AI

Nvidia and Dell have joined forces on Project Helix, a collaborative initiative targeting on-premises generative AI. Announced on Tuesday, the partnership aims to help enterprises build and manage generative AI models within their own infrastructure.

The companies intend to integrate their hardware and software infrastructure to provide comprehensive support for the entire generative AI lifecycle. This includes infrastructure provisioning, modeling, training, fine-tuning, application development, deployment, inference, and result optimization, as outlined in their joint statement.

Dell will contribute its optimized PowerEdge servers, such as the PowerEdge XE9680 and PowerEdge R760xa, which are designed to deliver high performance for generative AI training and inferencing. Nvidia will supply its H100 Tensor Core GPUs and Nvidia Networking, forming the foundational infrastructure for generative AI workloads within Project Helix. Enterprises can also use Dell PowerScale and Dell ECS Enterprise Object Storage for efficient storage of unstructured data, according to the companies.

In terms of software, Project Helix will incorporate Nvidia’s AI Enterprise software suite, which encompasses the NeMo large language model framework and NeMo Guardrails software for building secure generative AI chatbots.
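NeMo Guardrails is distributed as an open-source Python package, and the sketch below shows roughly how a guarded chatbot might be assembled with it. This is a minimal, hypothetical example rather than anything from the Project Helix announcement; the configuration directory name, its contents, and the underlying model choice are all assumptions.

# Minimal sketch of a guarded chatbot built with the open-source
# nemoguardrails package. The ./helix_chatbot_config directory and its
# contents (model settings plus Colang flows) are assumed for illustration.
from nemoguardrails import LLMRails, RailsConfig

# Load the rails configuration: YAML model settings and Colang flows that
# describe which topics the assistant may and may not discuss.
config = RailsConfig.from_path("./helix_chatbot_config")

# Wrap the configured language model with the guardrails runtime.
rails = LLMRails(config)

# Each user turn passes through the input and output rails around the
# model call, so off-limits requests can be deflected.
response = rails.generate(messages=[
    {"role": "user", "content": "How do I reset my corporate VPN password?"}
])
print(response["content"])

In a Project Helix deployment, the same pattern would presumably sit on top of NeMo models served from the on-premises PowerEdge and H100 infrastructure described above.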

To ease adoption of Project Helix, Dell will offer Validated Designs, tested configurations tailored to specific use cases. The designs will be available through traditional channels starting in July 2023 under a flexible consumption model with on-demand, pay-per-use pricing.

Nvidia has been collaborating with technology companies including Oracle, Google Cloud, and ServiceNow to offer AI and generative AI application development services. In March, the chipmaker also announced plans to make its DGX Pods, the computing modules that power ChatGPT, available in the cloud.