VMware and NVIDIA have announced a partnership at VMworld 2020 to make AI chips more accessible for businesses.
The agreement will deliver an end-to-end enterprise platform for AI as well as a new architecture for data centers, cloud and edge systems that use NVIDIA’s data processing units (DPUs).
“We are partnering with NVIDIA to bring AI to every enterprise. A true democratisation of one of the most powerful technologies,” said Pat Gelsinger, the CEO of VMware.
As part of this partnership to accelerate enterprise AI adoption, the AI software available on NVIDIA’s NGC hub will be integrated into VMware vSphere and VMware Tanzu, as well as into virtual machines in a hybrid cloud based on VMware Cloud Foundation. This will enable businesses to manage all applications with a single set of operations and deploy AI infrastructure where the data resides.
VMware’s software tools will work smoothly with NVIDIA’s chips to run AI applications without ‘any kind of specialised setup,’ said Krish Prasad, the Head of VMware’s Cloud Platform Business Unit, during a press briefing.
VMware makes software that helps businesses get more work out of data center servers by slicing physical machines into “virtual” ones so that more applications can be packed onto each physical machine. Its tools are commonly used by large businesses that operate their own data centers as well as businesses that use cloud computing data centers.
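To illustrate the consolidation idea in the simplest possible terms, the sketch below is a toy Python example, not VMware’s actual placement logic, and the host capacity and workload figures are made up. It treats packing workloads onto as few physical hosts as possible as a first-fit assignment:

    # Toy first-fit packing: consolidating workloads onto shared physical hosts
    # reduces the number of machines needed. All numbers are hypothetical.
    def pack_workloads(demands, host_capacity):
        """Assign each workload (CPU cores requested) to the first host with room."""
        hosts = []  # remaining capacity of each provisioned host
        for demand in demands:
            for i, remaining in enumerate(hosts):
                if demand <= remaining:
                    hosts[i] -= demand
                    break
            else:
                hosts.append(host_capacity - demand)  # provision a new host
        return len(hosts)

    workloads = [4, 2, 8, 1, 6, 3, 5, 2]  # cores requested per application
    print(pack_workloads(workloads, host_capacity=16))  # 2 hosts instead of 8

Real scheduling is far more sophisticated, but the payoff is the same: fewer physical machines running more applications.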
By uniting accelerated computing powered by NVIDIA’s solutions and virtualisation from VMware, users will be able to run data analytics and machine learning workloads in containers or virtual machines.
Data scientists, developers and researchers will gain immediate access to the wide array of NGC’s cloud-native, GPU-optimized containers, models and industry-specific software development kits.
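As a minimal sketch of what that access can look like in practice, and assuming PyTorch is installed (for example from an NGC GPU-optimized container image), a quick check that a container or virtual machine can actually see and use its GPUs might look like this. The environment details are assumptions rather than specifics from the announcement:

    # Sanity check that an accelerated container or VM can see its GPUs.
    # Assumes PyTorch is available, e.g. from an NGC GPU-optimized image.
    import torch

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
        # Run a small computation on the GPU to confirm it works end to end.
        x = torch.randn(1024, 1024, device="cuda")
        print("Matmul OK, norm:", torch.linalg.norm(x @ x).item())
    else:
        print("No CUDA device visible; check GPU passthrough / vGPU configuration.")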
“AI and machine learning have quickly expanded from research labs to data centers in companies across virtually every industry and geography,” said Jensen Huang, the Founder and CEO of NVIDIA.
Machine learning enables computing systems to write software and code at a speed that is difficult for humans to replicate. This capability is rapidly spreading to data centers and becoming commercialised.
“NVIDIA DPUs will give companies the ability to build secure, programmable, software-defined data centers that can accelerate all enterprise applications at exceptional value,” added Mr. Huang.
NVIDIA’s DPUs are said to ‘pack the power of a data center infrastructure on a chip’, and will now be available for millions of virtualised servers thanks to the collaboration between VMware and NVIDIA.
This will bring advances in security and storage as well as networking that will stretch from the core to the edge of the corporate network.
“It’s the Swiss Army knife of data center infrastructure that can accelerate security, storage, networking, and management tasks, freeing up CPUs to focus on enterprise applications,” said NVIDIA in a blog post.
As a result, data centers can handle more applications while speeding up their networks.
The partnership will also let VMware users train and run neural networks across multiple GPUs in public and private clouds. It also will enable them to share a single GPU across multiple jobs or users thanks to the multi-instance capabilities in the latest NVIDIA A100 GPUs.
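A minimal sketch of the multi-GPU side of that workflow, using plain PyTorch data parallelism rather than any VMware-specific tooling (the model and batch below are placeholders), could look like this:

    # Hedged sketch: data-parallel training across whatever GPUs are visible,
    # whether full A100s or MIG slices exposed to the VM or container.
    import torch
    import torch.nn as nn

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

    # DataParallel replicates the model across all visible GPUs and splits each batch.
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)
    model = model.to(device)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Placeholder batch; real training would iterate over a DataLoader.
    inputs = torch.randn(256, 128, device=device)
    targets = torch.randint(0, 10, (256,), device=device)

    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print("Step done, loss:", loss.item())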
“We’re providing the best of both worlds by bringing mature management capabilities to bare-metal systems and great performance to virtualised AI workloads,” said Kit Colbert, the Vice President and CTO of VMware’s Cloud Platform Group.
In recent years, as businesses have turned to AI for everything from speech recognition to recognizing patterns in financial data, NVIDIA’s market share in data centers has been expanding because its chips are used to speed up the processing.
“As much as people may think of NVIDIA as a hardware company, we are more so a software company today. There is some very important computer science that has been done between the VMware and NVIDIA teams to enable this,” said Manuvir Das, the Head of Enterprise Computing at NVIDIA.
Saving lives with AI
As AI becomes more accessible, it now has the ability to save lives.
“I can’t imagine a more impactful use of AI than healthcare. The intersection of people, disease and treatments is one of the greatest challenges of humanity, and one where AI will be needed to move the needle,” said Mr. Huang.
Among the organisations integrating their VMware and NVIDIA ecosystems is the UCSF Center for Intelligent Imaging, a leader in developing AI and analysis tools for medical imaging. The Center is using the NVIDIA Clara healthcare application framework for AI-powered imaging, as well as VMware Cloud Foundation to support a broad range of mission-critical workloads.
“AI can be used to detect disease in large patient imaging studies more rapidly than the human eye, and, with further research, this technology will enable doctors to provide the fastest, most accurate and safest diagnoses and treatments for patients,” said Christopher Hess, the Chair of Radiology and Biomedical Imaging at UCSF.
The center provides the University of California San Francisco community and academic and industry partners a critical resource for discovering, innovating and adopting AI to improve patient care.
“Bringing our NVIDIA Clara AI application frameworks and VMware Cloud Foundation together will help us expand our work in AI using a common data center infrastructure for activities such as training and research, and to help support time-sensitive urgent care diagnostics,” added Mr. Hess.
Delivering new hybrid cloud architectures
VMware and NVIDIA are also collaborating to define a new architecture for the hybrid cloud, purpose-built for the demands of AI, machine learning, and high-throughput, data-centric applications.
Announced as part of Project Monterey, a mission to redefine hybrid cloud architecture, the new design is based on a network interface card known as a SmartNIC, which includes NVIDIA’s programmable BlueField-2 DPU.
“The BlueField-2 SmartNIC is a fundamental building block for us because we can take advantage of its DPU hardware for better network performance and dramatically reduced cost to operate data center infrastructure,” said Mr. Colbert.
With the SmartNIC card, the combination of VMware Cloud Foundation and NVIDIA BlueField-2 will deliver expanded application acceleration beyond AI to all enterprise workloads. It will also provide an extra layer of security through a new architecture that offloads critical data center services from the CPU to SmartNICs and programmable DPUs.
“These days the traditional security perimeter is gone. So, we believe you need to root security in the hardware of the SmartNIC to monitor servers and network traffic very fast and without performance impacts,” said Mr. Colbert.
VMware and NVIDIA will give some customers early access to the technology but did not say when it would go on sale.
Prior to this partnership, NVIDIA helped Oracle become the ‘first major cloud provider’ to make the NVIDIA A100, the world’s most powerful GPU, generally available on bare metal instances. And VMware partnered with Equinix to offer its SD-WAN Edge on Equinix Network Edge as a virtual network function.