Nvidia GTC: DDN Unveils Next Gen of Reference Architectures for Nvidia DGX BasePOD and DGX SuperPOD
AI storage solutions streamline and simplify on-premises, hybrid and public cloud infrastructures.
This is a Press Release edited by StorageNewsletter.com on September 30, 2022, at 2:02 pm.
At Nvidia GTC, DDN (DataDirect Networks, Inc.) announced its next gen of reference architectures for Nvidia DGX BasePOD and Nvidia DGX SuperPOD.
These AI-enabled storage solutions enhance the company’s position in enterprise digital transformation at scale, while making the deployment and management of systems of all sizes 10x simpler, from proof of concept through production and expansion.
The firm deployed more than 2.5EB of AI storage in 2021 and supports thousands of Nvidia DGX systems around the world. Its systems perform in the most demanding production AI environments, from autonomous vehicles to finance, natural language processing, climate modeling, and drug discovery at massive scale. In addition, the firm has enabled smaller enterprise IT organizations to achieve success in their AI digital transformation initiatives.
“Our close technical and business collaboration with Nvidia is enabling enterprises WW to maximize the performance of AI applications and simplify deployment for all,” said Dr. James Coomer, VP, products, DDN. “With this next gen of reference architectures, which include DDN’s A3I AI400X2, we’re delivering significant value to customers, accelerating enterprise digital transformation programs, and providing ease of management for the most demanding data-intensive workloads.”
“Organizations modernizing their business with AI need flexible, easy-to-deploy infrastructure to address their enterprise AI challenges at any scale,” said Tony Paikeday, senior director, AI systems, Nvidia Corp. “Expanding on Nvidia’s long collaboration with DDN, the next gen of DDN’s A3I built on Nvidia DGX BasePOD provides customers with integrated AI solutions to power the success of their most innovative work.”
The company’s A3I powered by DGX BasePOD is an evolution of what was previously known as DGX POD configurations. These configurations increase flexibility and customer choice, while maintaining users’ ability to start deployments at small scale and grow their DGX cluster over time. Additionally, the firm is collaborating with Nvidia on vertical-specific DGX BasePOD solutions tailored to financial services, healthcare and life sciences, and natural language processing. Customers using these DGX BasePOD configurations get not only integrated deployment and management but also software tools, including the Nvidia AI Enterprise software suite, tuned for their specific applications to speed developer success.
A3I AI400X2 is an all-NVMe appliance designed to help customers extract the most value from their AI and analytics data sources. It is proven in production at the largest scale and is among the most performant and efficient building blocks for AI infrastructures. Configurable as all-flash or hybrid, the appliance lets customers build efficient scale-out AI data pools tuned to their exact performance and capacity needs. Recent enhancements to the EXAScaler management framework have cut appliance deployment times from 8 minutes to under 50s. This approach gives organizations better resource allocation predictions and allows faster, simpler scaling, which leads to better ROI.
The DGX BasePOD reference architectures provide customers with a formula for acquiring, deploying and scaling AI infrastructure. With an environment optimized for AI workloads, organizations see faster time to insights and quicker ROI.
Standard configurations start as small as 2 DGX A100 systems and a single AI400X2 system, and can be as large as 100 DGX systems and 10 AI400X2 systems. Organizations can enter at any size, and the extensible model lets them scale as needed with simple building blocks, as sketched below. Backed by the company’s expertise in AI data management, along with Nvidia technology, integration and performance testing, customers can rest assured that they will get the fastest path to AI innovation.
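For illustration only, the following minimal Python sketch models this building-block sizing approach. The per-appliance and per-DGX throughput and capacity figures, and the helper names, are hypothetical placeholders and not published DDN or Nvidia specifications; actual sizing should follow the DDN/Nvidia reference architecture documents.

# Hypothetical sizing sketch for a DGX BasePOD-style building-block model.
# All per-unit figures are illustrative placeholders, NOT published
# DDN AI400X2 or Nvidia DGX A100 specifications.

from dataclasses import dataclass
from math import ceil


@dataclass
class BuildingBlock:
    name: str
    read_gbps: float   # assumed aggregate read throughput per appliance (GB/s)
    usable_tb: float   # assumed usable capacity per appliance (TB)


# Placeholder values chosen only to make the arithmetic concrete.
AI400X2_PLACEHOLDER = BuildingBlock("AI400X2 (placeholder figures)",
                                    read_gbps=80.0, usable_tb=250.0)


def appliances_needed(num_dgx: int,
                      per_dgx_read_gbps: float,
                      dataset_tb: float,
                      block: BuildingBlock = AI400X2_PLACEHOLDER) -> int:
    """Return how many storage building blocks cover both the throughput
    demanded by the DGX nodes and the capacity of the working dataset."""
    by_throughput = ceil(num_dgx * per_dgx_read_gbps / block.read_gbps)
    by_capacity = ceil(dataset_tb / block.usable_tb)
    return max(1, by_throughput, by_capacity)


if __name__ == "__main__":
    # Entry-level configuration mentioned in the release: 2 DGX systems.
    print(appliances_needed(num_dgx=2, per_dgx_read_gbps=8.0, dataset_tb=120.0))
    # Larger cluster: 100 DGX systems with a bigger working dataset.
    print(appliances_needed(num_dgx=100, per_dgx_read_gbps=8.0, dataset_tb=2000.0))

With these placeholder inputs the sketch returns 1 appliance for the 2-DGX entry point and 10 for a 100-DGX cluster, mirroring the configuration range described above; real deployments would substitute measured throughput and capacity requirements.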
AI drives continued growth for DDN and customers
For more than 20 years, the company has designed, developed, deployed, and optimized systems, software and storage solutions that enable enterprises, service providers, universities and government agencies to generate more value and to accelerate time to insight from their data and information, on premises and in the cloud. The firm has expanded R&D investment by 65% in the last 2 years, which puts it among industry leaders at 20% of revenue reinvested back into R&D. At the close of the first half, the company had 16% year-over-year growth, fueled by commercial enterprise demand for optimized infrastructure for AI.
DDN at Nvidia GTC
The company presented virtually at Nvidia GTC, a global conference on AI and the metaverse, with the session Selene and Beyond: Solutions for Successful SuperPODs. The session focused on how DGX SuperPOD users can best manage their infrastructure, even at extreme scales. It also explored various DGX SuperPOD use cases, including how Deloitte is working with DDN to deliver a strong foundation for developers to deploy robust self-driving technology.
Resources:
DDN reference architectures
Success Story: Nvidia Cambridge-1
Success Story: Nvidia Selene
Nvidia Blog covering DGX announcements at GTC