
Nvidia GTC: Pure Storage Platform for AI Expands with Nvidia OVX Server and Full-Stack AI Solutions

Achieved storage partner validation for OVX servers powered by L40S GPUs and FlashBlade//S.

By Calvin Nieh, senior technology, product and solutions marketing leader, Pure Storage, Inc.


Pure has achieved storage partner validation for its FlashBlade//S with Nvidia OVX servers powered by Nvidia L40S GPUs. Learn more about the validation and our AI-ready infrastructure solutions.


Summary
Pure has achieved storage partner validation for Nvidia OVX servers. The addition of OVX reference architectures to AIRI and FlashStack helps make Pure the go-to provider for enterprise AI-ready infrastructure solutions.

At Nvidia GTC, a global AI conference taking place in San Jose, CA, Pure Storage announced that its close collaboration with Nvidia Corp. has achieved storage partner validation for Nvidia OVX servers powered by Nvidia L40S GPUs and its FlashBlade//S. This new reference architecture validation gives enterprises greater GPU server choice and more immediate availability of proven AI infrastructure for fast, efficient small-model training, fine-tuning, and inference workloads.

The addition of validated Nvidia OVX reference architectures to our recent AIRI built on Nvidia DGX BasePOD and new FlashStack for AI solutions helps make Pure the go-to provider for enterprise AI-ready infrastructure solutions.

Validated enterprise infrastructure for GenAI RAG
Pure launched the original AIRI in 2018, and we've since provided customers with steady innovation through the introduction of FlashBlade//S (the storage foundation for AIRI), as well as Cisco Validated Designs with FlashStack for AI.

Our goal: to give customers the best options for full-stack, ready-to-run AI infrastructure.

Recent demand for new AI workloads like generative AI and retrieval-augmented generation (RAG), which customize large language models (LLMs) for specific domain or company needs, has elevated the need for efficient AI compute, storage, and networking solutions.
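To make the RAG pattern mentioned above concrete, here is a minimal toy sketch: retrieve the most relevant company document for a question, then build a grounded prompt for an LLM. The keyword-overlap scoring and the example documents are illustrative stand-ins (real RAG pipelines use vector-embedding search over enterprise data, not word matching), and none of the names below refer to any Pure or Nvidia product.

```python
# Toy retrieval-augmented generation (RAG) sketch.
# Step 1: retrieve the document most relevant to the question.
# Step 2: build a prompt that grounds the LLM's answer in that document.

def retrieve(question: str, documents: list[str]) -> str:
    """Return the document sharing the most words with the question.

    A real system would rank by embedding similarity in a vector database;
    simple word overlap keeps this sketch self-contained.
    """
    q_words = set(question.lower().split())
    return max(documents, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str, context: str) -> str:
    """Embed the retrieved, company-specific context into the LLM prompt."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical company documents the base LLM was never trained on.
docs = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am to 5pm, Monday through Friday.",
]
context = retrieve("What is the refund policy?", docs)
prompt = build_prompt("What is the refund policy?", context)
```

The key point of the pattern is that the model's answer is constrained to current, domain-specific data at query time, rather than requiring the LLM itself to be retrained.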

These new AI use cases require high-performance, highly efficient all-flash storage to maximize GPU utilization and AI productivity. Pure's storage platform for AI delivers the multimodal performance needed for the varied payloads in an AI data pipeline, with performance- and capacity-optimized DirectFlash technology. Our products require 80% less energy than alternatives, which frees power capacity for more GPUs per rack.

With the Pure storage platform for AI, customers are able to:

  1. Accelerate model training and inferencing
  2. Maximize operational efficiency
  3. Deliver cost and energy efficiency at scale
  4. Achieve ultimate reliability and future-proof AI storage

MLOps and vertical full-stack solutions
The difficulties in getting AI initiatives off the ground span multiple domains. Fundamental challenges include:

  • Lack of skilled people
  • Limited budgets
  • Legacy compute systems
  • Silos of legacy data storage that don’t keep powerful AI-optimized GPUs fully productive

Domain expertise beyond infrastructure and storage is needed around the use and deployment of MLOps applications such as Red Hat OpenShift, Weights & Biases, Run:ai, Ray, and Anyscale, which integrate with Nvidia AI Enterprise software, including Nvidia NeMo and the new Nvidia NIM and NeMo Retriever microservices. Some industries require vertical-specific applications, including financial services applications like KDB.AI, Raft, and Milvus. The healthcare and life sciences stack can include applications from Nvidia Clara, such as MONAI for medical imaging.

New validated full-stack solutions from Pure, including MLOps and vertical stacks, help fast-track IT and AI infrastructure teams with thoroughly tested, documented, and integrated solutions, so AI development and data science teams can get up and running faster and with less risk.

Future-proof storage for uncertain AI growth
The Pure storage platform for AI is future-proof. Companies can invest in AI infrastructure that grows over time, without fear of storage becoming outdated or failing to keep up with fast-growing AI and data science team needs. The company also offers Evergreen//One, a subscription-based consumption model that makes storage for AI more cloud-like while retaining the advantages of on-premises deployment, including continuous innovation, financial flexibility, and operational agility.

Resource:
Read Accelerate AI Adoption with the Pure Data Storage Platform to learn more about how Pure Storage can help you accelerate adoption of AI, as well as the benefits AI can bring to your business.
