FMS 2025: AIC to Showcase High-Performance AI Storage and Memory Solutions
Including an F2026 server configured with 26 ScaleFlux CSD 5000 NVMe SSDs and 4 NVIDIA BlueField-3 DPUs, the EB202-CP-LLM platform with Unigen's upcoming AI inference modules, and a PCIe Gen6 high-density AI storage solution with H3 Platform
This is a Press Release edited by StorageNewsletter.com on August 6, 2025 at 2:01 pm
AIC, Inc. will be exhibiting at FMS 2025 (Future of Memory and Storage), August 5–7 at the Santa Clara Convention Center, CA.
At booth #421, the company will showcase its latest server platforms and technology partnerships, built to power the future of AI data infrastructure.
AIC featured demos and technology showcases at FMS 2025
This year, the company is putting the spotlight on AI Storage – with solutions designed to meet the evolving needs of modern data centers. From next-gen PCIe Gen6 and CXL technologies to storage architectures integrating DPUs and computational storage, AIC’s platforms are built for scalable, high-efficiency AI data pipelines.
Featured Demos and Technology Showcases:
F2026 with ScaleFlux Computational Storage
The firm's F2026 server, configured with 26 ScaleFlux CSD 5000 NVMe SSDs and 4 NVIDIA BlueField-3 DPUs, delivers enterprise-class AI inference performance. With built-in compression and DPU offload, the system has achieved up to 89.0GB/s write, 89.4GB/s read, and over 1.6PBe of effective usable capacity in just 2U. These results reflect outstanding storage efficiency and throughput tailored for modern AI workloads.
“AIC’s system design and integration expertise with ScaleFlux’s SSD technology is a winning combination for users building out their storage to support their AI and mission-critical applications,” said Hao Zhong, CEO, ScaleFlux.
EB202-CP-LLM AI Agent Demo with Unigen
AIC and Unigen Corp. will demonstrate an on-prem AI agent solution for SMBs using the compact, short-depth EB202-CP-LLM platform and Unigen’s upcoming AI inference modules. The 2U system supports up to 1000 TOPS of AI performance and up to 128GB of DDR5 memory, enabling real-time LLM-based responses across typical business data sources like documents, spreadsheets, and databases. The demo showcases how organizations can deploy private, low-latency AI assistants without relying on the cloud.
High-Density AI Storage and CXL Memory Sharing with H3
H3 Platform, in collaboration with AIC, will feature its high-density AI storage solution at booth #421. The system offers PCIe Gen6 connectivity and boosts NVMe SSD I/O performance by up to 10×, providing the bandwidth and flexibility required for GenAI, large-scale model training, and high-throughput analytics. H3 will also showcase a CXL memory sharing solution that enables up to 5TB of pooled memory across 5 servers with low-latency performance and no need for software modifications.
Micron Demo featuring AIC’s F2032 at Booth #107
In collaboration with Micron Technology, Inc., AIC’s F2032 mechanical chassis will be featured at booth #107, paired with Micron’s newly announced 6600 ION 245TB E3.L SSD, capable of delivering over 140PB of storage per rack.
AIC will also feature technology from partners at booth #421 – including VAST Data, VDURA, Seagate, Micron, and Graid Technology, with integrated solutions built around the firm’s high-performance server platforms.
Additional Featured Models:
- F2032-01-G5 – A 2U all-flash NVMe platform supporting next-gen drive technologies, designed for dense AI training datasets and real-time data streaming.
- HA2026-HC – A dual-controller hybrid storage system built for high availability and tiered workloads across flash and HDD media.
- SB102-CA – A GPU-optimized, short-depth server co-developed with VDURA, for compact HPC and AI deployments.
- J4078-02-04X – A high-capacity JBOD designed for backup, archival, and dense data environments.
- SB201-SU – A modular storage server featuring AIC’s ES1 backplane, engineered for high-performance NVMe infrastructure.
- J2024-08-04X – A 2U, 24-bay JBOD platform engineered for high-throughput expansion in data-heavy environments.
- SB102-SU – A space-efficient short-depth server built for remote and edge site use cases.
“FMS is a key opportunity to show how AIC is addressing the demands of AI and next-generation data infrastructure,” said Michael Liang, president and CEO, AIC. “Our partnerships and product innovations reflect our commitment to performance, scalability, and efficiency – and solidify AIC’s leadership in the AI storage market.”