
Mellanox Demonstrates Accelerated NVMe Over Fabrics With 25/50/100GbE RDMA

Four-node Windows Server 2016 cluster achieves 80Gb/s.

Mellanox Technologies, Ltd. announced a way to accelerate NVMe Over Fabrics (NVMeoF) at the 2016 Intel Developer Forum (IDF).

The solution, built on the company’s 25/100GbE networking, also features a Samsung All-Flash Array Reference Design highlighted by Samsung’s PM1725 NVMe SSD.

At the conference, the firm showcased an all-flash storage platform that supports the standardized NVMeoF storage protocol at both 25 and 100GbE speeds, accelerated with RDMA over Converged Ethernet (RoCE) technology. The demonstration features 24 Samsung NVMe SSDs, delivering 21.5GB/s of sequential throughput and 4.3 million 4KB random read IO/s.
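As a rough sanity check on those figures, the arithmetic below converts the quoted random-read IOPS into bandwidth and compares the sequential number against 100GbE link capacity. The even per-SSD split and the ~90% link-efficiency factor are illustrative assumptions, not figures from the demo.

```python
# Back-of-the-envelope check of the demo figures quoted above.
# Assumptions (not from the announcement): ~90% usable line rate on a
# 100GbE port and an even split of work across the 24 SSDs.

SSD_COUNT = 24
SEQ_THROUGHPUT_GBPS = 21.5          # GB/s sequential, from the demo
RANDOM_IOPS = 4.3e6                 # 4KB random read IO/s, from the demo
IO_SIZE_BYTES = 4 * 1024

# Bandwidth implied by the random-read result (~17.6 GB/s).
random_gbytes_per_s = RANDOM_IOPS * IO_SIZE_BYTES / 1e9
print(f"4KB random reads: {random_gbytes_per_s:.1f} GB/s")

# Per-SSD share of the sequential figure, assuming an even split (~0.9 GB/s).
print(f"Per-SSD sequential share: {SEQ_THROUGHPUT_GBPS / SSD_COUNT:.2f} GB/s")

# 100GbE ports needed to carry the sequential figure at ~90% efficiency.
usable_gbytes_per_port = 100 / 8 * 0.9   # assumed usable GB/s per port
print(f"100GbE ports needed: {SEQ_THROUGHPUT_GBPS / usable_gbytes_per_port:.1f}")
```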

It also showcases the ConnectX-4 and ConnectX-4 Lx adapters and the Spectrum switch family, which support 25, 50, and 100GbE speeds.

The company also demonstrated record performance with a Windows Storage Spaces Direct all-flash, hyperconverged infrastructure. The four-node Windows Server 2016 cluster achieves 80Gb/s of combined throughput and uses Dell, Inc.‘s servers, PM1725 SSDs, and a 100GbE network.
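For context, a quick conversion of that cluster result into per-node terms; the even per-node split and the single 100GbE port per node are assumptions for illustration, not published details of the demo.

```python
# Convert the combined Storage Spaces Direct result into per-node terms.
# Assumes throughput is spread evenly across the four nodes and that each
# node has one 100GbE port (illustrative assumptions).

CLUSTER_GBITS_PER_S = 80            # combined throughput from the demo
NODES = 4
LINK_GBITS_PER_S = 100              # assumed per-node 100GbE port

per_node = CLUSTER_GBITS_PER_S / NODES
print(f"Per node: {per_node:.0f} Gb/s ({per_node / 8:.1f} GB/s), "
      f"{per_node / LINK_GBITS_PER_S:.0%} of one 100GbE link")
```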

“With the increased speed of some of our most advanced SSDs and the more efficient storage protocols such as NVMe Over Fabrics, faster network speeds are definitely needed,” said Michael Williams, VP, memory product planning, Samsung Semiconductor, Inc. “We are happy to see vendors such as Mellanox support higher-speed networking and work with Samsung SSDs to support NVMe Over Fabrics.”

“Mellanox is seeing rapid growth in the use of solid-state storage in all industries and unprecedented interest in ways to share it,” said Kevin Deierling, VP, marketing, Mellanox. “The continuous innovation in non-volatile memory performance by vendors such as Samsung is one of the most important reasons for this growth, and a major driver behind our development of an end-to-end networking solution with the highest-performing 25, 50, and 100GbE adapters, switches and cables.”

Mellanox networking was featured in multiple demonstrations throughout IDF:

  • 25, 50, and 100GbE networking and RoCE

  • All-flash array reference design with NVMe Over Fabrics

  • Windows Storage Spaces Direct, all-flash, hyperconverged Infrastructure

Resources:
ConnectX-5 Ethernet Adapters
Samsung All-Flash Array Reference Design
Mellanox demonstrates Windows Server 2016 Storage Spaces Direct
ConnectX-4 and ConnectX-4 Lx
Spectrum Ethernet Switches

 
