
Mellanox: 200Gb/s HDR IB Solutions

Including ConnectX-6 adapters, Quantum switches and LinkX cables and transceivers

Mellanox Technologies, Ltd. announced 200Gb/s data center interconnect solutions.


ConnectX-6 adapters, Quantum switches and LinkX cables and transceivers together provide a complete 200Gb/s HDR IB interconnect infrastructure for the next generation of HPC, machine learning, big data, cloud, web 2.0 and storage platforms.

These solutions enable customers and users to leverage an open, standards-based technology that maximizes application performance and scalability while minimizing overall data center TCO.

200Gb/s HDR solutions will become available in 2017.

“The ability to effectively utilize the exponential growth of data and to leverage data insights to gain a competitive advantage in real time is key for business success, homeland security, technology innovation, new research capabilities and beyond. The network is a critical enabler in today’s system designs that will propel the most demanding applications and drive the next life-changing discoveries,” said Eyal Waldman, president and CEO, Mellanox. “Mellanox is proud to announce the new 200Gb/s HDR IB solutions that will deliver the world’s highest data speeds and intelligent interconnect and empower the world of data in which we live. HDR IB sets new performance and scalability records while delivering the next generation of interconnect needs to our customers and partners.”

“Ten years ago, when Intersect360 Research began its business tracking the HPC market, IB had just become the predominant high-performance interconnect option for clusters, with Mellanox as the leading provider,” said Addison Snell, CEO, Intersect360 Research, Inc. “Over time, IB continued to grow, and today it is the leading high-performance storage interconnect for HPC systems as well. This is at a time when high data rate applications like analytics and machine learning are expanding rapidly, increasing the need for high-bandwidth, low-latency interconnects into even more markets. HDR IB is a big leap forward and Mellanox is making it a reality at a great time.”

“The leadership scale science and data analytics problems we are working to solve today and in the near future require very high bandwidth linking compute nodes, storage, and analytics systems into a single problem solving environment,” said Arthur Bland, director, OLCF Project, Oak Ridge National Laboratory. “With HDR IB technology, we will have an open solution that allows us to link all of our systems at very high bandwidth.”

“Data movement throughout the system is a critical aspect of current and future systems. Open network technology will be a key consideration as we plan the next generation of large-scale systems, including ones that will achieve Exascale performance,” said Bronis de Supinski, CTO, Livermore Computing. “HDR IB solutions represent an important development in this technology space.”

“We are excited to see Mellanox continue leadership in high speed interconnects,” said Parks Fields, SSI team lead, HPC-design, Los Alamos National Laboratory. “HDR IB will provide us with the performance capabilities needed for our applications.”

“High-speed storage for HPC solutions is critical for maximizing the performance benefits of today’s HPC, machine learning, media production and big data,” said Kurt Kuckein, director, product management, DataDirect Networks, Inc. “DDN and Mellanox HDR 200Gb/s technology will enable unmatched performance in high-performing storage solutions for our end-customers that demand the ultimate in performance for their real-time workloads.”

“Whether it’s HPC, big data or cloud, Mellanox and Dell EMC HPC Systems customers will benefit from the extreme performance, scalability and first to market speed advantage of our joint end-to-end solutions,” said Jim Ganthier, SVP, validated solutions organization and HPC, Dell EMC, part of Dell Technologies. “Our collaborative innovation with Mellanox helps customers accelerate time to insights and results, utilizing an open standards-based approach and enabling their next discoveries.”

“Fabrics are key to high performance clusters,” said Scott Misage, VP, HPC solutions and Apollo pursuits, Hewlett Packard Enterprise. “Mellanox 200Gb/s HDR products will help our joint customers take full advantage of the scalability of HPE’s purpose-built Apollo HPC solutions, maximizing overall application efficiency for their HPC workloads.”

“Mellanox is not only an innovator for networking solutions but an advocate for improving data center ROI,” said Mr. Qiu Long, president, server product line, Huawei Technologies Co., Ltd. “With the introduction of this new 200Gb/s HDR solution, HPC and many other demanding applications can forge ahead.”

“Mellanox is advancing the bandwidth, latency, and programmability of fabrics with 200Gb HDR IB solutions for the OpenPOWER ecosystem, and we are looking forward to integrating HDR IB into the OpenPOWER technology portfolio,” said Brad McCredie, VP and IBM Fellow, CTO, systems and technology group, IBM Systems. “The OpenPOWER ecosystem incorporates the best of new technologies through collaborative innovation, and we’re excited to see how ConnectX-6 and Quantum will push performance to the next level.”

“Mellanox has taken a quantum leap forward in data center networking with IB solutions that now provide world-class performance of 200 million messages per second,” said Mr. Leijun Hu, VP, Inspur Group Co., Ltd. “In addition, the new Mellanox Quantum 200Gb/s HDR IB switches represent the world’s fastest, most flexible switch with an extremely low latency of 90ns.”

“Demanding HPC workloads, such as Artificial Intelligence, require extremely high bandwidth for enormous amounts of data crunching, and HDR IB will be an increasingly important technology for the modern datacenter. Mellanox intelligent interconnect solutions are the foundation for many of our market-leading HPC solutions, from big to small; we’re excited to deliver the advantages of HDR to a broader set of HPC clients running exceptionally challenging workloads,” said Scott Tease, executive director, HPC, data center group, Lenovo Enterprise Solutions, Pte Ltd.

ConnectX-6 improves bandwidth to NVIDIA GPUs resulting in better scale out solutions for HPC, deep learning, and data center applications,” said Dr. Ian Buck, VP, accelerated computing group, NVIDIA Corporation. “With integrated support for NVIDIA GPUDirect technology, Mellanox interconnect and NVIDIA’s high performance Tesla GPUs will enable direct data transfers across clusters of GPUs, essential to addressing complex and computationally intensive challenges in very diverse markets.”

“Our customers are constantly looking towards the next cutting edge infrastructure that gives them the competitive advantage,” said Ken Claffey, VP and GM, cloud systems and silicon group, Seagate Technology plc. “Seagate couldn’t be more excited to embrace Mellanox’s HDR 200Gb/s capabilities that will deliver unmatched storage platforms for network-intense applications like media streaming and compute clustering.”

“We are thrilled to see Mellanox’s newest solutions that literally double data speeds from the previous generation,” said Mr. Chaoqun Sha, SVP, technology, Sugon Information Industry Co., Ltd. “These new solutions are not only ideal for both IB and the Ethernet standards-based protocols, but also give customers the flexibility to take advantage of Mellanox’s innovative Multi-Host technology.”

The ConnectX-6 adapters include single/dual-port 200Gb/s Virtual Protocol Interconnect port options, doubling the data speed compared to the previous generation. The adapters also support both the IB and Ethernet standard protocols, and provide the flexibility to connect with any CPU architecture – x86, GPU, POWER, ARM, FPGA and more. With performance of 200 million messages per second, low latency of 0.6μs, and in-network computing engines such as MPI-Direct, RDMA, GPU-Direct, SR-IOV and data encryption, as well as the company’s Multi-Host technology, ConnectX-6 will enable the most efficient compute and storage platforms in the industry.
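The message-rate and line-rate figures above imply a tight per-message budget; a rough back-of-envelope check (a sketch only, ignoring wire headers and encoding overhead) shows how small messages must be to sustain both figures at once:

```python
# Back-of-envelope: what message size lets a ConnectX-6 port sustain
# 200 million messages/s at the full 200Gb/s line rate?
# (Ignores packet headers and link encoding overhead.)
link_rate_bps = 200e9    # HDR IB: 200Gb/s per port
msg_rate = 200e6         # ConnectX-6: 200 million messages per second
max_msg_bytes = link_rate_bps / msg_rate / 8
print(max_msg_bytes)     # 125.0 -> ~125-byte messages saturate the link
```

In other words, the 200M msg/s figure is meaningful precisely for the small-message regime (MPI traffic, key-value lookups) where message-rate, not raw bandwidth, is the bottleneck.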

The Quantum 200Gb/s HDR IB switch supports 40 ports of 200Gb/s IB or 80 ports of 100Gb/s IB connectivity for a total of 16Tb/s of switching capacity, with an extremely low latency of 90ns. It advances the support of in-network computing technology, delivers optimized and flexible routing engines, and is the most scalable switch IC available. The Quantum IC will be the building block for multiple switch systems – from 40 ports of 200Gb/s or 80 ports of 100Gb/s for top-of-rack solutions, to 800 ports of 200Gb/s and 1,600 ports of 100Gb/s in modular switch systems.
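The 16Tb/s figure follows directly from the port counts, assuming capacity is quoted full-duplex (ingress plus egress), as is conventional for switch silicon:

```python
# Quantum switch capacity check: the same 16Tb/s of switching capacity
# can be exposed as 40 x 200Gb/s HDR ports or 80 x 100Gb/s ports.
# Capacity is counted full-duplex (both directions per port).
ports_hdr, rate_hdr = 40, 200e9   # 40 ports at 200Gb/s
ports_100, rate_100 = 80, 100e9   # same silicon split into 100Gb/s ports
capacity_tbps = ports_hdr * rate_hdr * 2 / 1e12
print(capacity_tbps)              # 16.0 (Tb/s)
assert ports_100 * rate_100 * 2 / 1e12 == capacity_tbps
```

The same arithmetic explains the modular-system port counts: 800 ports of 200Gb/s is simply twenty such ICs' worth of HDR ports stitched into one chassis-scale fabric.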

To complete the end-to-end 200Gb/s IB infrastructure, the company’s LinkX solutions will offer a family of 200Gb/s copper and silicon photonics fiber cables.
