
Mellanox Demos 40Gb IB Adapters and Switch Solutions

15 equipment, software, and technology providers showcase the 40Gb/s interconnect at SC08

Mellanox Technologies, Ltd. announced that its industry-leading ConnectX 40Gb/s InfiniBand adapters and InfiniScale IV-based 36-port and 324-port 40Gb/s InfiniBand switches interconnect multiple best-in-class original equipment manufacturers (OEMs), strategic technology suppliers, independent software vendors (ISVs) and end-users, forming the world’s largest 40Gb/s network demonstration. This ecosystem support of 40Gb/s InfiniBand demonstrates the market need and readiness for a high-speed interconnect that enhances end-user productivity for current and future product designs. According to IDC, InfiniBand is expected to become nearly a $1B market by 2011, with 40Gb/s InfiniBand solutions as a significant contributor starting in 2009.


“Mellanox’s interconnect technology continues to provide the foundation for leading clustering and storage systems. Our 40Gb/s InfiniBand solutions have shown the fastest adoption in the market due to our world-leading performance, reliability and efficiency,” said Sash Sunkara, vice president of marketing at Mellanox Technologies. “This large-scale demonstration at the SC’08 show, with the expectation to see 40Gb/s InfiniBand solutions on the TOP500 list, indicates the ever-increasing demand for high-speed networking for parallel computing and enterprise data centers in order to maximize end-user productivity.”

“To reap the business benefits of their HPC investments, customers need scalable solutions that deliver the performance required to handle data-intensive workloads,” said Ed Turkel, product marketing manager, Scalable Computing and Infrastructure Organization, HP. “Mellanox’s InfiniBand technologies take full advantage of the power and performance of the HP ProLiant server platform to greatly accelerate high-performance applications in the data center.”

The SCinet 40Gb/s InfiniBand network consists of switches and cabling from leading InfiniBand solutions providers, demonstrating full interoperability and robustness of the 40Gb/s InfiniBand technology. Avago, Finisar and Luxtera provided the 40Gb/s InfiniBand optical cables, up to 300m long, that connect participants across the show floor.

40Gb/s InfiniBand technology is demonstrated in the HP booth with a cluster of HP ProLiant DL160 G5 servers based on the quad-core Intel Xeon Processor 5400 series, connected with Mellanox 40Gb/s InfiniBand and NVIDIA GPU technology.

The network’s 3D rendering application from Scalable Graphics is based on its Direct Transport Compositor (DTC) solution. DTC is a divide-and-conquer parallel 3D image rendering solution. DTC makes full use of the 40Gb/s bandwidth to distribute the rendering workload of a fully detailed Boeing 777 model, courtesy of Boeing Corp, across 8 HP DL160 server nodes.
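The divide-and-conquer approach above can be illustrated with a minimal sketch: each node renders only its own screen-space strip of the frame, and a compositor gathers the strips into the final image. This is a hypothetical illustration of the general technique, not the actual (proprietary) DTC implementation; the node count and frame size are taken from the demo, but the function names and the placeholder "renderer" are assumptions.

```python
NODES = 8            # the 8 HP DL160 server nodes in the demo
WIDTH, HEIGHT = 1920, 1080

def render_strip(node_id, num_nodes, width, height):
    """Each node renders only its horizontal strip of the frame.
    The 'rendering' here just fills the strip with the node's id;
    a real renderer would rasterize its share of the scene."""
    rows = height // num_nodes
    y0 = node_id * rows
    y1 = height if node_id == num_nodes - 1 else y0 + rows
    return (y0, [[node_id] * width for _ in range(y0, y1)])

def composite(strips, height):
    """Reassemble the strips into the full frame. Over the wire,
    this gather step is the bandwidth-bound part, which is why
    doubling link speed can double frame rate or resolution."""
    frame = [None] * height
    for y0, rows in strips:
        for i, row in enumerate(rows):
            frame[y0 + i] = row
    return frame

strips = [render_strip(n, NODES, WIDTH, HEIGHT) for n in range(NODES)]
frame = composite(strips, HEIGHT)
assert len(frame) == HEIGHT and all(r is not None for r in frame)
```

In this screen-space scheme, the per-frame data moved to the compositor scales with image resolution, so the interconnect, not the GPUs, becomes the limiting factor at high resolutions.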

The visualization is also displayed in the following booths via the SCinet 40Gbps network: AMD, Appro, Avago, DataDirect Networks, Finisar, INRIA, Intel, Luxtera, Mellanox, Microsoft, Sun Microsystems and Voltaire.

“40Gb/s InfiniBand provides a tremendous boost to our parallel rendering solutions. Compared to 20Gb/s solutions, the new 40Gb/s technology enables us to double the frame rate or the image resolution,” said Christophe Mion, Scalable Graphics CTO.

Located in Mellanox’s booth, the storage solution for the 40Gb/s network is supplied by DataDirect Networks’ S2A9900 storage platform, which provides the InfiniBand-based data bandwidth for fast file transfer.

“One of the major challenges when designing a large simulation cluster environment is enabling high-bandwidth data movement for checkpoint operations during the process. As machines exceed Petaflop proportions, solving this bandwidth challenge becomes critical to ensure that data movement is both reliable and manageable, and that I/O cycle times are minimized, which is what our S2A9900 storage systems are optimized for,” said Dave Fellinger, CTO, DataDirect Networks. “The adoption of a 40Gb/s interface enables data management with fewer ports and more reliable routing.”

The entire demo setup was sponsored by the Mellanox HPC Advisory Council.
