– Mellanox® Technologies, Ltd. (NASDAQ: MLNX), a leading supplier of end-to-end interconnect solutions for data center servers and storage systems, today announced that its FDR InfiniBand solution provides Purdue University’s Conte supercomputer with leading, scalable performance, allowing its researchers to use the most effective computational tools. By connecting Conte’s servers and Lustre-based storage with Mellanox’s FDR InfiniBand solution, consisting of Connect-IB adapters, SwitchX®-2-based switches and cables, Conte is capable of delivering a peak performance of 1.342 petaflops, with a message rate up to 2X higher than that of Purdue’s previous cluster.
“The increasing complexity of science and engineering research at Purdue is driving a need for ever faster and more scalable computational resources,” said Michael Shuey, HPC system manager at Purdue University. “Mellanox’s FDR InfiniBand solutions, and in particular their Connect-IB adapters, allow MPI codes to scale more readily than on our previous systems. This enables more detailed simulations, and helps empower Purdue scientists to push the envelope on their research in weather, bioscience, materials engineering and more.”
“We are pleased to have Mellanox’s FDR InfiniBand solution as the interconnect of choice for Purdue’s Conte supercomputer, the nation's fastest university-owned supercomputer,” said Gilad Shainer, vice president of marketing at Mellanox Technologies. “Utilizing Mellanox’s Connect-IB adapters, Purdue is able to take advantage of the adapters’ leading message rate and bandwidth performance to provide its scientists with unmatched performance and capabilities to enhance and accelerate their highly complex simulation modeling.”
Connect-IB is the world’s most scalable server and storage adapter solution for High-Performance Computing (HPC), Web 2.0, cloud, Big Data, financial services, virtualized data centers and storage environments. Connect-IB adapters deliver throughput of 100Gb/s utilizing PCI Express 3.0 x16, unmatched scaling with innovative transport services, sub-microsecond latency, and 137 million messages per second – a message rate 4X higher than competing solutions.
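To put the two headline figures above in perspective, a back-of-envelope calculation (a sketch using only the 100Gb/s and 137 million messages/second numbers quoted in this release; the per-message budget is derived, not a vendor specification) shows why message rate is the relevant metric for small-message MPI workloads:

```python
# Back-of-envelope: average wire budget per message if the adapter
# sustains its quoted peak message rate at its quoted peak throughput.
throughput_bps = 100e9   # 100 Gb/s aggregate throughput (from this release)
msg_rate = 137e6         # 137 million messages per second (from this release)

bytes_per_msg = throughput_bps / 8 / msg_rate
print(f"~{bytes_per_msg:.0f} bytes per message")  # ~91 bytes per message
```

In other words, at peak message rate each message can average only on the order of 91 bytes on the wire, which is why message-rate benchmarks target the small messages typical of tightly coupled MPI simulations rather than bulk transfers.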
Available today, Mellanox’s FDR 56Gb/s InfiniBand solution includes Connect-IB adapter cards, SwitchX®-2 based switches (from 12-port to 648-port), fiber and copper cables, and ScalableHPC accelerator and management software. Mellanox will demonstrate these performance advantages at SC’13.