Mellanox Delivers InfiniBand And Ethernet CloudX™ Interconnect Cloud Solution At National Computational Infrastructure

Mellanox® Technologies, Ltd. (NASDAQ: MLNX), a leading supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, today announced that the National Computational Infrastructure (NCI), hosted at the Australian National University, has selected Mellanox's interconnect to support Australia's national research computing service, which provides world-class, high-end services to Australia's researchers. Mellanox's interconnect solutions enable faster inter-node connectivity and access to storage, providing Australian researchers and scientific research organizations with critical on-demand access to NCI's high-performance cloud.

The NCI deployment combines the Mellanox CloudX solution with Red Hat OpenStack software to support high-performance workloads on a scalable, easy-to-manage cloud platform. CloudX simplifies and automates the orchestration of cloud platforms and reduces deployment time from days to hours. The NCI deployment is based on Mellanox 40/56 Gb/s Virtual Protocol Interconnect (VPI) adapters and switches supporting both InfiniBand and Ethernet. The advanced NCI cloud also utilizes RoCE (RDMA over Converged Ethernet) to implement a full fat-tree Ethernet configuration on OpenStack.

“Selecting Mellanox for our high performance switching infrastructure has provided a flexible fabric, allowing us to take advantage of both high speed Ethernet and InfiniBand networking in the one solution,” said Allan Williams, NCI associate director, services and technology. “Mellanox’s professional services staff that assisted with installation were extremely knowledgeable and professional in the delivery of our solution.”

“We are pleased to partner with the NCI as they build a scalable, world-class, and efficient cloud platform based on our CloudX interconnect,” said Kevin Deierling, vice president of marketing at Mellanox Technologies. “NCI is the first CloudX deployment to take full advantage of RDMA, OpenStack plugins, and hypervisor offloads delivered by our end-to-end 40GbE Ethernet and 56Gb/s InfiniBand interconnect solution.”

“High performance, private research clouds like the Australian National University’s provide an early proof point where OpenStack deployments are showing compelling value,” said Mike Werner, senior director, global ecosystems at Red Hat. “We are thrilled by the Australian National University’s early results with Red Hat Enterprise Linux OpenStack Platform and equally pleased that our continued collaboration with Mellanox is helping to further enterprise advancement and deployment for OpenStack in key areas like this one.”
