Red Hat Summit 2013 — Hortonworks, a leading contributor to and enabler of enterprise Apache Hadoop, and Red Hat, Inc. (NYSE: RHT), the world's leading provider of open source solutions, today announced an engineering collaboration to advance open source big data community projects. Working with the Apache™ Hadoop® community, the Hortonworks and Red Hat engineering teams will work together to accelerate the enablement of the broader file system ecosystem for use with Apache Hadoop. The companies also announced the integration and support of Hortonworks Data Platform with Red Hat Storage, which can reduce Hadoop cluster costs by up to 50 percent because customers can now run Hadoop directly on a POSIX-compliant storage node.
Click to Tweet: New @RedHatNews and @Hortonworks engineering pact will advance open source projects, increase enterprise adoption of #Hadoop, #RedHatStorage
The Hortonworks and Red Hat engineering effort has three main focus areas and is expected to increase the breadth of storage offerings that integrate and interoperate with Hadoop, making it possible to analyze data in place anywhere within the enterprise.
The first focus area is to enhance Apache Ambari, the open source project for monitoring and managing Apache Hadoop, to support the management of Hadoop-compatible file systems such as GlusterFS. With this integration, users will be able to provision, deploy, monitor and manage alternative file systems with Ambari. The source code is 100 percent open and available to the entire Apache community, allowing participants to leverage these features to enable the implementation of many of the leading file systems and object stores available today.
The second focus area is the creation of generic test suites to validate compatibility between Hadoop and alternative file systems. Hortonworks and Red Hat will contribute these extensive testing blueprints to the open source community for use by any developer looking to test file system compatibility with Hadoop.