Because Pivotal HD offers the full spectrum of the SQL interface, and by extension the entire ecosystem of products that support SQL, customers no longer need an army of developers to build a dashboard or run a report. Unlike competitive Hadoop distributions, Pivotal HD does this without moving data between systems or using connectors that require users to store the data twice. Pivotal HD cuts out the complexity of using Hadoop, expanding the platform's potential and productivity and allowing customers to enjoy the benefits of the most cost-effective and flexible data processing platform ever developed.
HAWQ (pronounced "hawk") represents the EMC Greenplum engineering effort that brings 10 years of large-scale data management research and development to the Apache Hadoop framework. Leveraging the feature richness and maturity of the industry-leading Greenplum MPP analytical database, this innovation has resulted in the world's first true SQL parallel database on top of the Hadoop Distributed File System (HDFS). HAWQ is the key differentiating technology in making Pivotal HD the world's most powerful Hadoop distribution. Capabilities of note include Dynamic Pipelining, a world-class query optimizer, horizontal scaling, SQL compliance, interactive queries, deep analytics, and support for common Hadoop formats.
Pivotal HD and HAWQ Deliver:
- True SQL Query Capabilities – With Pivotal HD's advanced database services (HAWQ), enterprises can now unlock the potential of Hadoop's scalable, fault-tolerant storage by bringing the vast pool of "data worker" tools and languages into the Hadoop ecosystem. With Pivotal HD's support for true, SQL-standards-compliant query interfaces, data mining tools, SQL-trained data analysts, and standard BI tools can now easily connect to, query, and analyze data sets stored in the Hadoop file system (HDFS).
- Unprecedented Query Performance – Bringing over 10 years of parallel database processing technology to Hadoop, Pivotal HD delivers query response times up to 600x faster than current SQL-like interfaces for Hadoop.
- Robust Operational Support – Command Center enables administrators and developers to easily install and manage large clusters from an interactive web user interface. Command Center also exposes a command-line interface for scripting and a programmer-friendly web services API for complex automation tasks. Using Command Center, administrators can deploy large clusters, configure services and roles, manage services, and monitor HDFS jobs and tasks.
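To make the "true SQL" point above concrete, the sketch below runs a join combined with a window function, ordinary fare for a standards-compliant SQL engine but beyond the SQL-like Hadoop query layers this section contrasts with. It uses Python's built-in `sqlite3` purely as a stand-in engine so the snippet is self-contained; against Pivotal HD, the same statement would be sent to HAWQ through any Postgres-compatible client or BI tool. The table and column names are hypothetical, not from the text:

```python
import sqlite3

# Stand-in engine so the example runs anywhere; the SQL itself is the point.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER, customer_name TEXT);
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, order_total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 900.0), (12, 2, 400.0);
""")

# A join plus a RANK() window function: standard SQL that analysts and BI
# tools expect, ranking each customer's orders by value.
rows = conn.execute("""
    SELECT c.customer_name,
           o.order_id,
           RANK() OVER (PARTITION BY c.customer_id
                        ORDER BY o.order_total DESC) AS order_rank
    FROM customers c
    JOIN orders o ON o.customer_id = c.customer_id
    ORDER BY c.customer_name, order_rank
""").fetchall()

for name, order_id, rank in rows:
    print(name, order_id, rank)
```

Because the query is plain ANSI SQL, nothing in it is Hadoop-specific; that is precisely what lets existing SQL tools and analysts work against HDFS-resident data unchanged.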