Hitachi Data Systems Delivers Next-Generation Pentaho Big Data Hyper-Converged Platform

Feb 17, 2016

Hitachi Data Systems Corporation has announced the next-generation Hitachi Hyper Scale-Out Platform (HSP), which now offers native integration with the Pentaho Enterprise Platform to deliver a software-defined, hyper-converged platform for big data deployments. Combining compute, storage and virtualization capabilities, the HSP 400 series provides a single infrastructure to support big data blending, embedded business analytics and simplified data management.

Modern enterprises increasingly need to derive value from the massive volumes of data generated by information technology (IT), operational technology (OT), the Internet of Things (IoT) and other machine-generated sources in their environments. HSP offers a software-defined architecture that centralizes the storage and processing of these large datasets with high availability, simplified management and a pay-as-you-grow model. Delivered as a fully configured, turnkey appliance, HSP can be installed and supporting production workloads in hours rather than months, and it simplifies the creation of an elastic data lake in which customers can integrate disparate datasets and run advanced analytic workloads.

HSP’s scale-out architecture provides simplified, scalable and enterprise-ready infrastructure for big data. It also includes a centralized, easy-to-use interface that automates the deployment and management of virtualized environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks such as the Hortonworks Data Platform (HDP).
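To make the analytics workflow concrete, below is a minimal PySpark sketch of the kind of job such a virtualized Spark environment would host: it blends two disparate datasets from a data lake and rolls up an hourly aggregate. The paths, schemas and column names are illustrative assumptions, not part of the HSP product or the announcement.

```python
# A minimal sketch of a Spark analytic job of the sort HSP's Hadoop/Spark
# environments would run. All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("iot-hourly-rollup")
         .getOrCreate())

# Machine-generated IoT readings landed in the data lake as JSON (hypothetical path).
readings = spark.read.json("hdfs:///datalake/iot/sensor_readings/")

# A second, disparate source to blend in: device metadata kept as CSV.
devices = (spark.read
           .option("header", "true")
           .csv("hdfs:///datalake/ops/device_metadata.csv"))

# Join the two sources and compute an hourly average temperature per site.
hourly = (readings
          .withColumn("event_time", F.to_timestamp("event_time"))
          .join(devices, on="device_id")
          .groupBy("site", F.window("event_time", "1 hour"))
          .agg(F.avg("temperature").alias("avg_temp")))

# Write the curated result back to the lake for downstream BI and reporting tools.
hourly.write.mode("overwrite").parquet("hdfs:///datalake/curated/hourly_temps/")

spark.stop()
```

Because Spark abstracts the underlying cluster, a job like this would run unchanged whether the environment beneath it is plain Apache Hadoop or an HDP stack provisioned through the platform's management interface.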
