
Hadoop, Greenplum and GPFS Big Data Environments Enabled by Commvault Software

Infrastructure intelligence and technology for new levels of policy-driven data management and automated DR for big data initiatives

Commvault Systems, Inc. announced technology enhancements to its integrated solutions portfolio designed to help enterprises better support and manage big data initiatives.

Rolled out as part of the newest wave of innovation in the company's eleventh software release, the new technology will help bring best-practice policy and data management into projects leveraging big data environments such as Hadoop, Greenplum and the General Parallel File System (GPFS).

According to a Gartner survey published in late 2015, more than three quarters of companies are investing in or plan to invest in big data initiatives in the next two years. Given the need for actionable and intelligent insights into data sets and file systems, organizations increasingly must scale and store information at unprecedented levels. Big data initiatives leverage new approaches and technologies to store, index and analyze huge data sets, while minimizing storage requirements and driving faster outcomes. However, as companies begin these initiatives, they often forgo applying data protection and DR routines to these large data sets sitting outside their traditional systems and infrastructures due to complexity, performance and cost issues.

The new innovations in the company's software and its Data Platform directly address these emerging customer requirements to manage big data environments. Specifically, enhanced visibility into leading big data tools, including Hadoop, Greenplum and GPFS, helps customers map their big data implementations and architectures, providing insight into how these environments can be protected and recovered in whole or across selected nodes, components and data sets. Leveraging the company's software to manage these environments, users can now better understand the exact environment layout to drive performance, eliminate complexity and better manage costs.

The company's software gives organizations an intelligent approach to protecting the complex infrastructure behind big data initiatives, with the ability to automate DR for these multi-node systems. The Data Platform further enhances the value of big data initiatives by extending data portability to and from a broad range of infrastructure options (cloud, on-premises, virtualized, traditional and converged).
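To make the idea of policy-driven protection for a multi-node big data environment concrete, the sketch below applies a simple, named protection policy to selected HDFS directories using Apache Hadoop's own snapshot commands. It is only an illustration of the general pattern, not Commvault's software or interface; the policy structure, directory paths and function names are hypothetical, and it assumes the hdfs CLI is available on the host.

"""Minimal sketch: policy-driven HDFS snapshot protection.

NOT Commvault's API; illustrates applying a named protection policy to
selected directories of a Hadoop cluster using Hadoop's own snapshot
commands. Policy name and directory paths are hypothetical examples.
"""
import subprocess
from datetime import datetime, timezone

# Hypothetical protection policy: which HDFS directories to protect.
PROTECTION_POLICY = {
    "name": "daily-bigdata-protection",
    "directories": ["/data/warehouse", "/data/logs"],
}

def run(cmd: list[str]) -> None:
    """Run a Hadoop CLI command and fail loudly if it errors."""
    subprocess.run(cmd, check=True)

def protect(policy: dict) -> None:
    """Enable snapshots and take a timestamped snapshot per directory."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    for directory in policy["directories"]:
        # One-time: mark the directory as snapshottable (admin privilege).
        run(["hdfs", "dfsadmin", "-allowSnapshot", directory])
        # Take a point-in-time snapshot named after the policy and time.
        run(["hdfs", "dfs", "-createSnapshot", directory,
             f"{policy['name']}-{stamp}"])

if __name__ == "__main__":
    protect(PROTECTION_POLICY)

In practice, a scheduler would invoke such a routine on the policy's cadence, and a companion job would replicate or expire snapshots according to retention rules.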

“Companies of all sizes in all industries are quickly ramping up ‘big data’ initiatives, investing to gain business insight from exploding data volumes, yet they are often moving forward without applying sound data management and DR disciplines to such strategic projects,” said Don Foster, senior director, solution management, Commvault. “In many cases the exponential growth of these big data infrastructures outpaces the ability of these solutions to self-manage and self-heal as they were designed. Now, for the first time, customers can leverage the full power of the Commvault software portfolio to apply best practices for data management to Hadoop, Greenplum and GPFS environments. These innovations provide important new benefits to Commvault customers, and open up significant opportunities for Commvault as the big data market continues to grow.”

“Over the past several years, we’ve been watching the growth of big data and seeing how organizations are adopting new technologies to manage the influx of information,” said Phil Goodwin, research director, IDC. “The newest release of Commvault’s open data platform leverages the company’s history of data management capabilities to help provide customers with data intelligence and insight designed to make cloud deployments as seamless and cost effective as possible.”

These innovations in the company's integrated solutions portfolio are available now.
