Apache Hadoop is a distributed system for storing large amounts of data and processing that data in parallel. Securing Hadoop is challenging: it ships without strong data protection by default and needs additional mechanisms in place before it can be configured securely.
SafeNet's ProtectFile for Linux provides transparent, seamless encryption of sensitive data at rest in Apache Hadoop clusters. With this solution, organizations that have deployed Hadoop, or plan to deploy it to support growing volumes of big data, can secure high-value information without impacting Hadoop performance or the end-user experience.
Hadoop is deployed in many different organizations. It offers scalable, cost-effective storage and fast processing of large data sets.
Hadoop lacks the ability to fully secure data at rest on the nodes of those clusters. Each unprotected data node represents a potential entry point for rogue insiders and other malicious actors, and leaves sensitive data in clear view should an unauthorized user or service gain access.
Speaking about the product, Todd Moore, vice president of Encryption Products at SafeNet, said in a statement, "With the volume of data that a company generates growing exponentially and the number of breaches on the rise, security must be a priority with Hadoop deployments. ProtectFile enables companies of every size to lock down sensitive data at rest in clusters without impacting Hadoop performance, and can also support compliance mandates, such as PCI DSS and HIPAA, in big data implementations."
ProtectFile for Linux offers a complete security solution for Hadoop by providing transparent and seamless encryption, and features automation tools for fast, easy roll-out and standard deployment to multiple data nodes in a Hadoop cluster. It offers centralized key and policy management, so organizations have complete control over encryption keys, as well as the ability to define and enforce granular access controls to protect against unauthorized access.
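ProtectFile's internals are not public, but the combination of file-level encryption with centralized key control described above typically implies an envelope-encryption pattern: each file is encrypted with its own data key, and that data key is in turn wrapped by a key-encryption key (KEK) held by the central key manager. The toy Python sketch below illustrates only that pattern; the keystream construction, function names, and key sizes are illustrative assumptions, and the hash-based XOR cipher is not real cryptography (no authentication, not for production use).

```python
import hashlib
import secrets


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy hash-based keystream (illustration only, not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))


def encrypt_file_bytes(plaintext: bytes, kek: bytes):
    """Envelope encryption: per-file data key, wrapped by the central KEK."""
    data_key = secrets.token_bytes(32)          # unique key for this file
    nonce = secrets.token_bytes(16)
    wrapped_key = xor(data_key, keystream(kek, nonce, 32))
    ciphertext = xor(plaintext, keystream(data_key, nonce, len(plaintext)))
    # Only the wrapped key is stored with the file; the KEK never leaves
    # the central key manager, which is what enables centralized revocation.
    return nonce, wrapped_key, ciphertext


def decrypt_file_bytes(nonce: bytes, wrapped_key: bytes,
                       ciphertext: bytes, kek: bytes) -> bytes:
    data_key = xor(wrapped_key, keystream(kek, nonce, 32))
    return xor(ciphertext, keystream(data_key, nonce, len(ciphertext)))
```

The design point this illustrates is why centralized key management gives "complete control over encryption keys": rotating or revoking the KEK at the key manager invalidates access to every wrapped data key across all data nodes, without re-encrypting the files themselves.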