The traction that Hadoop has achieved in the enterprise is impressive. The Apache Software Foundation's open source framework for processing and storing huge amounts of business data by breaking it up into manageable pieces that programmers can structure, move into databases, visualize and analyze (in other words, a foundational part of "big data" and sophisticated analytics) has significant market momentum. Indeed, coming off of ITEXPO in Las Vegas, as I wrote regarding the power of open source to drive markets in real-time business communications and in SDN and NFV, it is increasingly clear that enterprises not only like what they see but are deploying open source solutions across the information and communications technology (ICT) spectrum.
Given the now central place that Hadoop occupies in facilitating mission-critical applications and business processes, the question that arises is how a Hadoop environment can be secured. After all, with so many organizations now well down the road on moving their Hadoop Proof of Concept (POC) into production, this is a non-trivial consideration. IT professionals recognize, from a risk mitigation perspective, that sensitive customer and corporate data will soon reside in massive quantities in their storage, computing and networking ecosystems. This includes things like credit card numbers, intellectual property, customer files and more.
As the daily headlines scream, the bad guys know this and are watching for any new attack vector that will get them what they want. This has raised questions about the security preparedness of "E"verything, including, or especially, open source initiatives, where it is the community that is driving innovation and where security can become an afterthought.
Particularly as it applies to Hadoop, the challenge is how to keep all of the sensitive data secure as it:
Most important is the challenge of securing the data while still making it available for analytics, which for most companies is why Hadoop is being implemented in the first place.
If you are evaluating a Hadoop POC and/or are in the process of moving to Hadoop and rightfully have security concerns, the webinar, Securing Hadoop: What Are Your Options?, Wed., Aug. 27 at 1 p.m. EDT, is something you are going to want to participate in. Join me and internationally recognized subject matter experts Sudeep Venkatesh, VP Solutions Architecture, Voltage Security, Inc., and Vinod Nair, Senior Manager, Partner Product Management, Hortonworks, as we detail and answer your questions about security options covering authentication, authorization, monitoring and data-level security for Apache Hadoop.
The session will be based on the use cases and architectural decisions that enable the business benefits you need, delivering on the promise of Hadoop and doing so securely. Cases to be covered include:
The objective is to provide participants with the knowledge of how to protect sensitive data, enable analytics without security risk, and neutralize breaches through new data-centric technologies that are easy to integrate with Hive, Sqoop, MapReduce and many other interfaces.
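To make the "data-centric" idea a little more concrete: the general approach is to transform sensitive fields (a credit card number, say) into format-preserving tokens before or as the data lands in Hadoop, so that analytics such as joins and aggregations still work on the protected values. The sketch below is a minimal, hypothetical illustration of deterministic, format-preserving tokenization; it is not the API of any particular product, and the key handling is deliberately simplified (real deployments would use a managed key server).

```python
import hmac
import hashlib

# Assumption for illustration only: in production this key would come
# from a dedicated key-management service, never a hard-coded constant.
SECRET_KEY = b"demo-key"

def tokenize_pan(pan: str) -> str:
    """Return a token for a 16-digit card number that preserves its
    length and format, keeps the last four digits for customer-service
    lookups, and is deterministic so the same card always maps to the
    same token (allowing joins in Hive or MapReduce on protected data)."""
    head, middle, tail = pan[:6], pan[6:-4], pan[-4:]
    # Derive pseudo-random replacement digits from an HMAC of the full PAN.
    digest = hmac.new(SECRET_KEY, pan.encode(), hashlib.sha256).hexdigest()
    fake_middle = "".join(str(int(c, 16) % 10) for c in digest[:len(middle)])
    return head + fake_middle + tail
```

Because tokenization happens before the data is distributed across the cluster, a breach of HDFS storage exposes only tokens, while referential integrity is preserved for analytics.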
With the data storm already putting pressure on organizations to move to architectures that can store, process and easily handle and analyze exponentially growing amounts of data across the enterprise, Hadoop, whether delivered as an on-premises solution or via the cloud, has captured the imagination of IT, although questions about security linger. They do not have to. The webinar is the place to find out why, and how to mitigate the risks of making the Hadoop move.