Knox Gateway Proposal
Knox Gateway is a system that provides a single point of secure access for Apache Hadoop clusters.
The Knox Gateway (“Gateway” or “Knox”) is a system that provides a single point of authentication and access for Apache Hadoop services in a cluster. The goal is to simplify Hadoop security for both users (i.e. those who access the cluster data and execute jobs) and operators (i.e. those who control access and manage the cluster). The Gateway runs as a server (or cluster of servers) that serves one or more Hadoop clusters.
- Provide perimeter security to make Hadoop security setup easier
- Support authentication and token verification security scenarios
- Deliver users a single cluster end-point that aggregates capabilities for data and jobs
- Enable integration with enterprise and cloud identity management environments
An Apache Hadoop cluster is presented to consumers as a loose collection of independent services. This makes it difficult for users to interact with Hadoop, since each service maintains its own method of access and security. Likewise, for operators, configuration and administration of a secure Hadoop cluster is complex, and many Hadoop clusters are left insecure as a result.
The goal of the project is to provide coverage for all existing Hadoop ecosystem projects. In addition, the project will be extensible to allow for new and/or proprietary Hadoop components without requiring changes to the gateway source code. The gateway is expected to run in a DMZ environment where it will provide controlled access to these Hadoop services. In this way Hadoop clusters can be protected by a firewall and only limited access provided through the firewall for the gateway. The authentication components of the gateway will be modular and extensible such that it can be integrated with existing security infrastructure.
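To make the two ideas above concrete (a single aggregated end-point in the DMZ, plus pluggable authentication), the request-handling flow might be sketched roughly as follows. This is purely an illustrative sketch: the class names, host names, and path layout below are assumptions for the example, not the actual Gateway API.

```python
# Illustrative sketch only: names below are assumptions, not the Gateway's API.
from urllib.parse import urlsplit

# Hypothetical mapping from gateway service names to internal cluster services.
# Only the gateway host is reachable through the firewall; these hosts are not.
SERVICE_MAP = {
    "webhdfs": "http://namenode.internal:50070",
    "templeton": "http://templeton.internal:50111",
}

class LdapAuthenticator:
    """Stand-in for a pluggable authentication provider (e.g. Shiro-backed)."""
    def authenticate(self, user, credential):
        # A real provider would bind against LDAP or similar infrastructure;
        # here any non-empty user/credential pair passes, for illustration.
        return bool(user) and bool(credential)

def dispatch(gateway_url, authenticator, user, credential):
    """Authenticate at the perimeter, then map the single gateway URL
    onto the internal service it fronts."""
    if not authenticator.authenticate(user, credential):
        raise PermissionError("authentication failed at the perimeter")
    path = urlsplit(gateway_url).path
    # Assumed form: /gateway/<cluster>/<service>/<rest-of-path>
    # (the query string is dropped in this simplified sketch)
    _, _, cluster, service, *rest = path.split("/")
    base = SERVICE_MAP[service]
    return base + "/" + "/".join(rest)

# A request to the single gateway end-point is rewritten to the hidden service:
internal = dispatch(
    "https://gateway.example.com/gateway/cluster1/webhdfs/v1/tmp?op=LISTSTATUS",
    LdapAuthenticator(), "guest", "secret")
```

Because the authenticator is an object handed to the dispatcher rather than hard-coded, swapping in a different identity back-end requires no change to the routing logic, which is the extensibility property described above.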
Organizations struggling with Hadoop cluster security tend to either a) run Hadoop without security or b) slow their adoption of Hadoop. The Gateway aims to provide perimeter security that integrates more easily into organizations’ existing security infrastructure. Doing so will simplify security for these organizations and benefit all Hadoop stakeholders (i.e. users and operators). Additionally, making a dedicated perimeter security project part of the Apache Hadoop ecosystem will prevent fragmentation in this area and further increase the value of Hadoop as a data platform.
A prototype is available, developed by the initial committers listed below.
We desire to build a diverse developer community around Gateway following the Apache Way. We want to make the project open source and will encourage contributors from multiple organizations following the Apache meritocracy model.
We hope to extend the user and developer base in the future and build a solid open source community around Gateway. Apache Hadoop has a large ecosystem of open source projects, each with a strong community of contributors. All project communities in this ecosystem have an opportunity to participate in the advancement of the Gateway project because ultimately, Gateway will enable the security capabilities of their project to be more enterprise friendly.
Gateway is currently being developed by several engineers from Hortonworks: Kevin Minder, Larry McCay, John Speidel, Tom Beerbower and Sumit Mohanty. All of these engineers have deep expertise in middleware, security and identity systems, and are quite familiar with the Hadoop ecosystem.
The ASF is a natural host for Gateway given that it is already the home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data software projects. Gateway is designed to solve the security challenges familiar to the Hadoop ecosystem family of projects.
Orphaned Products
The core developers plan to work full time on the project. We believe that this project will be of general interest to many Hadoop users and will attract a diverse set of contributors. We intend to demonstrate this by having contributors from several organizations recognized as committers by the time Knox graduates from incubation.
Inexperience with Open Source
All of the core developers are active users and followers of open source. In addition, Hortonworks and the affiliated mentors have a strong heritage of successful contributions to Apache Hadoop projects.
Homogenous Developers
The current core developers are all from Hortonworks; however, we hope to establish a developer community that includes contributors from several corporations.
Reliance on Salaried Developers
Currently, the developers are paid to do work on Gateway. However, once the project has a community built around it, we expect to get committers and developers from outside the current core developers.
Relationships with Other Apache Products
Gateway is going to be used by the users and operators of Hadoop, and the Hadoop ecosystem in general.
An Excessive Fascination with the Apache Brand
Our interest in developing Gateway as an Apache project is to follow an established development model. In addition, since many of the Hadoop ecosystem projects are also part of Apache, Gateway will complement those projects by following the same development and contribution model.
There is documentation in Hortonworks’ internal repositories. These can be shared upon request and will be transferred into the Apache CM system if this proposal is accepted.
Source and Intellectual Property Submission Plan
The complete Gateway code is under the Apache Software License, Version 2.0.
The Gateway dependencies are listed below, separated by Category A and Category B as defined in the Apache Third-Party Licensing Policy. Note: These are the direct dependencies. Indirect dependencies are not included.
Category A Dependencies
- Apache Commons - ASLv2.0
- Apache Hadoop - ASLv2.0
- Apache Geronimo - ASLv2.0
- Apache Shiro - ASLv2.0
- ApacheDS - ASLv2.0
- Log4J - ASLv2.0
- SLF4J - MIT
- Guava - ASLv2.0
- HttpClient - ASLv2.0
- Jetty - ASLv2.0
- JBoss ShrinkWrap - ASLv2.0
Category A Dependencies (Test)
- EasyMock - ASLv2.0
- XML Matchers - ASLv2.0
- Hamcrest - BSDv3
- JsonPath - ASLv2.0
- XMLTool - ASLv2.0
- REST-assured - ASLv2.0
Category B Dependencies
- Jersey - CDDLv1.1 or GPL2wCPE
- Jericho - EPLv1.0
- Servlet - CDDLv1.0 or GPLv2
- JUnit - CPLv1.0
The Gateway uses cryptographic software indirectly as a result of having two dependencies: ApacheDS and Apache Shiro. Gateway does not include any special or custom cryptographic technologies.
ApacheDS is an ASF project and has been classified Export Commodity Control Number (ECCN) 5D002.C.1 due to its dependency on Bouncy Castle. More information on the ApacheDS classification can be found at http://svn.apache.org/repos/asf/directory/apacheds/trunk/installers/README
Apache Shiro is an ASF project and has been classified Export Commodity Control Number (ECCN) 5D002.C.1. More information on the Apache Shiro classification can be found at http://svn.apache.org/repos/asf/shiro/trunk/README
- knox-dev AT incubator DOT apache DOT org
- knox-commits AT incubator DOT apache DOT org
- knox-user AT incubator DOT apache DOT org
- knox-private AT incubator DOT apache DOT org
JIRA Knox (KNOX)
- Kevin Minder (kevin DOT minder AT hortonworks DOT com)
- Larry McCay (lmccay AT hortonworks DOT com)
- John Speidel (jspeidel AT hortonworks DOT com)
- Tom Beerbower (tbeerbower AT hortonworks DOT com)
- Sumit Mohanty (smohanty AT hortonworks DOT com)
- Venkatesh Seetharam (venkatesh AT hortonworks DOT com)
- Kevin Minder (Hortonworks)
- Larry McCay (Hortonworks)
- John Speidel (Hortonworks)
- Tom Beerbower (Hortonworks)
- Sumit Mohanty (Hortonworks)
- Venkatesh Seetharam (Hortonworks)
- Owen O'Malley (Hortonworks)
- Mahadev Konar (Hortonworks)
- Alan Gates (Hortonworks)
- Devaraj Das (Hortonworks)
- Chris Douglas (Microsoft)
- Chris Mattmann (NASA)
- Tom White (Cloudera)
- Owen O’Malley (omalley AT apache DOT org)
- Mahadev Konar (mahadev AT apache DOT org)
- Alan Gates (gates AT apache DOT org)
- Devaraj Das (ddas AT apache DOT org)
- Chris Douglas (cdouglas AT apache DOT org)
- Chris Mattmann (chris DOT a DOT mattmann AT jpl DOT nasa DOT gov)
- Tom White (tom DOT e DOT white AT gmail DOT com)