Apache Ranger is a policy-based security tool for the Hadoop ecosystem. Ranger provides centralized security policies for components such as HDFS, YARN, Hive, Knox, HBase, and Storm. In this article we will learn how to create an HDFS policy in the Apache Ranger UI.
1) Create a folder in HDFS.
We will create the HDFS directory /user/hdfs/ranger, as the hdfs user, to test Ranger HDFS policies.
hdfs dfs -mkdir /user/hdfs/ranger
hdfs dfs -ls /user/hdfs/ranger
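To make the permission check in the next step meaningful, the directory should be accessible only to the hdfs user. A minimal sketch, run as the hdfs user; the explicit chmod is an assumption added here to reproduce the drwx------ mode shown in the listing below:

```shell
# Create the test directory and restrict it so that only the hdfs
# user can read, write, or traverse it (mode 700 = drwx------).
hdfs dfs -mkdir -p /user/hdfs/ranger
hdfs dfs -chmod 700 /user/hdfs/ranger
hdfs dfs -ls -d /user/hdfs/ranger
```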
2) Try to access the same directory with the hive user.
If we try to access the directory /user/hdfs/ranger as the hive user, we get a permission denied error.
[hive@datanode1 ~]$ hdfs dfs -ls /user/hdfs/ranger
ls: Permission denied: user=hive, access=EXECUTE, inode="/user/hdfs/ranger":hdfs:hdfs:drwx------
We will grant the hive user access to the directory /user/hdfs/ranger using a Ranger policy.
3) Enable HDFS plugin in Ranger
If the HDFS plugin is not enabled, we need to enable it from Ambari.
Go to the Ambari UI → click Ranger → click Configs → click Ranger Plugin → set the HDFS plugin to On.
4) Define a new policy in Ranger UI
We will define a new policy in the Ranger UI to grant read, write, and execute access to the hive user on the /user/hdfs/ranger directory.
Go to the Ranger UI → click the HDFS plugin → click Add Policy → enter the policy details → click Add.
The policy details are shown in the picture below.
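The same policy can also be created programmatically through Ranger's public REST API instead of the UI. A sketch, assuming a Ranger admin server at http://rangerhost:6080, admin:admin credentials, and an HDFS service named hadoopdev; all of these are placeholders for your environment:

```shell
# Create an HDFS policy granting the hive user read/write/execute
# on /user/hdfs/ranger via Ranger's public v2 REST API.
curl -u admin:admin -X POST \
  -H "Content-Type: application/json" \
  http://rangerhost:6080/service/public/v2/api/policy \
  -d '{
    "service": "hadoopdev",
    "name": "hive-user-ranger-test",
    "resources": {
      "path": { "values": ["/user/hdfs/ranger"], "isRecursive": true }
    },
    "policyItems": [{
      "users": ["hive"],
      "accesses": [
        { "type": "read",    "isAllowed": true },
        { "type": "write",   "isAllowed": true },
        { "type": "execute", "isAllowed": true }
      ]
    }]
  }'
```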
5) Access the HDFS directory /user/hdfs/ranger using the hive user now.
We can now test access to the directory /user/hdfs/ranger as the hive user.
hdfs dfs -ls /user/hdfs/ranger
Even though the hive user has no HDFS file permissions on the directory /user/hdfs/ranger, it is still able to access the folder because of the HDFS policy defined in Ranger.
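Beyond a plain listing, each of the three granted accesses can be exercised separately as the hive user. A sketch; the file name test.txt is just an example:

```shell
# Run as the hive user to exercise each access type granted by the policy.
hdfs dfs -ls /user/hdfs/ranger                                   # execute on the directory
echo "ranger test" | hdfs dfs -put - /user/hdfs/ranger/test.txt  # write
hdfs dfs -cat /user/hdfs/ranger/test.txt                         # read
```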
Similarly, Ranger provides centralized security policies for all Hadoop tools.