
Steps to configure Kerberos ticket and submit jobs to a secure cluster through command shell


Command-line execution of Hadoop and its ecosystem components such as Pig, Sqoop, HBase, and Oozie requires a Kerberos ticket when working on a secure cluster.

Kerberos is a network authentication protocol designed to provide strong authentication for client/server applications. It uses 'tickets' to allow nodes communicating over a non-secure network to prove their identity to one another in a secure manner.

To get a Kerberos ticket, the Kerberos configuration file krb5.ini must be present in the Windows directory (C:\Windows\krb5.ini) on the machine where you want to work from the command line.

Next, update the realm and KDC information in the krb5.ini file.

The realm is the Windows Server domain name, and the KDC is the Windows Server host name. You can add multiple realms to the krb5.ini file, but there must be exactly one default realm. Below is a template for the krb5.ini file:

[libdefaults]
    default_realm = REALM1

[realms]
    REALM1 = {
        kdc = kdc1
        admin_server = kdc1
        default_domain = kdc1
    }

    REALM2 = {
        kdc = kdc2
        admin_server = kdc2
        default_domain = kdc2
    }

[domain_realm]
    .kdc1 = REALM1
    kdc1 = REALM1
    .kdc2 = REALM2
    kdc2 = REALM2

 

Note:

In the above template, REALM1 and REALM2 are Windows Server domain names in upper case, and kdc1 and kdc2 are the respective Windows Server host names in lower case. The section name is [domain_realm], which maps host names (and, with a leading dot, their subdomains) to realms.
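For example, with a single hypothetical domain EXAMPLE.COM whose KDC host is kdc.example.com (both names are illustrative, not taken from this article), the file would look like:

```ini
[libdefaults]
    default_realm = EXAMPLE.COM

[realms]
    EXAMPLE.COM = {
        kdc = kdc.example.com
        admin_server = kdc.example.com
        default_domain = kdc.example.com
    }

[domain_realm]
    .kdc.example.com = EXAMPLE.COM
    kdc.example.com = EXAMPLE.COM
```

Substitute your own domain and KDC host names following the same upper-case/lower-case convention described in the note above.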

 

Get a user ticket and submit an MR job:

Once the Kerberos configuration file has been created at C:\Windows\krb5.ini, follow the steps below to get a Kerberos ticket and submit jobs.

  • Open a Command Prompt and set JAVA_HOME and HADOOP_HOME as environment variables, with their bin directories added to the Path variable, or open the Big Data Command Prompt shipped with the Syncfusion Big Data platform, which has these environment variables preset.

  • To get a Kerberos ticket, first set the ticket cache location by running the following command:

set KRB5CCNAME=c:\Syncfusion\syncticket

  • Get a Kerberos user ticket with the kinit <username> command and enter the password when prompted.

 

 

  • Now you can submit Hadoop MR jobs from the command line, and work with other Hadoop ecosystem components in the same way. For example:

 

 

hadoop jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar pi 16 100000
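Putting the steps together, a typical session looks like the following sketch for a Windows command prompt. The ticket cache path c:\Syncfusion\syncticket and the user name hdfsuser are example values, not fixed requirements; kinit and klist are the standard MIT Kerberos client tools.

```
:: Point Kerberos at a writable ticket cache file (example path)
set KRB5CCNAME=c:\Syncfusion\syncticket

:: Obtain a ticket for the user; kinit prompts for the password
kinit hdfsuser

:: Optionally verify that the ticket was cached
klist

:: Submit the example MapReduce job (estimates pi with 16 maps, 100000 samples each)
hadoop jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar pi 16 100000
```

If klist shows no ticket, check that KRB5CCNAME points to a location the current user can write to and that the realm and KDC entries in krb5.ini match your Windows domain.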

 

 

 

 

 
