Hadoop Kerberos security

I have set up a single-node cluster, and the KDC server as well as the clients are on the same machine. I have tried every option I could find, but the same error persists. Based on my research, I made the following changes as suggested by other answers: 1) installed the JCE unlimited-strength policy jars in the $JAVA_HOME/jre/lib/security folder; 2) edited the krb5.conf file to use only aes256-cts encryption.
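To double-check that the unlimited-strength policy is actually in effect, one common sanity check (a sketch; jrunscript ships with the JDK) is to query the maximum allowed AES key length, which should print 2147483647 rather than 128 once the policy jars are installed:

jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"));'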

/etc/krb5.conf looks like below,

[logging]
 default = FILE:/var/log/krb5libs.log  
 kdc = FILE:/var/log/krb5kdc.log  
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 dns_lookup_realm = false  
 ticket_lifetime = 24h  
 renew_lifetime = 7d  
 forwardable = true  
 rdns = false  
 default_realm = EXAMPLE.COM  
 default_ccache_name = KEYRING:persistent:%{uid}  
 default_tkt_enctypes = aes256-cts  
 default_tgs_enctypes = aes256-cts  
 permitted_enctypes   = aes256-cts  
[realms]  
 EXAMPLE.COM = {
  kdc = localhost  
  admin_server = localhost  
 }  

[domain_realm]  
 localhost = EXAMPLE.COM  

/var/kerberos/krb5kdc/kdc.conf looks like below

[kdcdefaults]  
 kdc_ports = 88  
 kdc_tcp_ports = 88  

[realms]  
 EXAMPLE.COM = {  
  #master_key_type = aes256-cts  
  acl_file = /var/kerberos/krb5kdc/kadm5.acl  
  dict_file = /usr/share/dict/words  
  admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab  
  supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal camellia256-cts:normal camellia128-cts:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
  max_life = 24h 0m 0s
  max_renewable_life = 7d 0h 0m 0s
}  
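One caveat worth checking: if supported_enctypes was narrowed after the service principals were created, the keys stored in the keytab may no longer match what the KDC issues. A hedged sketch for re-exporting a service key with the desired enctype (the principal hdfs/localhost@EXAMPLE.COM and the keytab path are assumptions, not taken from the question):

# Re-export the service key with aes256-cts only (this randomizes the key,
# so any older copy of the keytab becomes stale)
kadmin.local -q "ktadd -k /etc/security/keytabs/hdfs.keytab -e aes256-cts:normal hdfs/localhost@EXAMPLE.COM"
# Verify the keytab entries and their encryption types
klist -kte /etc/security/keytabs/hdfs.keytab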

The namenode and datanode start up with the credentials provided in the keytab file. After the namenode and datanode started, I used the addprinc command to create a principal for a user that already exists as a Unix user in the hadoop group, namely 'hdfs'. Then I ran kinit (kinit hadoop), which was successful. The klist -e output shows that the encryption type is aes256-cts, as expected. But when I try a hadoop fs -ls / command, I get the error below.
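For reference, the sequence described above, plus a check of which credential cache the ticket actually landed in (a sketch; the principal name hdfs follows the question):

# Create the principal and obtain a TGT
kadmin.local -q "addprinc hdfs"
kinit hdfs
# The first line of klist output names the ticket cache in use;
# -e also prints the encryption type of each ticket
klist -e
# This is the command that then fails with the error below
hadoop fs -ls /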

Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
KinitOptions cache name is /tmp/krb5cc_1001
15/06/26 13:20:18 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "/"; destination host is: "":9000;
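To get more detail than the SaslException alone, JDK Kerberos tracing can be turned on for the client (a sketch; -Dsun.security.krb5.debug=true is a standard JDK system property, and HADOOP_OPTS is passed to the client JVM):

# Print the JDK's Kerberos negotiation steps for the next command
export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /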

Help needed please.

There are 2 answers

Kumar

It seems Hadoop reads the ticket from Java's default cache location, but you might have created the ticket in some other location using kinit.

Don't specify any cache location; just get the ticket using the kinit command and then try again.
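A minimal sketch of that suggestion, assuming the default cache then resolves to a file under /tmp:

# Make sure no custom cache location is in effect
unset KRB5CCNAME
kinit hdfs
# The cache reported here should match what Hadoop looks for (e.g. /tmp/krb5cc_1001)
klist
hadoop fs -ls /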

Bolke de Bruin

The reason for the error is already in the message: your configuration sets default_ccache_name = KEYRING:persistent:%{uid}, which stores credentials in the kernel keyring on Linux. Java is not able to read this keyring, so you get the error.

You will need to set this to something like:

default_ccache_name = /tmp/krb5cc_%{uid}

or override it with the KRB5CCNAME environment variable.
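A quick sketch of the override (the uid 1001 matches the cache name in the question's log):

# Point both kinit and the Hadoop client at a file-based cache
export KRB5CCNAME=FILE:/tmp/krb5cc_1001
kinit hdfs
hadoop fs -ls /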