What is the procedure to connect to Hive using Teradata Studio 16.02 with Kerberos Authentication?


I tried to set up the connection and it fails with the error below:

java.lang.Exception: Can't get Kerberos realm
	at com.teradata.datatools.hadoop.hortonworks.connectivity.HortonworksConnection.openJdbcConnection(HortonworksConnection.java:329)
	at com.teradata.datatools.hadoop.hortonworks.connectivity.HortonworksConnection.createConnection(HortonworksConnection.java:223)
	at org.eclipse.datatools.connectivity.DriverConnectionBase.internalCreateConnection(DriverConnectionBase.java:105)
	at org.eclipse.datatools.connectivity.DriverConnectionBase.open(DriverConnectionBase.java:54)
	at com.teradata.datatools.hadoop.hortonworks.connectivity.HortonworksConnection.open(HortonworksConnection.java:162)
	at com.teradata.datatools.hadoop.hortonworks.connectivity.HortonworksPingFactory.createJdbcConnection(HortonworksPingFactory.java:44)
	at com.teradata.datatools.hadoop.hortonworks.connectivity.PingJdbcJob.createTestConnection(PingJdbcJob.java:32)
	at com.teradata.datatools.hadoop.hive.connectivity.PingJob.run(PingJob.java:42)
	at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)

Here are the steps I performed:

1. I provided the REALM in the Kerberos section for one of my Hive connection profiles.

2. In the JDBC section, I provided the JDBC host, port, and database.

3. I tried with and without a username and password.

It gives the above error.

I also tried kinit at the OS level and still get the same error.

Question: Do I have to perform kinit at the OS level, or does Studio take care of it? (I ask because Studio requests REALM info in the Kerberos section, which is a bit confusing to me.)

 

If there is any good documentation available, please point me to it so I can try it. Also, is this new feature working, or is it a bug?

Is there any parameter or log file, or can I enable debugging, to see what is causing the problem so I can try to fix it?

 

I am using Studio on OS X.

6 REPLIES

Teradata Employee

Re: What is the procedure to connect to Hive using Teradata Studio 16.02 with Kerberos Authentication?

Question: Do I have to perform kinit at the OS level, or does Studio take care of it? (I ask because Studio requests REALM info in the Kerberos section, which is a bit confusing to me.)

- Yes, you need to perform a kinit at the OS level. Studio needs to grab the Kerberos ticket and pass those credentials to the JDBC driver in order to make the connection to the Hadoop cluster. As for why we also ask for REALM info in the Kerberos section: the Kerberos realm is required as part of the JDBC URL, so we take the Kerberos Realm input from that section and append it to the JDBC URL in the proper format.
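As an illustration of that last point, here is a minimal Java sketch of how a realm might be folded into a HiveServer2 JDBC URL. This is not Studio's actual code; the host, port, database, and service principal below are hypothetical placeholder values.

```java
public class HiveUrlExample {

    // Build a Kerberos-enabled Hive JDBC URL. HiveServer2 expects the
    // server's service principal (including the realm) in the "principal"
    // URL parameter when Kerberos authentication is used.
    static String buildHiveUrl(String host, int port, String database,
                               String servicePrincipal, String realm) {
        return "jdbc:hive2://" + host + ":" + port + "/" + database
                + ";principal=" + servicePrincipal + "@" + realm;
    }

    public static void main(String[] args) {
        // Placeholder values for illustration only.
        String url = buildHiveUrl("hdp271m1.labs.teradata.com", 10000, "default",
                "hive/hdp271m1.labs.teradata.com", "HDP271.HADOOP.TERADATA.COM");
        System.out.println(url);
    }
}
```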

 

I have not tested on a Mac Kerberos client specifically, but the following steps work on a Linux Kerberos client, and I would expect the steps to be the same.

1. First, do you have a krb5.conf file in the /etc directory? If not, please create one and add it there. Here is an example of what a krb5.conf file looks like:

[libdefaults]
  default_realm = HDP271.HADOOP.TERADATA.COM
  dns_lookup_realm = false
  dns_lookup_kdc = false
  ticket_lifetime = 24h
  forwardable = true
  udp_preference_limit = 1

[logging]
  default = FILE:/var/opt/teradata/log/kerberos/krb5libs.log
  kdc = FILE:/var/opt/teradata/log/kerberos/krb5kdc.log
  admin_server = FILE:/var/opt/teradata/log/kerberos/kadmind.log

[realms]
  HDP271.HADOOP.TERADATA.COM = {
    kdc = hdp271m1.labs.teradata.com:88
    admin_server = hdp271m1.labs.teradata.com
    default_domain = hadoop.com
    dict_file = /var/lib/dict/words
  }

[domain_realm]
  .hadoop.com = HDP271.HADOOP.TERADATA.COM
  hadoop.com = HDP271.HADOOP.TERADATA.COM

[appdefaults]
  pam = {
    debug = false
    ticket_lifetime = 36000
    renew_lifetime = 36000
    forwardable = true
    krb4_convert = false
  }
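The "Can't get Kerberos realm" exception usually means the JVM cannot determine a default realm at all. As a standalone sanity check (not part of Studio), this hedged sketch sets the realm and KDC through the java.security.krb5.* system properties, which override krb5.conf, using the example values from the krb5.conf above, and then asks the JVM to resolve a principal with no explicit realm. If no default realm can be determined, the KerberosPrincipal constructor throws IllegalArgumentException, which is the same failure mode as the error above.

```java
import javax.security.auth.kerberos.KerberosPrincipal;

public class RealmCheck {

    // Resolve the default realm the same way the Kerberos login machinery
    // does. The realm/KDC values are the example ones from the krb5.conf
    // above; both properties must be set together.
    static String defaultRealm() {
        System.setProperty("java.security.krb5.realm", "HDP271.HADOOP.TERADATA.COM");
        System.setProperty("java.security.krb5.kdc", "hdp271m1.labs.teradata.com");
        // No realm in the name, so the JVM must look up the default realm;
        // this is where "Can't get Kerberos realm" style failures surface.
        return new KerberosPrincipal("someuser").getRealm();
    }

    public static void main(String[] args) {
        System.out.println(defaultRealm());
    }
}
```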

2. Create a new file, TeraJDBC.config, somewhere on your system. It can be created with any basic text editor. Copy and paste the following into the file:

com.sun.security.jgss.initiate
{
  com.sun.security.auth.module.Krb5LoginModule sufficient useTicketCache=true doNotPrompt=true debug=true;
};
other
{
  com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true debug=true;
};

3. Add the following lines to the TeradataStudio.ini file:

-Djavax.security.auth.useSubjectCredsOnly=false
-Dsun.security.krb5.debug=true
-Djava.security.auth.login.config=PATH_TO_TeraJDBC.config

 

(For example, if you put it in your /Users/user/ directory, it would look like -Djava.security.auth.login.config=/Users/user/TeraJDBC.config.)
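To confirm the JVM actually picks up a JAAS login configuration through that flag, here is a small standalone sketch (again, not Studio code): it writes the step-2 entry to a temporary file, sets java.security.auth.login.config programmatically as the in-code equivalent of the .ini flag, and then looks the entry back up through the JAAS Configuration API.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import javax.security.auth.login.AppConfigurationEntry;
import javax.security.auth.login.Configuration;

public class JaasConfigCheck {

    // Write a TeraJDBC.config-style JAAS entry to a temp file, point the
    // login-config property at it, and return the login module name the
    // JVM parses back out of it.
    static String loadedModuleName() throws Exception {
        Path cfg = Files.createTempFile("TeraJDBC", ".config");
        Files.writeString(cfg,
            "com.sun.security.jgss.initiate {\n"
          + "  com.sun.security.auth.module.Krb5LoginModule sufficient"
          + " useTicketCache=true doNotPrompt=true debug=true;\n"
          + "};\n");

        // In-code equivalent of -Djava.security.auth.login.config=... ;
        // the default Configuration reads this property when first loaded.
        System.setProperty("java.security.auth.login.config", cfg.toString());

        AppConfigurationEntry[] entries = Configuration.getConfiguration()
                .getAppConfigurationEntry("com.sun.security.jgss.initiate");
        return entries[0].getLoginModuleName();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loadedModuleName());
    }
}
```

If the lookup returns null or throws, the JVM is not finding or parsing the login configuration, which would point at the step-2/step-3 setup rather than at the Kerberos ticket itself.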

 

Now launch Studio and try to make the JDBC Hadoop Kerberos connection again. If you are still getting the same exception, it may be a Mac-client-related issue with Hadoop that I am unaware of.

One more thing you can try to get more detailed Kerberos debug information: launch Studio from a terminal and pipe the output to a file. When you then try to make the Hadoop JDBC Kerberos connection from within Studio, the file you piped output to should contain detailed Kerberos debug information. If you do not see any detailed Kerberos debug info, that might be a sign there is an issue with Java getting the Kerberos ticket from the Mac OS. This debug info is output because of the "-Dsun.security.krb5.debug=true" flag added to TeradataStudio.ini in the step above.

 

Example of launching Studio from terminal and piping to a file:

/Applications/Teradata\ Studio.app/Contents/MacOS/TeradataStudio > a.txt

 
