For quick and easy access, you can consider using WebHCat, a REST interface for accessing HCatalog through Hive.
Let’s assume we’re on a Kerberized cluster (you wouldn’t be running an unkerberized cluster, remember…).
First, we check which port is used (the default is 50111) in the Hive webhcat-site.xml (or in the Hive configuration within the Ambari interface).
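In webhcat-site.xml, the port is controlled by the templeton.port property; a typical entry, shown here with its default value, looks like this:

```xml
<property>
  <name>templeton.port</name>
  <value>50111</value>
  <description>HTTP port for the WebHCat (Templeton) server.</description>
</property>
```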
Templeton is the former name of WebHCat, which is why the REST paths still start with /templeton.
Let’s try a curl against WebHCat to see the DDL of the default database:
[root@sandbox ~]# curl -i --negotiate -u: "http://sandbox.hortonworks.com:50111/templeton/v1/ddl/database/default"
HTTP/1.1 401 Authentication required
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT; HttpOnly
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=ISO-8859-1
Content-Length: 1328
Server: Jetty(7.6.0.v20120127)
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 401 Authentication required</title>
</head>
<body>
<h2>HTTP ERROR: 401</h2>
<p>Problem accessing /templeton/v1/ddl/database/default. Reason:
<pre> Authentication required</pre></p>
<hr /><i><small>Powered by Jetty://</small></i>
Hmmm, obviously: we have to kinit before being able to access HCatalog.
[root@sandbox ~]# kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
[root@sandbox ~]# curl -i --negotiate -u: "http://sandbox.hortonworks.com:50111/templeton/v1/ddl/database/default"
HTTP/1.1 401 Authentication required
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT; HttpOnly
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=ISO-8859-1
Content-Length: 1328
Server: Jetty(7.6.0.v20120127)
HTTP/1.1 500 Server Error
Set-Cookie: hadoop.auth="u=hdfs&p=hdfs@HORTONWORKS.COM&t=kerberos&e=1475885113041&s=p+38gIJagH2o1pTkoGK+af3a6Ks="; Path=/; Expires=Sat, 08-Oct-2016 00:05:13 GMT; HttpOnly
Content-Type: application/json
Transfer-Encoding: chunked
Server: Jetty(7.6.0.v20120127)
{"error":"User: HTTP/sandbox.hortonworks.com@HORTONWORKS.COM is not allowed to impersonate hdfs"}
This is a fairly common message: since you’re calling a REST API, your request is authenticated with a so-called SPNEGO token, which you can think of as “Kerberos for HTTP”.
Not only must you be able to authenticate with a SPNEGO token, but the HTTP service principal must also be allowed to impersonate you (meaning HTTP will make the request on behalf of your username).
Those proxyuser parameters can be found in the HDFS core-site.xml:
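On the sandbox, the relevant properties typically look like the following sketch (the groups value matches what we observe below; the hosts value is an assumption and may differ on your cluster):

```xml
<property>
  <name>hadoop.proxyuser.HTTP.groups</name>
  <!-- Only members of these groups can be impersonated by the HTTP principal -->
  <value>users</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.hosts</name>
  <!-- Hosts from which the HTTP principal may impersonate (assumed value) -->
  <value>sandbox.hortonworks.com</value>
</property>
```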
So here, we can see that HTTP can impersonate only users belonging to the users group.
[root@sandbox ~]# id hdfs
uid=505(hdfs) gid=501(hadoop) groups=501(hadoop),503(hdfs)
[root@sandbox ~]# id ambari-qa
uid=1001(ambari-qa) gid=501(hadoop) groups=501(hadoop),100(users)
That’s right: hdfs doesn’t belong to that group. However, ambari-qa does! Let’s kinit as ambari-qa.
[root@sandbox ~]# kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa
[root@sandbox ~]# curl -i --negotiate -u: "http://sandbox.hortonworks.com:50111/templeton/v1/ddl/database/default"
HTTP/1.1 401 Authentication required
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT; HttpOnly
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=ISO-8859-1
Content-Length: 1328
Server: Jetty(7.6.0.v20120127)
HTTP/1.1 200 OK
Set-Cookie: hadoop.auth="u=ambari-qa&p=ambari-qa@HORTONWORKS.COM&t=kerberos&e=1475885666292&s=/WGJZIe4BRKBoI4UmxfHUv8r7MU="; Path=/; Expires=Sat, 08-Oct-2016 00:14:26 GMT; HttpOnly
Content-Type: application/json
Transfer-Encoding: chunked
Server: Jetty(7.6.0.v20120127)
{"location":"hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse","ownerType":"ROLE","owner":"public","comment":"Default Hive database","database":"default"}
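Since the response body is plain JSON, it’s easy to consume from a script. A minimal sketch in Python, using the response captured above as input:

```python
import json

# Response body returned by WebHCat for the default database (as captured above)
body = ('{"location":"hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse",'
        '"ownerType":"ROLE","owner":"public",'
        '"comment":"Default Hive database","database":"default"}')

# Parse the JSON and pull out the fields of interest
ddl = json.loads(body)
print(ddl["database"])  # default
print(ddl["location"])  # hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse
```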
That’s it, you got your DDL!
So, what do you think?