[K8s] beeline connection failure with Kerberos authentication - LOOKING_UP_SERVER error
정선생 2024. 4. 30. 16:32
I had authenticated with Kerberos via kinit, and I could even list HDFS files with hadoop's ls command...
Strangely, though, calling beeline from inside a Kubernetes pod produced the error below.
The exact same procedure worked fine on a physical server; the failure occurred only inside a pod running on Kubernetes.
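Since the same command worked on a physical server, a quick first check is to compare how the HiveServer2 host resolves in each environment. A minimal sketch (my-hadoop-002.com is the example hostname used in this post, not a real one):

```shell
#!/bin/sh
# resolve_first: print the first address getent reports for a name,
# or nothing at all if the name does not resolve.
resolve_first() {
  getent hosts "$1" | awk 'NR == 1 { print $1 }'
}

# Run this both on the physical server and inside the pod, then compare.
resolve_first my-hadoop-002.com
```

If the two environments print different addresses, or the pod prints nothing, the pod's view of DNS/hosts differs from the server's, which is exactly the kind of mismatch Kerberos is sensitive to.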
$ beeline -u "jdbc:hive2://my-hadoop-002.com:10000/default;principal=hive/_HOST@MYHOME.COM" -e 'show databases'
...
24/04/30 10:52:52 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_402]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:96) ~[hive-exec-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:238) ~[hive-exec-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:39) ~[hive-exec-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:51) ~[hive-exec-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:48) ~[hive-exec-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_402]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_402]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899) ~[hadoop-common-3.3.3.3.3.0.0-1.jar:?]
at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport.open(TUGIAssumingTransport.java:48) ~[hive-exec-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:512) ~[hive-jdbc-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:382) ~[hive-jdbc-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:285) ~[hive-jdbc-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:94) ~[hive-jdbc-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at java.sql.DriverManager.getConnection(DriverManager.java:664) ~[?:1.8.0_402]
at java.sql.DriverManager.getConnection(DriverManager.java:208) ~[?:1.8.0_402]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.Commands.connect(Commands.java:1679) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.Commands.connect(Commands.java:1573) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_402]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_402]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_402]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_402]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:57) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1480) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1517) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.connectUsingArgs(BeeLine.java:930) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:812) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1123) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1097) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:555) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:537) ~[hive-beeline-4.0.0.3.3.0.0-2.jar:4.0.0.3.3.0.0-2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_402]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_402]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_402]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_402]
at org.apache.hadoop.util.RunJar.run(RunJar.java:328) ~[hadoop-common-3.3.3.3.3.0.0-1.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:241) ~[hadoop-common-3.3.3.3.3.0.0-1.jar:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:772) ~[?:1.8.0_402]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_402]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_402]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_402]
... 38 more
Caused by: sun.security.krb5.KrbException: Server not found in Kerberos database (7) - LOOKING_UP_SERVER
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73) ~[?:1.8.0_402]
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:226) ~[?:1.8.0_402]
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:237) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.serviceCredsSingle(CredentialsUtil.java:477) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:340) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:314) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:169) ~[?:1.8.0_402]
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:490) ~[?:1.8.0_402]
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:695) ~[?:1.8.0_402]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_402]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_402]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_402]
... 38 more
Caused by: sun.security.krb5.Asn1Exception: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140) ~[?:1.8.0_402]
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65) ~[?:1.8.0_402]
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60) ~[?:1.8.0_402]
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55) ~[?:1.8.0_402]
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:226) ~[?:1.8.0_402]
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:237) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.serviceCredsSingle(CredentialsUtil.java:477) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:340) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:314) ~[?:1.8.0_402]
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:169) ~[?:1.8.0_402]
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:490) ~[?:1.8.0_402]
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:695) ~[?:1.8.0_402]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_402]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_402]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_402]
... 38 more
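A plausible reading of this failure: the `_HOST` placeholder in the JDBC principal is replaced with the canonical hostname the client resolves for the server, and the client then asks the KDC for a service ticket for that exact SPN. If name resolution inside the pod yields a name the KDC has no `hive/...` principal registered for, the KDC answers "Server not found in Kerberos database (7)" - LOOKING_UP_SERVER. Conceptually, the substitution looks like this (a simplification for illustration; the real resolution happens inside the Java security code, not via a shell):

```shell
#!/bin/sh
# Sketch of the _HOST substitution in "hive/_HOST@MYHOME.COM".
# MYHOME.COM is the example realm from this post.
spn_for_host() {
  printf 'hive/%s@MYHOME.COM\n' "$1"
}

spn_for_host my-hadoop-002.com   # -> hive/my-hadoop-002.com@MYHOME.COM
```

So the client-side fix is to make sure the pod resolves the HiveServer2 host to the same canonical name the KDC knows about.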
Solution
To cut to the chase: listing the Hadoop master nodes' domain names and IPs in /etc/hosts fixed it.
To add entries to a pod's /etc/hosts on Kubernetes, declare them under hostAliases in the Pod spec.
For example:
apiVersion: v1
kind: Pod
metadata:
  name: hadoop-test-pod
spec:
  hostAliases:
    - ip: "10.10.10.1"
      hostnames:
        - "my-hadoop-001"
        - "my-hadoop-001.com"
    - ip: "10.10.10.2"
      hostnames:
        - "my-hadoop-002"
        - "my-hadoop-002.com"
    - ip: "10.10.10.3"
      hostnames:
        - "my-hadoop-003"
        - "my-hadoop-003.com"
  containers:
    - name: hadoop-test
      image: private.dockerreg.com/hadoop:latest
      env:
        - name: KERBEROS_KEYTAB
          value: /opt/airflow/user.keytab
        - name: KERBEROS_PRINCIPLE
          value: user@MYHOME.COM
      volumeMounts:
        - mountPath: /opt/airflow/dags
          name: my-dags
        - mountPath: /opt/airflow/logs
          name: logs
        - mountPath: /opt/airflow/keytab
          name: keytab
  imagePullSecrets:
    - name: my-regcred
  volumes:
    - emptyDir: {}
      name: my-dags
    - emptyDir: {}
      name: logs
    - name: keytab
      secret:
        secretName: secret-keytab-files
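Once the pod is recreated with these hostAliases, you can confirm they took effect with `kubectl exec <pod> -- cat /etc/hosts`. A small helper along the same lines (a sketch; the hostname is the example used above):

```shell
#!/bin/sh
# check_hosts_entry: succeed if the given hostname appears as an alias
# in an /etc/hosts-style file (defaults to /etc/hosts).
check_hosts_entry() {
  name="$1"
  file="${2:-/etc/hosts}"
  awk -v h="$name" '
    /^[^#]/ { for (i = 2; i <= NF; i++) if ($i == h) found = 1 }
    END     { exit !found }
  ' "$file"
}

check_hosts_entry my-hadoop-002.com && echo "alias present" || echo "alias missing"
```

Run inside the pod, this should report "alias present" for every host listed in hostAliases; if not, the aliases were not applied (for example, because the pod was not recreated after editing the manifest).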