Then I connected to the worker for remote debugging and discovered that the authMethod of the user obtained in `org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport#addHttpAuth` is `SIMPLE`, so `user.getKerberosPrincipal()` returns null. The relevant code:
```java
if (userProvider != null) {
    User user = userProvider.getUser();
    // Add ApiKey Authentication if a key is present
    if (log.isDebugEnabled()) {
        log.debug("checking for token using cluster name [" + clusterName + "]");
    }
    if (user.getEsToken(clusterName) != null) {
        HttpState state = (authSettings[1] != null ? (HttpState) authSettings[1] : new HttpState());
        authSettings[1] = state;
        // TODO: Limit this by hosts and ports
        AuthScope scope = new AuthScope(AuthScope.ANY_HOST, AuthScope.ANY_PORT, AuthScope.ANY_REALM, EsHadoopAuthPolicies.APIKEY);
        Credentials tokenCredentials = new EsApiKeyCredentials(userProvider, clusterName);
        state.setCredentials(scope, tokenCredentials);
        if (log.isDebugEnabled()) {
            log.debug("Using detected Token credentials...");
        }
        EsHadoopAuthPolicies.registerAuthSchemes();
        authPrefs.add(EsHadoopAuthPolicies.APIKEY);
    } else if (userProvider.isEsKerberosEnabled()) {
        // Add SPNEGO auth if a kerberos principal exists on the user and the elastic principal is set
        // Only do this if a token does not exist on the current user.
        // The auth mode may say that it is Kerberos, but the client
        // could be running in a remote JVM that does not have the
        // Kerberos credentials available.
        if (!StringUtils.hasText(settings.getNetworkSpnegoAuthElasticsearchPrincipal())) {
            throw new EsHadoopIllegalArgumentException("Missing Elasticsearch Kerberos Principal name. " +
                    "Specify one with [" + ConfigurationOptions.ES_NET_SPNEGO_AUTH_ELASTICSEARCH_PRINCIPAL + "]");
        }
        // Pick the appropriate user provider to get credentials from for SPNEGO auth
        UserProvider credentialUserProvider;
        if (user.isProxyUser()) {
            // If the user is a proxy user, get a provider for the real
            // user and capture the proxy user's name to impersonate
            proxyUserProvider = user.getRealUserProvider();
            runAsUser = user.getUserName();
            // Ensure that this real user even has Kerberos Creds:
            User realUser = proxyUserProvider.getUser();
            KerberosPrincipal realPrincipal = realUser.getKerberosPrincipal();
            if (realPrincipal == null) {
                throw new EsHadoopIllegalArgumentException("Could not locate Kerberos Principal on real user [" +
                        realUser.getUserName() + "] underneath proxy user [" + runAsUser + "]");
            }
            if (log.isDebugEnabled()) {
                log.debug("Using detected SPNEGO credentials for real user [" + realUser.getUserName() + "] to proxy as [" +
                        runAsUser + "]...");
            }
            credentialUserProvider = proxyUserProvider;
        } else if (user.getKerberosPrincipal() != null) {
            // Ensure that the user principal exists
            if (log.isDebugEnabled()) {
                log.debug("Using detected SPNEGO credentials for user [" + user.getUserName() + "]...");
            }
            credentialUserProvider = userProvider;
        } else {
            throw new EsHadoopIllegalArgumentException("Could not locate Kerberos Principal on currently logged in user.");
        }
```
I found something useful: `org.apache.spark.executor.CoarseGrainedExecutorBackend#run` calls `SparkHadoopUtil.get.runAsSparkUser` to run the job. The SparkHadoopUtil code is as follows:
```scala
def runAsSparkUser(func: () => Unit): Unit = {
  createSparkUser().doAs(new PrivilegedExceptionAction[Unit] {
    def run: Unit = func()
  })
}

def createSparkUser(): UserGroupInformation = {
  val user = Utils.getCurrentUserName()
  logDebug("creating UGI for user: " + user)
  val ugi = UserGroupInformation.createRemoteUser(user)
  transferCredentials(UserGroupInformation.getCurrentUser(), ugi)
  ugi
}
```
This ultimately calls `UserGroupInformation.createRemoteUser(user)`, which defaults to `AuthMethod.SIMPLE`:
```java
public static UserGroupInformation createRemoteUser(String user) {
    return createRemoteUser(user, AuthMethod.SIMPLE);
}
```
I have got the same error.
#1607