Closed: ervinyang closed this issue 7 years ago.
@jbaiera could you take a look at this please?
Dear @jbaiera @clintongormley, have you fixed the bug? Or should I provide something more for you to solve it?
@ervinyang Could you provide some information about how you have HDFS set up? (distribution, version, security on/off)
Thanks!
@jbaiera
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
grant {
// Allow connecting to the internet anywhere
permission java.net.SocketPermission "*", "connect,resolve";
// Basic permissions needed for Lucene to work:
permission java.util.PropertyPermission "*", "read,write";
permission java.lang.reflect.ReflectPermission "*";
permission java.lang.RuntimePermission "*";
// These two have to be spelled out a separate
permission java.lang.management.ManagementPermission "control";
permission java.lang.management.ManagementPermission "monitor";
// Solr needs those:
permission java.net.NetPermission "*";
permission java.sql.SQLPermission "*";
permission java.util.logging.LoggingPermission "control";
permission javax.management.MBeanPermission "*", "*";
permission javax.management.MBeanServerPermission "*";
permission javax.management.MBeanTrustPermission "*";
permission javax.security.auth.AuthPermission "*";
permission javax.security.auth.PrivateCredentialPermission "org.apache.hadoop.security.Credentials * \"*\"", "read";
permission java.security.SecurityPermission "putProviderProperty.SaslPlainServer";
permission java.security.SecurityPermission "insertProvider.SaslPlainServer";
permission javax.xml.bind.JAXBPermission "setDatatypeConverter";
// TIKA uses BouncyCastle and that registers new provider for PDF parsing + MSOffice parsing. Maybe report as bug!
permission java.security.SecurityPermission "putProviderProperty.BC";
permission java.security.SecurityPermission "insertProvider.BC";
// Needed for some things in DNS caching in the JVM
permission java.security.SecurityPermission "getProperty.networkaddress.cache.ttl";
permission java.security.SecurityPermission "getProperty.networkaddress.cache.negative.ttl";
// SSL related properties for Solr tests
permission java.security.SecurityPermission "getProperty.ssl.*";
};
Thanks!
I'm facing the same problem. If I grant all permissions to the plugin, it works. So it must be happening because of missing grant permissions (for org.apache.hadoop.security.Credentials)?
@mrauter What do you mean by "If I grant all permissions to the plugin"? Could you paste the plugin-security.policy file? We are hitting this too.
@tangfl permission java.security.AllPermission;
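(For anyone following along: granting everything means a plugin-security.policy that is essentially just the block below. This is only a debugging workaround and an assumption about the file layout on my part; it disables the security sandbox for the plugin entirely, so don't leave it in place in production.)

grant {
// WARNING: grants every permission to the plugin's code; useful only to confirm
// that the failure is permission-related.
permission java.security.AllPermission;
};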
Hi everyone,
Same problem:
sudo hubicfuse /mnt/hubic -o noauto_cache,sync_read,allow_other,uid=XXX,gid=XXX,nonempty
)./mnt/hubic/...
but when it ends, it's impossible to list the snapshots or take a new one.
Curl:
curl -XPUT 'http://XXX.XXX.XXX.XXX:9200/_snapshot/sauvegarde/all?pretty'
Answer (note: "Repérage non permis" is French for "Illegal seek"):
{ "error" : { "root_cause" : [ { "type" : "repository_exception", "reason" : "[sauvegarde] could not read repository data from index blob" } ], "type" : "repository_exception", "reason" : "[sauvegarde] could not read repository data from index blob", "caused_by" : { "type" : "i_o_exception", "reason" : "Repérage non permis" } }, "status" : 500 }
Log:
[2016-12-28T11:30:50,215][WARN ][r.suppressed ] path: /_snapshot/sauvegarde/all, params: {pretty=, repository=sauvegarde, snapshot=all}
org.elasticsearch.transport.RemoteTransportException: [XX-XXXXXXX][XXX.XXX.XXX.XXX:9300][cluster:admin/snapshot/create]
Caused by: org.elasticsearch.repositories.RepositoryException: [sauvegarde] could not read repository data from index blob
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:751) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.snapshots.SnapshotsService.createSnapshot(SnapshotsService.java:226) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:82) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:41) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:86) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$3.doRun(TransportMasterNodeAction.java:170) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:527) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-5.1.1.jar:5.1.1]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_111]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_111]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_111]
Caused by: java.io.IOException: Repérage non permis
    at sun.nio.ch.FileChannelImpl.position0(Native Method) ~[?:?]
    at sun.nio.ch.FileChannelImpl.position(FileChannelImpl.java:263) ~[?:?]
    at sun.nio.ch.ChannelInputStream.available(ChannelInputStream.java:116) ~[?:?]
    at java.io.BufferedInputStream.read(BufferedInputStream.java:353) ~[?:1.8.0_111]
    at java.io.FilterInputStream.read(FilterInputStream.java:107) ~[?:1.8.0_111]
    at org.elasticsearch.common.io.Streams.copy(Streams.java:76) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.common.io.Streams.copy(Streams.java:57) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:737) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.snapshots.SnapshotsService.createSnapshot(SnapshotsService.java:226) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:82) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.admin.cluster.snapshots.create.TransportCreateSnapshotAction.masterOperation(TransportCreateSnapshotAction.java:41) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:86) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$3.doRun(TransportMasterNodeAction.java:170) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:527) ~[elasticsearch-5.1.1.jar:5.1.1]
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-5.1.1.jar:5.1.1]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_111]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_111]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_111]
I don't know how to produce the plugin-security.policy extract.
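(In case it helps: the policy file ships inside the plugin directory itself, so something along these lines should print it. The path below is an assumption based on a default archive install; package installs typically keep plugins under /usr/share/elasticsearch/plugins instead.)

# print the policy bundled with the repository-hdfs plugin (adjust the Elasticsearch home path)
cat /path/to/elasticsearch/plugins/repository-hdfs/plugin-security.policy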
@mrauter Setting "permission java.security.AllPermission;" also throws the exception.
Hi everyone, I've managed to reproduce the same error when trying to create a snapshot on HDFS from Elasticsearch. Tried with ES 5.1.1 and repository-hdfs installed through elasticsearch-plugin on CentOS 7, OpenJDK 64-Bit Server VM (build 25.111-b15, mixed mode). It worked the first time and I was able to create a first snapshot. Once done, I couldn't access it or create any other new snapshot, and the error logs were just the same every time.
Caused by: java.security.AccessControlException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read") at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472) ~[?:1.8.0_111]
I tried setting ALL permissions in the Java policy, but it seems as if it doesn't read the config, or just ignores it.
If you need any more info or tests, I'm happy to help. Regards
same problem
so bad
@netmanito Can you please paste the entire stack trace you see in the logs?
Hi, for the following GET request, GET _snapshot/hdfs_repository/syslog_test, I get the following message:
[2017-03-01T08:23:44,286][WARN ][r.suppressed ] path: /_snapshot/hdfs_repository/syslog_test, params: {repository=hdfs_repository, snapshot=syslog_test}
org.elasticsearch.repositories.RepositoryException: [hdfs_repository] could not read repository data from index blob
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:796) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.snapshots.SnapshotsService.getRepositoryData(SnapshotsService.java:142) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.action.admin.cluster.snapshots.get.TransportGetSnapshotsAction.masterOperation(TransportGetSnapshotsAction.java:91) [elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.action.admin.cluster.snapshots.get.TransportGetSnapshotsAction.masterOperation(TransportGetSnapshotsAction.java:50) [elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.action.support.master.TransportMasterNodeAction.masterOperation(TransportMasterNodeAction.java:87) [elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$2.doRun(TransportMasterNodeAction.java:167) [elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:596) [elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-5.2.1.jar:5.2.1]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_111]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_111]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_111]
Caused by: java.io.IOException: com.google.protobuf.ServiceException: java.security.AccessControlException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read")
    at org.apache.hadoop.ipc.ProtobufHelper.getRemoteException(ProtobufHelper.java:47) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:580) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
    at com.sun.proxy.$Proxy34.getListing(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2077) ~[?:?]
    at org.apache.hadoop.fs.Hdfs.listStatus(Hdfs.java:254) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1798) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1794) ~[?:?]
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1800) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1759) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1718) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:145) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:142) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobStore$4.run(HdfsBlobStore.java:136) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_111]
    at java.security.AccessController.doPrivileged(AccessController.java:713) ~[?:1.8.0_111]
    at org.elasticsearch.repositories.hdfs.HdfsBlobStore.execute(HdfsBlobStore.java:133) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer.listBlobsByPrefix(HdfsBlobContainer.java:142) ~[?:?]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.listBlobsToGetLatestIndexId(BlobStoreRepository.java:917) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.latestIndexBlobId(BlobStoreRepository.java:900) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:753) ~[elasticsearch-5.2.1.jar:5.2.1]
    ... 10 more
Caused by: com.google.protobuf.ServiceException: java.security.AccessControlException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read")
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:243) ~[?:?]
    at com.sun.proxy.$Proxy33.getListing(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:573) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
    at com.sun.proxy.$Proxy34.getListing(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2077) ~[?:?]
    at org.apache.hadoop.fs.Hdfs.listStatus(Hdfs.java:254) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1798) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1794) ~[?:?]
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1800) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1759) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1718) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:145) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:142) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobStore$4.run(HdfsBlobStore.java:136) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_111]
    at java.security.AccessController.doPrivileged(AccessController.java:713) ~[?:1.8.0_111]
    at org.elasticsearch.repositories.hdfs.HdfsBlobStore.execute(HdfsBlobStore.java:133) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer.listBlobsByPrefix(HdfsBlobContainer.java:142) ~[?:?]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.listBlobsToGetLatestIndexId(BlobStoreRepository.java:917) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.latestIndexBlobId(BlobStoreRepository.java:900) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:753) ~[elasticsearch-5.2.1.jar:5.2.1]
    ... 10 more
Caused by: java.security.AccessControlException: access denied ("javax.security.auth.PrivateCredentialPermission" "org.apache.hadoop.security.Credentials" "read")
    at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472) ~[?:1.8.0_111]
    at java.security.AccessController.checkPermission(AccessController.java:884) ~[?:1.8.0_111]
    at java.lang.SecurityManager.checkPermission(SecurityManager.java:549) ~[?:1.8.0_111]
    at javax.security.auth.Subject$ClassSet.populateSet(Subject.java:1414) ~[?:1.8.0_111]
    at javax.security.auth.Subject$ClassSet.<init>(Subject.java:1372) ~[?:1.8.0_111]
    at javax.security.auth.Subject.getPrivateCredentials(Subject.java:767) ~[?:1.8.0_111]
    at org.apache.hadoop.security.UserGroupInformation.getCredentialsInternal(UserGroupInformation.java:1499) ~[?:?]
    at org.apache.hadoop.security.UserGroupInformation.getTokens(UserGroupInformation.java:1464) ~[?:?]
    at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:436) ~[?:?]
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1519) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1446) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1407) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[?:?]
    at com.sun.proxy.$Proxy33.getListing(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:573) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
    at com.sun.proxy.$Proxy34.getListing(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2077) ~[?:?]
    at org.apache.hadoop.fs.Hdfs.listStatus(Hdfs.java:254) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1798) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1794) ~[?:?]
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1800) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1759) ~[?:?]
    at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1718) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:145) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer$6.run(HdfsBlobContainer.java:142) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobStore$4.run(HdfsBlobStore.java:136) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_111]
    at java.security.AccessController.doPrivileged(AccessController.java:713) ~[?:1.8.0_111]
    at org.elasticsearch.repositories.hdfs.HdfsBlobStore.execute(HdfsBlobStore.java:133) ~[?:?]
    at org.elasticsearch.repositories.hdfs.HdfsBlobContainer.listBlobsByPrefix(HdfsBlobContainer.java:142) ~[?:?]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.listBlobsToGetLatestIndexId(BlobStoreRepository.java:917) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.latestIndexBlobId(BlobStoreRepository.java:900) ~[elasticsearch-5.2.1.jar:5.2.1]
    at org.elasticsearch.repositories.blobstore.BlobStoreRepository.getRepositoryData(BlobStoreRepository.java:753) ~[elasticsearch-5.2.1.jar:5.2.1]
    ... 10 more
Also, if I restart any node, there's a connection error on startup, although connectivity is correct. Here's the pastebin link: http://pastebin.com/GW8TDymK
Regards
I resolved this problem by modifying the plugin source.
plugin-security.policy:
grant {
// Hadoop UserGroupInformation, HdfsConstants, PipelineAck clinit
permission java.lang.RuntimePermission "getClassLoader";
// UserGroupInformation (UGI) Metrics clinit
permission java.lang.RuntimePermission "accessDeclaredMembers";
permission java.lang.reflect.ReflectPermission "suppressAccessChecks";
// org.apache.hadoop.util.StringUtils clinit
permission java.util.PropertyPermission "*", "read,write";
// org.apache.hadoop.util.ShutdownHookManager clinit
permission java.lang.RuntimePermission "shutdownHooks";
// JAAS is used always, we use a fake subject, hurts nobody
permission javax.security.auth.AuthPermission "getSubject";
permission javax.security.auth.AuthPermission "doAs";
permission javax.security.auth.AuthPermission "modifyPrivateCredentials";
permission java.lang.RuntimePermission "accessDeclaredMembers";
permission java.lang.RuntimePermission "getClassLoader";
permission java.lang.RuntimePermission "shutdownHooks";
permission java.lang.reflect.ReflectPermission "suppressAccessChecks";
permission javax.security.auth.AuthPermission "doAs";
permission javax.security.auth.AuthPermission "getSubject";
permission javax.security.auth.AuthPermission "modifyPrivateCredentials";
permission java.util.PropertyPermission "*", "read,write";
permission javax.security.auth.PrivateCredentialPermission "org.apache.hadoop.security.Credentials * \"*\"", "read";
};
In HdfsBlobStore.java, remove new ReflectPermission("suppressAccessChecks"), new AuthPermission("modifyPrivateCredentials"), and new SocketPermission("*", "connect") from the doPrivileged call:
<V> V execute(Operation<V> operation) throws IOException {
    SecurityManager sm = System.getSecurityManager();
    if (sm != null) {
        // unprivileged code such as scripts do not have SpecialPermission
        sm.checkPermission(new SpecialPermission());
    }
    if (closed) {
        throw new AlreadyClosedException("HdfsBlobStore is closed: " + this);
    }
    try {
        return AccessController.doPrivileged(new PrivilegedExceptionAction<V>() {
            @Override
            public V run() throws IOException {
                return operation.run(fileContext);
            }
        });
    } catch (PrivilegedActionException pae) {
        throw (IOException) pae.getException();
    }
}
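(For context, a rough sketch of how this wrapper is used. The Operation interface below is inferred from the operation.run(fileContext) call above, and the call site is illustrative only, not copied from the plugin.)

// Inferred callback shape:
interface Operation<V> {
    V run(org.apache.hadoop.fs.FileContext fileContext) throws java.io.IOException;
}

// Illustrative call site: every HDFS access funnels through execute(), so the
// doPrivileged block above (together with the permissions granted to the plugin)
// decides whether a listing like this succeeds or hits the AccessControlException
// reported in this thread.
org.apache.hadoop.fs.FileStatus[] statuses = execute(new Operation<org.apache.hadoop.fs.FileStatus[]>() {
    @Override
    public org.apache.hadoop.fs.FileStatus[] run(org.apache.hadoop.fs.FileContext fileContext) throws java.io.IOException {
        return fileContext.util().listStatus(new org.apache.hadoop.fs.Path("/some/repository/path"));
    }
});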
I have solved it by adding a Java Security Manager setting in jvm.options and modifying plugin-security.policy:
grant {
// Hadoop UserGroupInformation, HdfsConstants, PipelineAck clinit
permission java.lang.RuntimePermission "getClassLoader";
// UserGroupInformation (UGI) Metrics clinit
permission java.lang.RuntimePermission "accessDeclaredMembers";
permission java.lang.reflect.ReflectPermission "suppressAccessChecks";
// org.apache.hadoop.util.StringUtils clinit
permission java.util.PropertyPermission "*", "read,write";
// org.apache.hadoop.util.ShutdownHookManager clinit
permission java.lang.RuntimePermission "shutdownHooks";
// JAAS is used always, we use a fake subject, hurts nobody
permission javax.security.auth.AuthPermission "getSubject";
permission javax.security.auth.AuthPermission "doAs";
permission javax.security.auth.AuthPermission "modifyPrivateCredentials";
permission java.lang.RuntimePermission "accessDeclaredMembers";
permission java.lang.RuntimePermission "getClassLoader";
permission java.lang.RuntimePermission "shutdownHooks";
permission java.lang.reflect.ReflectPermission "suppressAccessChecks";
permission javax.security.auth.AuthPermission "doAs";
permission javax.security.auth.AuthPermission "getSubject";
permission javax.security.auth.AuthPermission "modifyPrivateCredentials";
permission java.security.AllPermission;
permission java.util.PropertyPermission "*", "read,write";
permission javax.security.auth.PrivateCredentialPermission "org.apache.hadoop.security.Credentials * \"*\"", "read";
};
My policy file path is data/soft/elasticsearch-5.0.1/plugins/repository-hdfs/plugin-security.policy
so I added -Djava.security.policy=file:///data/soft/elasticsearch-5.0.1/plugins/repository-hdfs/plugin-security.policy
in "/data/soft/elasticsearch-5.0.1/config/jvm.options"
Then restart Elasticsearch and run the command:
curl -XPUT http://localhost:9200/_snapshot/my_hdfs_repository/snapshot_1?wait_for_completion=true
the result:
{"snapshot":{"snapshot":"snapshot_1","uuid":"SprY4aHXTE6crhi5duJGAQ","version_id":5000199,"version":"5.0.1","indices":["ttst","test"],"state":"SUCCESS","start_time":"2017-03-16T07:23:54.568Z","start_time_in_millis":1489649034568,"end_time":"2017-03-16T07:24:03.961Z","end_time_in_millis":1489649043961,"duration_in_millis":9393,"failures":[],"shards":{"total":10,"failed":0,"successful":10}}}
It's done!
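(If anyone wants to verify the workaround afterwards, the snapshots in the repository can be listed with the standard API, for example:)

curl -XGET 'http://localhost:9200/_snapshot/my_hdfs_repository/_all?pretty'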
@YDHui You have included permission java.security.AllPermission;
which is a security issue (it grants everything) and your other permissions are redundant.
Any update on this? I got the same problem.
There's an open PR for it: #23439. This is not a simple issue.
Not sure if it's helpful at this point, but if you need an easy way to reproduce this problem, I ran into this right away with an out-of-the-box hadoop docker image - in fact, all I was looking to do was to give the HDFS plugin a quick test drive.
Now that Elasticsearch v5.4.0 is out, #23439 seems not to have helped. How can I reproduce this?
@MrGarry2016 this is fixed in 5.4.1
@clintongormley We want to install 5.4.1 version of the plugin. How can we install that specific version of the hdfs plugin to the 5.4.0 running ES cluster?
you have to wait until it is released
@adkhare Also, you simply can't install version 5.4.1 of the plugin (when it is released) on a 5.4.0 node.
Is there a way to track when this is released?? If I subscribe to this thread would that be sufficient??
@326TimesBetter Yes; subscribing to this thread is not sufficient, but you can track releases on the Elastic website.
@YDHui I used your solution and it worked, except I didn't have to set "permission java.security.AllPermission;" in the plugin-security.policy, thereby not compromising the entire set of security definitions. Thanks.
By the way, my system configuration is: OS CentOS 7.3.1, Docker 17.05.0-ce, ES 5.4.1, Hadoop/HDFS 2.8.
N.B. I wonder why the plugin-security.policy file was not detected by default; the JAVA_OPTS entry in the jvm.options file did the trick,
i.e. the line -Djava.security.policy=file:///path/to/plugins/repository-hdfs/plugin-security.policy
Elasticsearch version: 5.0.1
Plugins installed:
repository-hdfs
JVM version:
OS version:
Description of the problem including expected versus actual behavior: when I create the repository, ES responds,
but when I create a snapshot of an index, it throws an exception:
Steps to reproduce:
1. Create the repository (see the sketch after this list)
2. Snapshot my index
3. An exception is thrown
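(For completeness, step 1 would look roughly like the request below. The repository name, namenode URI and HDFS path are placeholders, not values taken from this report.)

curl -XPUT 'http://localhost:9200/_snapshot/my_hdfs_repository' -H 'Content-Type: application/json' -d '
{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://namenode:8020",
    "path": "/elasticsearch/repositories/my_hdfs_repository"
  }
}'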