splunk / splunk-shuttl

Splunk app for archive management, including HDFS support.
Apache License 2.0

Exceptions #134

Open bgsplunk opened 11 years ago

bgsplunk commented 11 years ago

Hi, I have configured shuttl on a standalone test indexer. I can see that the archive process is working and the data is being transferred to the local directory. However, I am encountering the following exception, which I think is occurring when the files are being transferred to S3. I've checked and made sure that the AWS ID/secret keys are valid. Any other places where I need to check? Here is the exception snippet:

Caused by: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/splunk-ad-dev%2Farchive_data%2Fcluster_name%2Fsplunk-1%2Fdev%2Fdb_1376618321_1376552276_196%2FSPLUNK_BUCKET' - ResponseCode=403, ResponseMessage=Forbidden
    at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:485)
    at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestHead(RestS3Service.java:652)
    at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectImpl(RestS3Service.java:1556)
    at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectDetailsImpl(RestS3Service.java:1492)
    at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1793)
    at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1225)
    at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:103)
    ... 64 more
Caused by: org.jets3t.service.impl.rest.HttpException
    at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:483)
    ... 70 more
2013-08-16 18:45:22,144 ERROR com.splunk.shuttl.archiver.archive.ArchiveRestHandler: did="Sent an archive bucket reuqest" happened="Got non ok http_status" expected="expected HttpStatus.SC_OK or SC_NO_CONTENT" http_status="500" bucket_name="db_1376618321_1376552276_196"

bgsplunk commented 11 years ago

Got bit by the / in the secret key. I thought this issue had been fixed. Anyway, I generated and used a secret key that did not have a / in it, and I've gotten past this error but on to others (incremental progress :)

petterik commented 11 years ago

@bgsplunk: indeed, the / in the secret key should be fixed. Which version were you using? If you still have the secret key that failed, could you share it with me, slightly modified maybe? :)

petterik commented 11 years ago

Seems like the slash (/) bug is about to be patched in Hadoop: https://issues.apache.org/jira/browse/HADOOP-3733
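For anyone hitting this later: the HADOOP-3733 issue stems from credentials being embedded in the `s3n://ACCESS_KEY:SECRET_KEY@bucket/path` URI, where an unencoded `/` in the secret shifts the path boundary. A minimal Python sketch (hypothetical credentials, not shuttl's actual code) shows the corruption and the percent-encoding workaround:

```python
from urllib.parse import quote, urlparse

# Hypothetical credentials; the '/' in the secret is the problem character.
access_key = "AKIAEXAMPLE"
secret_key = "abc/def+ghi"

# Naive embedding, as s3n:// URIs did: the first '/' ends the authority
# section, so the secret is truncated and the bucket lands in the path.
naive = urlparse(f"s3n://{access_key}:{secret_key}@bucket/path")
print(naive.netloc)    # 'AKIAEXAMPLE:abc' -- secret cut off at the slash
print(naive.password)  # None -- no '@' left in the authority section

# Percent-encoding the secret (the spirit of the HADOOP-3733 fix)
# keeps the URI well-formed, so host and password parse correctly.
safe = urlparse(f"s3n://{access_key}:{quote(secret_key, safe='')}@bucket/path")
print(safe.hostname)   # 'bucket'
print(safe.password)   # 'abc%2Fdef%2Bghi'
```

The same reasoning explains the 403 above: a misparsed secret means the request signature never matches, and S3 answers Forbidden.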

bgsplunk commented 11 years ago

/BXV0pME1Am0wBbxYDliHlNOWzN+cMKA5zVtDyiT

bgsplunk commented 11 years ago

slightly modified but with the / intact