ShareX / ShareX

ShareX is a free and open source program that lets you capture or record any area of your screen and share it with a single press of a key. It also allows uploading images, text or other types of files to many supported destinations you can choose from.
https://getsharex.com
GNU General Public License v3.0

Support for custom S3 Servers #1058

Closed: Centzilius closed this issue 6 years ago

Centzilius commented 8 years ago

I'd like to add a custom S3 server, but although this pull request states "Added support for custom Amazon S3 endpoints [...]", I can't find the option to add a custom server.

Jaex commented 8 years ago

I can't find it either. It probably means custom servers are supported in the code but not exposed in the UI.
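
For reference, the code-level support the linked pull request refers to presumably goes through the AWS SDK for .NET that the old uploader was built on; the SDK accepts a custom endpoint roughly like this. A minimal sketch only: the endpoint, bucket, key, and credentials are placeholders, not ShareX's actual settings code.

```csharp
using Amazon.S3;
using Amazon.S3.Model;

class CustomEndpointSketch
{
    static void Main()
    {
        var config = new AmazonS3Config
        {
            ServiceURL = "https://s3.example.com", // custom S3-compatible endpoint (placeholder)
            ForcePathStyle = true                  // path-style URLs for non-AWS hosts
        };

        using (var client = new AmazonS3Client("ACCESS_KEY", "SECRET_KEY", config))
        {
            var request = new PutObjectRequest
            {
                BucketName = "sharex",                       // placeholder bucket
                Key = "2016/01/screenshot.png",              // placeholder object key
                FilePath = @"C:\screenshots\screenshot.png",
                CannedACL = S3CannedACL.PublicRead           // make the uploaded object publicly readable
            };
            client.PutObject(request);
        }
    }
}
```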

Jaex commented 8 years ago

@alanedwardes would you like to consider adding it, or is it too much work?

alanedwardes commented 8 years ago

Hi,

Can you tell me a bit more about your use case?

Thanks

Centzilius commented 8 years ago

@alanedwardes (Not sure if you meant me) I'd like to use an OpenStack Swift server, which, if I'm not mistaken, has an S3-compatible API.

alanedwardes commented 8 years ago

@Centzilius Thanks. If there are other people wanting to do this I think it should be considered, but I'm afraid your use case is very specific, so I don't think it's worth adding to ShareX.

Out of interest, why do you want to host your own S3? It feels a bit crazy considering the reliability, scale and cost-effectiveness Amazon gives you. You don't have to manage infrastructure!

Centzilius commented 8 years ago

@alanedwardes I was planning to use OVH's Public Cloud, which is powered by OpenStack and is pretty cheap in my opinion: https://www.ovh.de/cloud/storage/object-storage.xml (German), https://www.ovh.co.uk/cloud/storage/object-storage.xml (English)

grvr commented 8 years ago

+1 from me. There are a few services out there that use an S3 compatible API. Being able to enter a custom host would allow ShareX to work with all of them.

https://en.wikipedia.org/wiki/Amazon_S3#S3_API_and_competing_services

alanedwardes commented 8 years ago

@grvr Which service are you using?

grvr commented 8 years ago

I am using ObjSpace from Delimiter.

https://www.delimiter.com/objspace-object-storage/

umcookies commented 8 years ago

I'm in the same boat as grvr, so yeah, throwing some support behind custom S3 endpoints.

k0nsl commented 7 years ago

I am using a self-hosted solution which is entirely S3 compatible. I'd love to be able to use it in ShareX.

Jaex commented 7 years ago

Because I didn't write this uploader, I can't add it; even if I managed to add it, I couldn't test it.

k0nsl commented 7 years ago

I can provide you with access to my S3 instance if you need it for testing. But it doesn't sound like you know how to go about it, unless I misunderstood you.

Oh well, it would have just been a nice thing to have :)

deansheather commented 7 years ago

Is this still happening? If so, could you please add the AWS Ohio region? It doesn't seem to be available. Region code is us-east-2.

Had a look through the code and couldn't seem to find the defs.

alanedwardes commented 7 years ago

This should just need an AWS SDK NuGet package update; the region list comes from there.

@jaex would you be able to bump the package version? If not, or if there was a breaking change, I can try to get around to this within the next few days.

Cheers!
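
For reference, the SDK ships its region table inside the package, which is why new regions only show up after a version bump. A minimal, illustrative sketch of enumerating that table (not ShareX's actual code):

```csharp
using System;
using Amazon;

class RegionListSketch
{
    static void Main()
    {
        // The AWS SDK ships a hard-coded region table, so new regions such as
        // us-east-2 (Ohio) only appear after the NuGet package is updated.
        foreach (var region in RegionEndpoint.EnumerableAllRegions)
        {
            Console.WriteLine(region.SystemName + " - " + region.DisplayName);
        }
    }
}
```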

ennetech commented 7 years ago

+1, it would be great if I could use ShareX with software like Minio.

Jaex commented 7 years ago

If Amazon S3 were free I would write it from scratch, to avoid the external library dependency and support all custom hosts, but I can't even try the free trial because it ended years ago.

ennetech commented 7 years ago

But doesn't ShareX already support DreamObjects? https://github.com/ShareX/ShareX/blob/master/ShareX.UploadersLib/FileUploaders/AmazonS3.cs

BTW, if you need an S3-compatible instance, I could host one for you.

deansheather commented 7 years ago

S3 is dirt cheap, anyway. It'd cost you a few cents to test on a region close to where you live.

If you're going to write your own implementation of the S3 uploader, please support both S3 v2 and v4 signature types for uploads to custom origins. v2 is easier to implement than v4 if I recall correctly (especially for multi-chunk uploads, I think), and some S3-compatible APIs only support v2 signatures (like pithos).

S3 v2 signatures are documented here, and S3 v4 signatures are documented here. You'll definitely have to implement v4 signatures, as newer AWS S3 regions don't support v2 signed requests.
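
For reference, the v4 process boils down to hashing a canonical request, deriving a signing key from the secret, and producing one final HMAC. Below is a minimal C# sketch of the Authorization header computation, assuming a single unchunked request with no query string and only the host, x-amz-content-sha256, and x-amz-date headers signed; names are illustrative, not ShareX's implementation. The request must also send x-amz-date and x-amz-content-sha256 with the same values used in the signature.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class S3SignatureV4Sketch
{
    static byte[] HmacSha256(byte[] key, string data)
    {
        using (var hmac = new HMACSHA256(key))
            return hmac.ComputeHash(Encoding.UTF8.GetBytes(data));
    }

    static string HexSha256(byte[] data)
    {
        using (var sha = SHA256.Create())
            return ToHex(sha.ComputeHash(data));
    }

    static string ToHex(byte[] data)
    {
        return BitConverter.ToString(data).Replace("-", "").ToLowerInvariant();
    }

    // Builds the value of the Authorization header for a single, unchunked request.
    public static string Sign(string method, string host, string canonicalUri, byte[] payload,
        string accessKey, string secretKey, string region, DateTime utcNow)
    {
        string amzDate = utcNow.ToString("yyyyMMdd'T'HHmmss'Z'");
        string dateStamp = utcNow.ToString("yyyyMMdd");
        string payloadHash = HexSha256(payload);

        // 1. Canonical request: method, URI, (empty) query string, sorted lowercase headers,
        //    the list of signed headers, and the payload hash.
        string canonicalHeaders = $"host:{host}\nx-amz-content-sha256:{payloadHash}\nx-amz-date:{amzDate}\n";
        string signedHeaders = "host;x-amz-content-sha256;x-amz-date";
        string canonicalRequest = $"{method}\n{canonicalUri}\n\n{canonicalHeaders}\n{signedHeaders}\n{payloadHash}";

        // 2. String to sign, scoped to date/region/service.
        string scope = $"{dateStamp}/{region}/s3/aws4_request";
        string stringToSign = "AWS4-HMAC-SHA256\n" + amzDate + "\n" + scope + "\n" +
                              HexSha256(Encoding.UTF8.GetBytes(canonicalRequest));

        // 3. Derive the signing key with a chain of HMACs over date, region, and service.
        byte[] kDate = HmacSha256(Encoding.UTF8.GetBytes("AWS4" + secretKey), dateStamp);
        byte[] kRegion = HmacSha256(kDate, region);
        byte[] kService = HmacSha256(kRegion, "s3");
        byte[] kSigning = HmacSha256(kService, "aws4_request");

        // 4. Final signature and Authorization header value.
        string signature = ToHex(HmacSha256(kSigning, stringToSign));
        return $"AWS4-HMAC-SHA256 Credential={accessKey}/{scope}, " +
               $"SignedHeaders={signedHeaders}, Signature={signature}";
    }
}
```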

Jaex commented 7 years ago

I checked my Amazon S3 account now; its free trial ended years ago, but I can still use it for free, not sure how.

Therefore I'm trying to implement it from scratch now without using the Amazon SDK, but this v4 signature looks overwhelmingly difficult. The Amazon SDK source code for signature creation is literally thousands of lines :(

Jaex commented 7 years ago

Edit: I managed to fix the signature issue now.

Jaex commented 7 years ago

Now I have written the Amazon S3 implementation from scratch and also added custom server support, like this: https://camo.githubusercontent.com/28bfa2c95820a603df419769a4e28c73f8dcb26b/687474703a2f2f692e696d6775722e636f6d2f716d774930414c2e706e67

It currently supports signature v4.

But I don't have any custom server (with an account) to test it, so I'm not even sure whether there are any servers that work with the Amazon S3 API without requiring specific changes to the current code.

I also couldn't test DreamObjects, which existed in the previous implementation, because it requires a credit card even to register for a trial account. So I sent an email to their support and requested a test account.

ennetech commented 7 years ago

If you want, I could give you access to a test instance.

ennetech commented 7 years ago

I've tested with the latest artifact (screenshot attached): it appears to be adding an unwanted prefix (the bucket name).

Note: you can play with a public instance here: https://play.minio.io:9000/minio/

deansheather commented 7 years ago

@neurogas: That means that ShareX is using virtual-hosted-style uploads, which should be supported by most S3 implementations anyway (and would probably work with the implementation you're using if you enabled wildcard DNS for *.storage.ennetech.me). Would it be difficult to add a checkbox to disable virtual-host uploads, @Jaex?

Jaex commented 7 years ago

The Amazon S3 specification requires the bucket name in the subdomain. Otherwise, how is the storage service going to know which bucket it is?

deansheather commented 7 years ago

Paths... I know it's supported at least for v2; I don't have any experience with v4 at all. But storage.ennetech.me/sharex should work just as well as sharex.storage.ennetech.me, apart from some minor changes to the signature (at least for v2).
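
For illustration, these are the two request URL shapes under discussion, with placeholder host, bucket, and key names (not ShareX's actual code or option names):

```csharp
class AddressingStyleSketch
{
    static void Main()
    {
        string endpoint = "storage.example.com";   // custom S3-compatible host (placeholder)
        string bucket = "sharex";                  // placeholder bucket
        string key = "2017/03/screenshot.png";     // placeholder object key

        // Virtual-hosted style: the bucket becomes part of the hostname, so HTTPS needs
        // a certificate (e.g. a wildcard) that covers sharex.storage.example.com.
        string virtualHostedUrl = $"https://{bucket}.{endpoint}/{key}";

        // Path style: the bucket is the first path segment, so the endpoint's own
        // certificate is enough; the canonical URI used in the signature changes accordingly.
        string pathStyleUrl = $"https://{endpoint}/{bucket}/{key}";

        System.Console.WriteLine(virtualHostedUrl); // https://sharex.storage.example.com/2017/03/screenshot.png
        System.Console.WriteLine(pathStyleUrl);     // https://storage.example.com/sharex/2017/03/screenshot.png
    }
}
```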

ennetech commented 7 years ago

But the SSL certificate wouldn't be valid; I've successfully used Minio with:

I will test it using the DNS wildcard. EDIT: indeed, an SSL validation problem (screenshot attached).

I've worked around the domain issue by using https://ennetech.me:9000/ as the hostname and a bucket named "storage", but I've realized that Minio uses v4 signatures.

Response:

<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>SignatureDoesNotMatch</Code>
  <Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
  <Key></Key>
  <BucketName></BucketName>
  <Resource>/ShareX/2017/03/p49k0loaa7voyre1lh313baud927txp3.png</Resource>
</Error>

Jaex commented 7 years ago

I tested it now, @deansheather; your endpoint format works with v4 too. But I doubt it's going to solve @neurogas's problem, because he's getting an invalid signature, which means his server's signature implementation must be different from what the Amazon S3 API uses.

deansheather commented 7 years ago

I was pretty sure minio follows the S3 signature specification...

This doesn't seem right, shouldn't BucketName and Key have values?: <Key></Key><BucketName></BucketName><Resource>/ShareX/2017/03/p49k0loaa7voyre1lh313baud927txp3.png</Resource>

ennetech commented 7 years ago

Can this help? http://docs.minio.io/docs/how-to-use-aws-sdk-for-net-with-minio-server

Jaex commented 7 years ago

That link shows how to use Minio with the AWS SDK library (I don't use the library) and doesn't say anything about their specification.

Jaex commented 7 years ago

DreamObjects gave me an account to test now, and it worked on the first try, which shows that my Amazon S3 implementation has no problem working with custom storage hosts, as long as those hosts support the v4 signature.

Jaex commented 7 years ago

Currently I'm talking with the Minio developer to solve the issue.

ennetech commented 7 years ago

I'm so glad to read this

Jaex commented 7 years ago

I fixed the problem with the help of the Minio developer, but there is still one big problem: Minio does not support this header: headers["x-amz-acl"] = "public-read"; so the returned URL becomes private. There is a workaround to make the bucket public, but that requires a custom policy.

ennetech commented 7 years ago

No problem at all; using the Minio UI, setting a policy is a matter of seconds.

Jaex commented 7 years ago

If you mean the Minio Browser bucket policy setting: I asked whether that policy also makes your file listing public, and he said yes. So you need a policy which keeps the file listing private but makes reading files by URL public.
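
For reference, the kind of policy described here is normally expressed as a bucket policy that grants anonymous s3:GetObject and nothing else, so objects can be fetched by URL while listing stays private. A sketch with a placeholder bucket name; how it is applied (Minio Browser, the mc client, or a PutBucketPolicy call) depends on the server:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": ["*"] },
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::sharex/*"]
    }
  ]
}
```

Because only s3:GetObject is granted (no s3:ListBucket), anonymous visitors can download an object if they know its URL but cannot enumerate the bucket's contents.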

ennetech commented 7 years ago

Just tested with the latest ShareX artifact, working perfectly, thank you!

I will investigate how to hopefully solve the privacy issue of leaving the whole bucket open.