dustin / gopro

Tools for making the most out of GoPro Plus.
http://dustin.sallings.org/2020/04/29/gopro-plus.html
BSD 3-Clause "New" or "Revised" License

Troubleshooting Request - metadata syncing to bucket but not media #12

Open ThePVD opened 2 years ago

ThePVD commented 2 years ago

First, thank you for this. I was looking for a solution to back up GoPro videos from their portal directly to S3.

I believe I've set up the Lambda, SQS queue, and bucket as required with appropriate permissions. When running a gopro backup, I appear to be pushing all metadata to the S3 bucket, but no videos.

The tool is looping with this status:

```
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
```

My function logging indicates the download req is being made:

[screenshot: Lambda logs showing the download request being made]

The SQS queue is only showing empty receives.

[screenshot: SQS monitoring showing only empty receives]
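For reference, something like this reports whether messages are actually sitting in (or in flight on) the queue (the queue URL is a placeholder; assumes the AWS CLI is configured):

```
aws sqs get-queue-attributes \
  --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/gopro-queue \
  --attribute-names ApproximateNumberOfMessages ApproximateNumberOfMessagesNotVisible
```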

Hoping this info is enough for a quick pointer on how to dig further. Any assistance appreciated - thank you for the tool!

dustin commented 2 years ago

Hmm... It's been a while since I've set this up. I did the lambda function in JS because it seemed pretty straightforward, but I'm not sure how easily debuggable it is.

The concept is pretty straightforward, though. It should receive that message and perform an HTTP request and pipe it to your local bucket. I'd hope that if permissions weren't set up correctly, it'd be obvious.
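For illustration, a handler matching that description might look roughly like this (a hedged sketch, not the actual function in this repo; the `url`, `bucket`, and `key` message fields are assumptions about the payload shape):

```js
// Sketch only: assumes each SQS record body is JSON carrying a signed
// source URL plus the destination bucket and key -- the real message
// shape may differ.
const AWS = require('aws-sdk');
const https = require('https');

const s3 = new AWS.S3();

exports.handler = async (event) => {
  for (const record of event.Records) {
    // `url`, `bucket`, and `key` are assumed field names.
    const { url, bucket, key } = JSON.parse(record.body);
    await new Promise((resolve, reject) => {
      https.get(url, (res) => {
        if (res.statusCode !== 200) {
          reject(new Error(`fetch failed: HTTP ${res.statusCode}`));
          return;
        }
        // Pipe the HTTP response stream straight into the destination bucket.
        s3.upload({ Bucket: bucket, Key: key, Body: res }, (err, data) =>
          err ? reject(err) : resolve(data));
      }).on('error', reject);
    });
  }
};
```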

dustin commented 2 years ago

The good news is that I just managed to try this again and it still works fine. It's easy to misconfigure, though.

From your description, it sounds like you might just not have granted permission from the lambda role to be able to write to your S3 buckets. I recall doing something about this, but don't see that mentioned in the docs. Does something like this help?

[screenshot: S3 write permission grant for the Lambda role]
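In text form, that grant would be roughly this statement on the Lambda's execution role (a sketch; the bucket name is a placeholder, and at minimum the role needs `s3:PutObject` on the destination):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}
```
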
ThePVD commented 2 years ago

Thank you for the quick response! I did explicitly grant permissions to the Lambda, but it's possible they may be too strict.

I am writing the metadata .gz files to the target bucket.

I'll be in a position to troubleshoot more later tonight. If I'm able to figure it out, I may submit a PR for additional debug code or documentation clarity, if you're into that.

dustin commented 2 years ago

@ThePVD let me know if you got anywhere with this. I'm not sure if anyone else has tried and "works for me" isn't a great answer. This particular project was my first and only time using AWS Lambda and probably my only use of SQS. I'm sure I did something I don't remember while flailing about.

I'm currently doing backups locally (a bit over 3 TB).

ThePVD commented 2 years ago

Thanks for the followup - I hit a wall and have set it down for a few days. I am successfully doing local backups, so that piece is working. I've added some debug code to the lambda to confirm invocation parameters, and I do see it receiving signed links to the target media...it's just not actually pulling it into S3. I did attempt to give global S3 permissions to the function role and did not see any change.

- tool is indexing all media in the GoPro portal correctly
- tool is pushing all metadata to S3
- tool is triggering the lambda function
- lambda function receives appropriate parameters on invocation
- <something happens, but the video is not put into the S3 bucket>
- lambda function informs the SQS queue of completion
- tool reads the SQS queue and thinks the upload has finished

I was going to attempt to completely replace/rebuild the lambda function code with the same inputs to see if I could see any change, but I'm a little over my skis in debugging.
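One thing worth trying before a rewrite: tailing the function's CloudWatch logs to surface the actual error from the failing step (the function name is a placeholder; `aws logs tail` needs AWS CLI v2):

```
aws logs tail /aws/lambda/YOUR-FUNCTION-NAME --follow
```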

kuanche commented 1 year ago

hi!

thanks so much for building & sharing this. this is exactly what i needed. in my case, my gopro plus account is up for renewal soon and i'd like to cancel it. before i do, i need to nab all my media, so trying to use gopro backup.

it's my first time using AWS, and i thinkkkkk i set it up as instructed (thanks for the instructions!). i'm getting a similar error, but eventually it does sometimes say Processing 1 responses (see below).

i'm in a little over my head with knowing how to debug, but happy to try whatever you'd recommend, if you have any suggestions!

thanks! kuan

p.s. i see from your profile that you're in maui? i used to live up the hill from airport beach for 6 months, and just past the cemetery in paia for a few more :)


```
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 0 responses
I: Waiting for 5 files to finish copying
I: Processing 1 responses
I: Waiting for 4 files to finish copying
I: Processing 1 responses
I: Waiting for 3 files to finish copying
I: Processing 1 responses
I: Waiting for 2 files to finish copying
I: Processing 1 responses
I: Waiting for 1 files to finish copying
I: Processing 1 responses
I: Not waiting for any results
```

dustin commented 1 year ago

The processing responses thing makes sense, though perhaps the "0" messages should be logged at debug level.

The way this works involves a bit of polling on the client side. All the transfer work is pushed up into AWS to be processed asynchronously via Lambda. So the 5 at the top of that log is how many items are outstanding, and the 0 is how many responses came back from a completion poll. I don't see anything particularly strange in this log.

However, I don't actually use this myself. I run backuplocal into a big USB hard drive. My usage has grown quite a bit:

[chart: GoPro media storage usage over time]

It'd be too expensive for me to keep all that in S3.

Your local database can tell you everything the tool knows about, and you should be able to confirm your assets were properly moved into S3.

Let me know if you have any trouble running this. Local backups examine the filesystem to determine what needs to be updated, but S3 backups keep local state because the traversal can take a while. Something like rclone can build you a manifest of an S3 bucket to validate that the content is available.
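For example, assuming an rclone remote named `s3` is configured for the bucket (remote and bucket names here are placeholders), this produces a size-and-path manifest you can check your assets against:

```
rclone ls s3:my-gopro-backup > manifest.txt
```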

Mahalo! :)