Closed - jamescarppe closed this 1 year ago
Just in case it is of interest to you: I'm currently using a workaround with 3 parts:
It's working as follows:
This is all somewhat cumbersome; a more straightforward Portainer Business solution would be awesome and, in my judgment, of huge value.
Just in case somebody wants to test this, here are some hints from my config and setup.
`docker-compose restic.yml`:

```yaml
# # # # # # # # # # # # # # # # # # # # # # # # # #
#
# Content: restic backup container
# Author: Jan Jambor
# Author URI: https://xwr.ch
#
# Required variables:
# - ${CT_DATA_BASEPATH}
# - ${AWS_ACCESS_KEY_ID}
# - ${AWS_SECRET_ACCESS_KEY}
# - ${RESTIC_PASSWORD}
# - ${RESTIC_REPOSITORY}
#
# # # # # # # # # # # # # # # # # # # # # # # # # #
services:
  restic:
    # https://hub.docker.com/r/restic/restic/tags
    container_name: restic
    hostname: restic
    image: restic/restic:0.12.1
    restart: 'no'
    stdin_open: true # docker run -i
    entrypoint: /bin/sh
    # docker exec restic /usr/bin/restic init
    # docker exec restic /usr/bin/restic backup /data
    # docker exec restic /usr/bin/restic snapshots
    # docker exec restic /usr/bin/restic restore 2d7090a8 --target /restore
    environment:
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - RESTIC_PASSWORD=${RESTIC_PASSWORD}
      - RESTIC_REPOSITORY=${RESTIC_REPOSITORY}
    volumes:
      - '${CT_DATA_BASEPATH}/nextcloud/data:/data'
      - '${CT_DATA_BASEPATH}/restic/restore:/restore'
    logging:
      driver: "json-file"
      options:
        tag: "{{.ImageName}}|{{.Name}}|{{.ImageFullID}}|{{.FullID}}"
    networks:
      - DockerLAN
networks:
  DockerLAN:
    external:
      name: DockerLAN
```
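For completeness, a minimal `.env` alongside the compose file could supply the required variables; every value below is a placeholder:

```dotenv
# Example values only; substitute your own.
CT_DATA_BASEPATH=/srv/docker
AWS_ACCESS_KEY_ID=AKIAEXAMPLE
AWS_SECRET_ACCESS_KEY=example-secret-key
RESTIC_PASSWORD=change-me
# restic S3 repository syntax: s3:<endpoint>/<bucket>
RESTIC_REPOSITORY=s3:https://s3.example.com/restic-backups
```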
`docker-compose crontab.yml`:

```yaml
# # # # # # # # # # # # # # # # # # # # # # # # # #
#
# Content: crontab
# Author: Jan Jambor
# Author URI: https://xwr.ch
#
# Required variables:
# - ${CT_DATA_BASEPATH}
#
# # # # # # # # # # # # # # # # # # # # # # # # # #
services:
  crontab:
    # https://hub.docker.com/r/willfarrell/crontab/tags
    container_name: crontab
    hostname: crontab
    image: willfarrell/crontab:1.0.0
    restart: unless-stopped
    volumes:
      - '/var/run/docker.sock:/var/run/docker.sock:ro'
      - '${CT_DATA_BASEPATH}/crontab/config/crontab_config.yml:/opt/crontab/config.yml:rw'
      - '${CT_DATA_BASEPATH}/crontab/logs:/var/log/crontab:rw'
    logging:
      driver: "json-file"
      options:
        tag: "{{.ImageName}}|{{.Name}}|{{.ImageFullID}}|{{.FullID}}"
    networks:
      - DockerLAN
networks:
  DockerLAN:
    external:
      name: DockerLAN
```
And here is the configuration for the backup jobs (the example doesn't contain the Portainer data, but it would look the same as the nextcloud entries shown).
```yaml
- name: nextcloud-file-scan
  command: docker exec --user 1000 nextcloud php occ files:scan
  comment: re-scan folders for newly added files
  container: crontab
  schedule: '30 1 * * *'

- name: nextcloud-fulltext
  command: docker exec -u 1000 nextcloud php occ fulltextsearch:index
  comment: re-indexing nextcloud fulltext search for documents
  container: crontab
  schedule: '30 2 * * *'

- name: nextcloud-restic-backup
  command: /usr/bin/restic backup /data
  comment: backup nextcloud data every hour
  container: restic
  schedule: '0 * * * *'
```
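A Portainer entry would follow the same pattern; a sketch, assuming the Portainer data directory is additionally mounted into the restic container (the `/portainer-data` mount path and the schedule are my own placeholders):

```yaml
# Hypothetical entry; assumes ${CT_DATA_BASEPATH}/portainer/data is
# mounted into the restic container at /portainer-data.
- name: portainer-restic-backup
  command: /usr/bin/restic backup /portainer-data
  comment: backup portainer data every hour
  container: restic
  schedule: '15 * * * *'
```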
Just chiming in to say that I'm a Backblaze B2 customer who would like to back up my configuration there using their S3 compatible API.
I've been missing this soooooo badly! All I want to do is back up my Portainer config to my TrueNAS' S3 service.
@Fridgemagnet While I still feel this would make a great feature addition, once I realized that all the stack compose files are stored as plain-text YAML in my Docker data directory and can be backed up through traditional means, this became less of an urgent need.
@jakemauer True to a certain degree. I'm interested in everything configuration-wise from Portainer, as I have my compose stack YAML files saved on a network share to access via VS Code desktop. I could also use https://github.com/offen/docker-volume-backup to back up my Portainer volume to my MinIO instance, which I use for other container backups anyway, but something built into Portainer would be ideal.
We want backups to go to Tanzu-provided on-prem MinIO/S3.
Same here; disappointed that they force S3 to be an Amazon-hosted bucket.
A custom endpoint for S3 (or renaming the button to specifically state Amazon S3) should be a no-brainer in 2022.
I recently set up Nextcloud, and they offer "Amazon S3" as an option in the External Storage app. All I had to do to link to my locally hosted MinIO was omit the Amazon S3 region. I am guessing that it couldn't be too hard to implement in Portainer.
I'm kind of confused as to why custom S3 endpoints are almost never built into apps by default. When you enter an AWS region and bucket name, behind the scenes the app is building the endpoint's actual URL for you. It should be very easy to let that endpoint URL be modified by the user (allowing custom entries).
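The "build the URL behind the scenes" step described above can be sketched in a few lines; the function names here are illustrative, not from any real client library:

```python
from typing import Optional

def aws_s3_endpoint(bucket: str, region: str) -> str:
    # The virtual-hosted-style URL an S3 client derives from region + bucket.
    return f"https://{bucket}.s3.{region}.amazonaws.com"

def resolve_endpoint(bucket: str, region: str,
                     custom_endpoint: Optional[str] = None) -> str:
    # Letting the user override the derived URL is all that is needed to
    # point a client at MinIO, Wasabi, Backblaze B2, and friends.
    if custom_endpoint:
        return custom_endpoint.rstrip("/")
    return aws_s3_endpoint(bucket, region)

print(resolve_endpoint("backups", "us-east-1"))
# https://backups.s3.us-east-1.amazonaws.com
print(resolve_endpoint("backups", "", custom_endpoint="https://minio.local:9000/"))
# https://minio.local:9000
```

For comparison, boto3 exposes exactly this override as the `endpoint_url` parameter of `boto3.client("s3", ...)`.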
AWS has the vast majority of cloud market share, and I would assume that means the vast majority of "S3-compatible" storage market share as well. Any other solution out there, whether from another cloud provider or hosted locally, is, I believe, a reverse engineering of the S3 API. That doesn't make them any less viable, but it probably does explain why this was hard-coded to begin with. I would also appreciate the ability to hit non-AWS S3 buckets. Hopefully this issue will reach critical mass.
> I recently setup nextcloud and they offer "Amazon S3" as an option in the External Storage App. All I had to do to link to my locally hosted Minio was omit the Amazon S3 region. I am taking a guess that it couldn't be too hard to implement in Portainer.
I did the same with Storj, slightly confusing at first but it ended up working nicely. I came here to essentially add a vote to the need for this as well as add Storj to the list of possibilities to be utilized, if not just wide open endpoint(s) for s3 compatible storage.
The ability to set a custom AWS endpoint is a really needed feature, because there are now many AWS-like services that work with the native aws-cli tool and only need a unique endpoint and region to be set.
I suppose there is no ETA for this feature?
Another vote for custom endpoints and region please. Shouldn't be hard to do (non expert view)
Plenty of people want this, and I agree it is probably not too hard since the foundation is already laid (although it's definitely not good to assume that until you develop something, lol).
Who wants to help me add this feature? I have plenty of API, Docker, and Portainer knowledge, and I am a test engineer, though I am not exactly a web developer.
> Plenty of people want this, and I agree it is probably not too hard since the foundation is already laid. (although definitely not good to assume that until you develop something lol)
> Who wants to help me add this feature? I have plenty of api, docker and portainer knowledge as well as I am a test engineer. I am not exactly a web developer though.
I lack the expertise to help but I will support you all the way! 😄
I use this bash script to gather all Docker Compose files and then use Duplicati to store them in Backblaze:

```sh
#!/bin/sh
# Path to backup location
backup_dir="/backups/compose-backups"

if [ ! -d "$backup_dir" ]; then
  mkdir "$backup_dir"
  echo "Made backup dir at $backup_dir"
fi

docker pull red5d/docker-autocompose
current_time=$(date "+%Y.%m.%d-%H.%M.%S")
docker ps --format '{{.Names}}' > containers.txt

# Generate a compose file for every running container
while IFS="" read -r p || [ -n "$p" ]
do
  docker run --rm -v /var/run/docker.sock:/var/run/docker.sock red5d/docker-autocompose "$p" > "$backup_dir/$p-$current_time.yaml"
done < containers.txt

# Prune generated files older than 5 days
find "$backup_dir" -name "*.yaml" -mtime +5 -exec rm {} \;
```
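To keep Duplicati fed with fresh files, a script like that could be scheduled with a crontab entry along these lines (the script path and log location are my own assumptions):

```
# Runs nightly at 03:00; paths are hypothetical, adjust to your setup.
0 3 * * * /usr/local/bin/compose-backup.sh >> /var/log/compose-backup.log 2>&1
```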
> I lack the expertise to help but I will support you all the way! 😄
Ditto
> Plenty of people want this, and I agree it is probably not too hard since the foundation is already laid. (although definitely not good to assume that until you develop something lol)
> Who wants to help me add this feature? I have plenty of api, docker and portainer knowledge as well as I am a test engineer. I am not exactly a web developer though.
Since the S3 backup feature is not included in the open-source version, there's nothing to do except reimplement the whole S3 backup solution. I was willing to give it a shot and add it to Portainer, but saw that unfortunately Portainer BE is not open-source.
Ahh, now that would be a problem
I also need this feature badly! The only requirement I have is to back up my Portainer config to my MinIO S3 service.
> Plenty of people want this, and I agree it is probably not too hard since the foundation is already laid. (although definitely not good to assume that until you develop something lol) Who wants to help me add this feature? I have plenty of api, docker and portainer knowledge as well as I am a test engineer. I am not exactly a web developer though.
> Since the S3 Backup feature is not included in the open-source version there's nothing to do except reimplement the whole s3 backup solution. Was willing to give a shot and add it to portainer but saw that unfortunately portainer BE is not open-source.
Totally forgot that this is the case. Maybe it can be opened if we are lucky :)
A quick update here: the engineering team are actively working on this feature, and it is currently targeted for our next major release.
It would be super helpful to utilise Wasabi S3 storage, which we already use for backing up everything else; it saves having a single outlier stored on AWS.
> A quick update here: the engineering team are actively working on this feature, and it is currently targeted for our next major release.
Any news on this?
> A quick update here: the engineering team are actively working on this feature, and it is currently targeted for our next major release.
> Any news on this?
Barring unforeseen issues this should make it into the next major release, as per my comment above (there hasn't been a major release since I posted it).
The new version has been released already with support for it. ~~However, I failed to configure it. Can someone paste a screenshot with the configuration and maybe a MinIO stack link?~~ My MinIO didn't have a region name configured; setting one and then configuring the Portainer backup with it made it work.
I see #8461 all the time now. Edit: I believe that's unrelated.
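For anyone hitting the same missing-region issue, the region can be set on the MinIO server itself. A sketch of a compose service, assuming the `MINIO_SITE_REGION` environment variable of current MinIO releases (older releases used `MINIO_REGION_NAME` instead); credentials and paths are placeholders:

```yaml
services:
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      # Current releases read MINIO_SITE_REGION; older ones used MINIO_REGION_NAME.
      - MINIO_SITE_REGION=us-east-1
      - MINIO_ROOT_USER=minioadmin
      - MINIO_ROOT_PASSWORD=change-me
    volumes:
      - ./minio-data:/data
```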
> the new version has been released already with the support for it. ~My MinIO didn't have a region name configured, setting one and then configuring Portainer backup with it made it work.~
> I see #8461 all the time now. Edit: I believe that's unrelated.
I got the "Store in S3" automatic backup to work with my TrueNAS' MinIO service by leaving "Region" blank and adding https:// before my S3-compatible hostname. In my setup the S3 hostname points to nginxproxymanager, which sends the request to the correct IP and port.
Thanks to the Portainer team for implementing this. A great and welcome addition to the already existing features that Portainer has to offer.
@jamescarppe Thank you to whoever is responsible for this.
Works perfectly, thanks.
Hi! Is there a way to use the export to TrueNAS MinIO with a self-signed certificate?
> A quick update here: the engineering team are actively working on this feature, and it is currently targeted for our next major release.
> Any news on this?
> Barring unforeseen issues this should make it into the next major release, as per my comment above (there hasn't been a major release since I posted it).
Please do let us know when this becomes available. Currently looking at Portainer from afar, but this is really the one (but major and most important) gripe I have.
> Please do let us know when this becomes available. Currently looking at Portainer from afar, but this is really the one (but major and most important) gripe I have.
Thanks for following up on this! Support for S3-compatible backup locations (for example, MinIO, Ceph, Backblaze, etc) is now available in version 2.17 of Portainer Business Edition.
This issue is intended to collate, in a central location, the various pieces of feedback around additional backup methods and locations for the backup functionality in Portainer Business Edition, for improved tracking and community engagement.
Existing tickets:
Proposed changes:
- Add support for S3-compatible providers (for example, MinIO, Ceph, Backblaze, etc) - now available in Portainer Business Edition 2.17!