This custom integration provides services for interacting with S3, including uploading files to a bucket, copying them within and between buckets, deleting objects, and generating pre-signed URLs.
Create your S3 bucket via the AWS console; remember that bucket names must be globally unique. I created a bucket with the default access settings (all public access OFF) and a bucket name in the format my-bucket-random_number, with random_number generated on this website.
Note: for a local and self-hosted alternative, check out the official Minio integration.
Place the `custom_components` folder in your configuration directory (or add its contents to an existing `custom_components` folder). Then add the integration via the Home Assistant UI, or add the following to your `configuration.yaml`:
```yaml
s3:
  aws_access_key_id: AWS_ACCESS_KEY
  aws_secret_access_key: AWS_SECRET_KEY
  region_name: eu-west-1 # optional region, default is us-east-1
```
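Home Assistant's `!secret` syntax also works here if you prefer to keep credentials out of `configuration.yaml`; a minimal sketch, assuming matching entries exist in your `secrets.yaml`:

```yaml
# configuration.yaml — credentials resolved from secrets.yaml
s3:
  aws_access_key_id: !secret aws_access_key_id
  aws_secret_access_key: !secret aws_secret_access_key
  region_name: eu-west-1
```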
The s3 integration exposes a `put` service for uploading files to S3.
Example data for service call:
```
{
  "bucket": "my_bucket",
  "key": "my_key/file.jpg",
  "file_path": "/some/path/file.jpg",
  "storage_class": "STANDARD_IA",  # optional
  "content_type": "image/jpeg",  # optional
  "tags": "tag1=aTagValue&tag2=anotherTagValue"  # optional
}
```
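For instance, a script can wrap this service call; the script name and file paths below are assumptions for illustration:

```yaml
# Hypothetical script: upload a local snapshot to S3 via s3.put.
script:
  upload_snapshot_to_s3:
    sequence:
      - service: s3.put
        data:
          bucket: "my_bucket"
          key: "snapshots/latest.jpg"
          file_path: "/config/www/latest.jpg"  # assumed local path
          storage_class: "STANDARD_IA"
```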
The s3 integration exposes a `copy` service for moving files around in S3.
Example data for service call:
```
{
  "bucket": "my_bucket",
  "key_source": "my_key/file_source.jpg",
  "key_destination": "my_key/file_destination.jpg"
}
```
If you need to move items between buckets, use this syntax:
```
{
  "bucket_source": "my_source_bucket",
  "key_source": "my_key/file_source.jpg",
  "bucket_destination": "my_destination_bucket",
  "key_destination": "my_key/file_destination.jpg"
}
```
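As a sketch, a cross-bucket copy wrapped in a script (the script name and keys are assumptions):

```yaml
# Hypothetical script: archive an object into a second bucket via s3.copy.
script:
  archive_to_backup_bucket:
    sequence:
      - service: s3.copy
        data:
          bucket_source: "my_source_bucket"
          key_source: "my_key/file_source.jpg"
          bucket_destination: "my_destination_bucket"
          key_destination: "archive/file_source.jpg"
```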
The s3 integration exposes a `delete` service for deleting files (objects) from S3.
Example data for service call:
```
{
  "bucket": "my_bucket",
  "key": "my_key/file_source.jpg"
}
```
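Since S3 itself has no move operation, combining `copy` and `delete` gives you one; a minimal sketch, with hypothetical keys:

```yaml
# Hypothetical "move" script: copy the object, then delete the original.
script:
  move_file_in_s3:
    sequence:
      - service: s3.copy
        data:
          bucket: "my_bucket"
          key_source: "inbox/file.jpg"
          key_destination: "processed/file.jpg"
      - service: s3.delete
        data:
          bucket: "my_bucket"
          key: "inbox/file.jpg"
```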
The s3 integration exposes a `signurl` service for generating pre-signed URLs with a defined validity period, allowing access to content already stored in S3 via a URL. You might run this service after calling the `copy` service, for example. The service fires an event of type `s3_signed_url`, which you can use as a trigger in a subsequent automation; the event data contains the pre-signed URL as a key-value pair.
Example data for service call:
```
{
  "bucket": "my_bucket",
  "key": "my_key/file_source.jpg",
  "duration": 300
}
```
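As a sketch, an automation could pick up the `s3_signed_url` event and forward the URL in a notification. The notify target is an assumption, and the `url` key name is inferred from the event data described above, so check the actual event payload:

```yaml
# Hypothetical automation: forward the pre-signed URL when the event fires.
- alias: notify-signed-url
  trigger:
    platform: event
    event_type: s3_signed_url
  action:
    service: notify.mobile_app_my_phone  # assumed notify service
    data_template:
      message: "Download link: {{ trigger.event.data.url }}"  # key name assumed
```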
The following automation uses the folder_watcher to automatically upload files created in the local filesystem to S3:
```yaml
- id: '1587784389530'
  alias: upload-file-to-S3
  description: 'When a new file is created, upload to S3'
  trigger:
    platform: event
    event_type: folder_watcher
    event_data:
      event_type: created
  action:
    service: s3.put
    data_template:
      bucket: "my_bucket"
      key: "input/{{ now().year }}/{{ (now().month | string).zfill(2) }}/{{ (now().day | string).zfill(2) }}/{{ trigger.event.data.file }}"
      file_path: "{{ trigger.event.data.path }}"
      storage_class: "STANDARD_IA"
```
Note that you must configure `folder_watcher` for this automation to work; a minimal configuration is sketched below.
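A `folder_watcher` sketch, assuming the watched folder is /config/images; note that Home Assistant only watches folders that are also listed under `allowlist_external_dirs`:

```yaml
# Watch /config/images for newly created .jpg files (path is an example).
folder_watcher:
  - folder: /config/images
    patterns:
      - '*.jpg'

# The watched folder must also be allowed:
homeassistant:
  allowlist_external_dirs:
    - /config/images
```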
I recommend FileZilla for connecting to your S3 bucket; a free version is available.