update
bitlab-s3-backup/DOCS.md (new executable file, 103 lines)

# Home Assistant Add-on: S3 Backup

## Installation

Follow these steps to get the add-on installed on your system:

1. Enable **Advanced Mode** in your Home Assistant user profile.
2. Navigate in your Home Assistant frontend to **Supervisor** -> **Add-on Store**.
3. Search for the "bitlab S3 Backup" add-on and click on it.
4. Click on the "INSTALL" button.

## How to use

1. Set the `aws_access_key`, `aws_secret_access_key`, and `bucket_name`.
2. Optionally, change the `bucket_region`, `delete_local_backups`, and `local_backups_to_keep` configuration options.
3. Start the add-on: it creates a full backup and uploads it from the `/backup/` directory to the configured `bucket_name`. You can trigger this manually (see the command sketch after this list) or automate it (see the example in the Automation section below).
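
If you prefer the command line over the Supervisor UI, a run can also be triggered with the Home Assistant CLI. This is only a sketch; the `XXXXX_` prefix is a placeholder for the repository prefix your installation assigns to the add-on:

```
# trigger one backup-and-upload run of the add-on
ha addons start XXXXX_bitlab-s3-backup
```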
## Automation

To automate backup creation and syncing to Amazon S3, add an automation like the following to Home Assistant's `configuration.yaml` and adjust it to your needs:

```
automation:
  # create a full backup and upload it to S3 by starting the add-on
  - id: bitlab-backup
    alias: Create a full backup every day at 4am
    trigger:
      platform: time
      at: "04:00:00"
    action:
      service: hassio.addon_start
      data:
        addon: XXXXX_bitlab-s3-backup
```

The automation above starts the add-on at 4am; the add-on then creates a full backup, uploads it to Amazon S3, and, if configured, deletes the oldest local backups according to your configuration.

## Configuration

Example add-on configuration:

```
aws_access_key: AKXXXXXXXXXXXXXXXX
aws_secret_access_key: XXXXXXXXXXXXXXXX
bucket_name: my-bucket
bucket_region: minio
delete_local_backups: true
local_backups_to_keep: 3
```

### Option: `aws_access_key` (required)

AWS IAM access key used to access the S3 bucket.

### Option: `aws_secret_access_key` (required)

AWS IAM secret access key used to access the S3 bucket.

### Option: `bucket_name` (required)

Amazon S3 bucket used to store backups.

### Option: `bucket_region` (optional, Default: minio)

AWS region where the S3 bucket was created. See https://aws.amazon.com/about-aws/global-infrastructure/ for all available regions.

### Option: `delete_local_backups` (optional, Default: true)

Should the add-on remove the oldest local backups after syncing to your Amazon S3 bucket? You can configure how many local backups to keep with the `local_backups_to_keep` option. The oldest backups are deleted first.

### Option: `local_backups_to_keep` (optional, Default: 3)

How many backups do you want to keep locally? If you want to disable automatic local cleanup, set `delete_local_backups` to `false`.

If you also want to automatically delete old backups to keep your Amazon S3 bucket clean, or move backups to a cheaper storage class to save money, take a look at S3 Lifecycle Rules (https://docs.aws.amazon.com/AmazonS3/latest/userguide/how-to-set-lifecycle-configuration-intro.html).
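
As a rough illustration (not part of this add-on), a lifecycle configuration applied with `aws s3api put-bucket-lifecycle-configuration --bucket YOUR-S3-BUCKET-NAME --lifecycle-configuration file://lifecycle.json` could look like this; the rule ID, day counts, and storage class are placeholders:

```
{
  "Rules": [
    {
      "ID": "ha-backup-cleanup",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" }
      ],
      "Expiration": { "Days": 90 }
    }
  ]
}
```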
## Security

I recommend creating a new IAM user which:

- cannot log in to the AWS Console
- can only access AWS programmatically
- is used by this add-on only
- uses the lowest possible IAM policy, such as this one:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAWSS3Sync",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::YOUR-S3-BUCKET-NAME/*",
        "arn:aws:s3:::YOUR-S3-BUCKET-NAME"
      ]
    }
  ]
}
```
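
Note that `run.sh` also calls `aws s3api list-objects-v2`, `get-object-tagging`, `put-object-tagging`, and `delete-object` when `delete_local_backups` is enabled, so the policy above is likely too narrow for that cleanup path. A sketch of an action list covering those calls (same placeholder bucket) might look like:

```
"Action": [
  "s3:PutObject",
  "s3:PutObjectTagging",
  "s3:GetObjectTagging",
  "s3:DeleteObject",
  "s3:ListBucket"
]
```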
## Support

Usage of the add-on requires knowledge of Amazon S3 and AWS IAM.
Under the hood it uses the AWS CLI version 1, specifically the `aws s3 cp` and `aws s3api` commands (see `run.sh`).
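
For troubleshooting, you can mimic what the add-on does with the same CLI calls it uses; this is only a sketch, and the slug, backup name, bucket, endpoint, and region are placeholders:

```
# upload one backup archive and tag it with its Supervisor slug, as run.sh does
aws s3 cp /backup/SLUG.tar "s3://my-bucket/ha-backup-2024-01-01-04-00.tar" \
  --endpoint-url https://backup.s3.fqdn.ch --region minio --no-progress
aws s3api put-object-tagging --bucket my-bucket --key ha-backup-2024-01-01-04-00.tar \
  --tagging '{"TagSet": [{"Key": "slug", "Value": "SLUG"}]}' \
  --endpoint-url https://backup.s3.fqdn.ch --region minio
```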
## Thanks

This add-on is heavily inspired by https://github.com/gdrapp/hass-addons and https://github.com/rrostt/hassio-backup-s3.

bitlab-s3-backup/Dockerfile (new executable file, 41 lines)

ARG BUILD_FROM
FROM ${BUILD_FROM}

ENV LANG C.UTF-8

COPY run.sh /
RUN chmod a+x /run.sh

# add aws-cli and deps
RUN apk add -v --update --no-cache \
    python3 \
    py3-pip \
    groff \
    less \
    jq \
    aws-cli
#RUN pip3 install --upgrade awscli

CMD [ "/run.sh" ]

# Build arguments
ARG BUILD_ARCH
ARG BUILD_DATE
ARG BUILD_REF
ARG BUILD_VERSION

# Labels
LABEL \
    io.hass.name="bitlab S3 backup" \
    io.hass.description="Automatically create and transfer HA backups" \
    io.hass.arch="${BUILD_ARCH}" \
    io.hass.type="addon" \
    io.hass.version=${BUILD_VERSION} \
    maintainer="Alain Stucki <as@bitlab.ch>" \
    org.label-schema.description="Automatically create and transfer HA backups" \
    org.label-schema.build-date=${BUILD_DATE} \
    org.label-schema.name="bitlab S3 Backup" \
    org.label-schema.schema-version="1.0" \
    org.label-schema.usage="https://git.bitlab.ch/bitlab/ha-addon/-/raw/main/amazon-s3-backup/DOCS.md" \
    org.label-schema.vcs-ref=${BUILD_REF} \
    org.label-schema.vcs-url="https://git.bitlab.ch/bitlab/ha-addon.git" \
    org.label-schema.vendor="bitlab Home Assistant Addons"
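
For local testing outside the Supervisor builder, the image can be built by supplying the build arguments by hand. This is only a sketch; the base image and argument values are placeholders that the Supervisor normally injects:

```
docker build \
  --build-arg BUILD_FROM=ghcr.io/home-assistant/amd64-base:latest \
  --build-arg BUILD_ARCH=amd64 \
  --build-arg BUILD_DATE=2024-01-01 \
  --build-arg BUILD_REF=local \
  --build-arg BUILD_VERSION=1.0.1 \
  -t bitlab-s3-backup:local bitlab-s3-backup/
```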

bitlab-s3-backup/config.json (new executable file, 33 lines)

{
  "name": "bitlab S3 Backup",
  "version": "1.0.1",
  "slug": "bitlab-s3-backup",
  "description": "Sync Backups to bitlab S3 storage",
  "url": "https://git.bitlab.ch/bitlab/ha-addon",
  "arch": ["aarch64", "amd64", "armhf", "armv7", "i386"],
  "boot": "manual",
  "init": false,
  "startup": "once",
  "advanced": true,
  "hassio_api": true,
  "hassio_role": "backup",
  "options": {
    "aws_access_key": "",
    "aws_secret_access_key": "",
    "custom_endpoint": "https://backup.s3.fqdn.ch",
    "bucket_name": "ha-backup-XXX",
    "bucket_region": "minio",
    "delete_local_backups": true,
    "local_backups_to_keep": 3
  },
  "schema": {
    "aws_access_key": "str",
    "aws_secret_access_key": "password",
    "custom_endpoint": "str",
    "bucket_name": "str",
    "bucket_region": "str",
    "delete_local_backups": "bool",
    "local_backups_to_keep": "int"
  },
  "map": ["backup:rw"]
}
bitlab-s3-backup/icon.png (new binary file, 6.0 KiB, not shown)

bitlab-s3-backup/run.sh (new executable file, 87 lines)

#!/usr/bin/with-contenv bashio
# ==============================================================================
# Home Assistant Community Add-on: S3 Backup
# ==============================================================================
bashio::log.level "info"

# script global shortcuts
declare -r BACKUP_NAME="ha-backup-$(date +'%Y-%m-%d-%H-%M')"
declare -r SSH_HOME="${HOME}/.ssh"

# call Home Assistant to create a local backup
# function fails in case local backup is not created
function create-local-backup {
    local -r base_folders="addons/local homeassistant media share ssl"
    local data="{\"name\":\"${BACKUP_NAME}\"}"
    local bak_type="non-encrypted"

    bashio::log.info "Creating ${bak_type} full backup: \"${BACKUP_NAME}\""

    if ! SLUG=$(bashio::api.supervisor POST /backups/new/full "${data}" .slug); then
        bashio::log.fatal "Error creating ${bak_type} full backup!"
        return "${__BASHIO_EXIT_NOK}"
    fi

    bashio::log.info "Backup created: ${SLUG}"
    return "${__BASHIO_EXIT_OK}"
}

# delete the S3 object whose 'slug' tag matches the given backup slug
function delete-s3-backup {
    delete_slug=$1
    bashio::log.info "Searching for slug: $delete_slug"
    for I in `aws s3api list-objects-v2 --bucket "${bucket_name}" --endpoint-url "${custom_endpoint}" --region "${bucket_region}" --query 'Contents[*].Key' --output text`
    do
        bashio::log.info "Checking object: $I"
        TAG=""
        TAG=`aws s3api get-object-tagging --bucket "${bucket_name}" --endpoint-url "${custom_endpoint}" --region "${bucket_region}" --key "$I" --output text --query "TagSet[?Key=='slug'].Value"`
        bashio::log.info "Slug for object $I: $TAG"
        if [ "$TAG" = "$delete_slug" ]; then
            bashio::log.info "Deleting object: $I TAG=$TAG delete_slug=$delete_slug"
            aws s3api delete-object --bucket "${bucket_name}" --endpoint-url "${custom_endpoint}" --region "${bucket_region}" --key "$I" --output text
        fi
    done
}

bashio::log.info "Starting S3 Backup..."

# abort the add-on if the local backup could not be created
create-local-backup || bashio::exit.nok "Local backup process failed! See log for details."

# read add-on options
custom_endpoint="$(bashio::config 'custom_endpoint')"
bucket_name="$(bashio::config 'bucket_name')"
bucket_region="$(bashio::config 'bucket_region' 'minio')"
delete_local_backups="$(bashio::config 'delete_local_backups' 'true')"
local_backups_to_keep="$(bashio::config 'local_backups_to_keep' '3')"
monitor_path="/backup"
# jq filter: sort backups newest-first and select the slugs beyond the newest $local_backups_to_keep
jq_filter=".backups|=sort_by(.date)|.backups|reverse|.[$local_backups_to_keep:]|.[].slug"

# credentials for the AWS CLI
export AWS_ACCESS_KEY_ID="$(bashio::config 'aws_access_key')"
export AWS_SECRET_ACCESS_KEY="$(bashio::config 'aws_secret_access_key')"
export AWS_REGION="$bucket_region"

bashio::log.debug "Using AWS CLI version: '$(aws --version)'"
bashio::log.debug "Command: 'aws s3 sync $monitor_path s3://$bucket_name/ --no-progress --region $bucket_region'"
bashio::log.debug "SLUG: $SLUG $BACKUP_NAME"

bashio::log.debug "{\"TagSet\": [{ \"Key\": \"slug\", \"Value\": \"${SLUG}\" }]}"

# upload the new backup and tag the S3 object with its Supervisor slug
aws s3 cp "/backup/${SLUG}.tar" "s3://${bucket_name}/${BACKUP_NAME}.tar" --endpoint-url "${custom_endpoint}" --region "${bucket_region}" --no-progress
aws s3api put-object-tagging --bucket "${bucket_name}" --key "${BACKUP_NAME}.tar" --tagging "{\"TagSet\": [{ \"Key\": \"slug\", \"Value\": \"${SLUG}\" }]}" --endpoint-url "${custom_endpoint}" --region "${bucket_region}"

if bashio::var.true "${delete_local_backups}"; then
    bashio::log.info "Will delete all the oldest local backups except the '${local_backups_to_keep}' newest ones."
    backup_slugs="$(bashio::api.supervisor "GET" "/backups" "false" $jq_filter)"
    bashio::log.debug "Backups to delete: '$backup_slugs'"

    for s in $backup_slugs; do
        delete-s3-backup "$s"
        bashio::log.info "Deleting Backup: '$s'"
        bashio::api.supervisor "DELETE" "/backups/$s"
    done
else
    bashio::log.info "Will not delete any local backups since 'delete_local_backups' is set to 'false'"
fi

bashio::log.info "Finished S3 Backup."