glacier_deep_archive_backup
Extremely low cost, off-site backup/restore using AWS S3 Glacier Deep Archive
It's AES-256 using OpenSSL:
https://github.com/mrichtarsky/glacier_deep_archive_backup/b...
Does that leak information you would be concerned about?
It's always a full backup.
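A minimal sketch of what AES-256 encryption with OpenSSL can look like for a backup archive (filenames and passphrase handling here are illustrative, not taken from the repo):

```shell
# Create a sample archive and passphrase file for illustration.
echo "example backup contents" > backup.tar
echo "correct horse battery staple" > backup.key

# Encrypt with AES-256-CBC; -pbkdf2 strengthens the key derivation
# from the passphrase, -salt randomizes it per file.
openssl enc -aes-256-cbc -salt -pbkdf2 \
    -in backup.tar -out backup.tar.enc -pass file:backup.key

# Restore: same options plus -d to decrypt.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in backup.tar.enc -out backup.tar.dec -pass file:backup.key
```

As for leakage: encryption hides the contents but not the archive's size, so with full backups an observer can still watch the total grow over time.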
-
Arq is a GUI client that stores your backups in the location you set it up for. See screenshot in the "Back up to your own cloud account." section on https://www.arqbackup.com.
-
I've created similar functionality using just bash: it sends the latest version of ZFS datasets to S3/Glacier, including handling incremental changes. I mentioned this previously on HN and got a few useful changes submitted, especially making it more platform-agnostic.
I have some open tickets asking about restoring. I haven't tried a restore yet, as this is a backup of last resort for me, but hopefully posting this again will nudge me into looking at it.
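The core pipeline in this kind of script is a `zfs send` piped straight into an S3 upload. A dry-run sketch that only prints the commands it would run (dataset, snapshot, and bucket names are placeholders, and a real script also has to track which snapshots were already sent):

```shell
# Placeholders for illustration.
DATASET="tank/data"
BUCKET="s3://my-backup-bucket"

# Full backup: stream the whole snapshot to S3.
full_backup_cmd() {
    echo "zfs send ${DATASET}@$1 | aws s3 cp - ${BUCKET}/$1.zfs --storage-class DEEP_ARCHIVE"
}

# Incremental backup: stream only the delta between two snapshots.
incremental_backup_cmd() {
    echo "zfs send -i ${DATASET}@$1 ${DATASET}@$2 | aws s3 cp - ${BUCKET}/$1-$2.zfs --storage-class DEEP_ARCHIVE"
}

full_backup_cmd snap-2024-01-01
incremental_backup_cmd snap-2024-01-01 snap-2024-02-01
```

`aws s3 cp -` reads from stdin, so the stream never touches local disk; restore is the reverse (`aws s3 cp` to stdout piped into `zfs receive`), which is the part worth rehearsing before you need it.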
-
https://github.com/andaag/zfs-to-glacier
I built something similar a while back that I've been using for years now.
Something worth noting: there is a minimum cost per object. If you have tons of tiny KB-sized files (incremental snapshots...), it's drastically cheaper to fall back to S3 for them.
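To put rough numbers on that: AWS documents about 40 KB of per-object overhead for the Glacier storage classes (8 KB billed at S3 Standard rates, 32 KB at the archive tier's rate). A back-of-the-envelope comparison, with prices assumed to be roughly us-east-1 at time of writing:

```shell
# Compare monthly cost of 1 million 1 KB objects in Deep Archive vs S3 Standard.
awk 'BEGIN {
    n = 1000000                  # number of tiny objects
    size_gb = n * 1 / 1048576    # 1 KB each -> GB of actual data
    deep = 0.00099               # $/GB-month, Deep Archive (assumed price)
    std  = 0.023                 # $/GB-month, S3 Standard (assumed price)
    # Per-object overhead: ~32 KB billed at Deep Archive rates
    # plus ~8 KB billed at Standard rates.
    overhead = n * (32/1048576) * deep + n * (8/1048576) * std
    printf "Deep Archive: data $%.3f + overhead $%.3f = $%.3f/month\n",
           size_gb*deep, overhead, size_gb*deep + overhead
    printf "S3 Standard:  $%.3f/month\n", size_gb*std
}'
```

With these assumptions the per-object overhead dominates by two orders of magnitude, and the tiny files end up costing several times more in Deep Archive than in plain S3 Standard.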
-
Here's my "me too" — I've been happily using rclone for things like photo archives (together with my small consistency checker to check file hashes for corruption https://github.com/jwr/ccheck). I also use Arq Backup with B2 as the destination. This gives me very reasonable storage costs and backups I can access and test regularly.
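For the corruption-checking part, a minimal stand-in using plain sha256sum (the linked ccheck tool is more convenient, but the idea is the same: snapshot file hashes once, then re-verify them later):

```shell
# Illustrative archive directory with one file.
mkdir -p photos
echo "image data" > photos/img001.raw

# Record current hashes of every file in the archive.
( cd photos && find . -type f -print0 | xargs -0 sha256sum > ../photos.sha256 )

# Later: re-verify; a silently corrupted or changed file makes this fail.
( cd photos && sha256sum -c ../photos.sha256 )
```

Running the verification step periodically against both the local copy and a restored copy is what turns "I have backups" into "I have backups I've tested."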
-
rclone
"rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Yandex Files