S3 Glacier Deep Archive: $0.00099 per GB per month.
I have a ZFS-based NAS, and periodically do an incremental backup (zfs send) of the entire dataset, encrypt it with gpg, and pipe it straight up to S3 Deep Archive. Works like a charm.
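Roughly, the pipeline looks like this (a sketch: the pool/dataset name, GPG recipient, and bucket are placeholders, not my actual setup):

```sh
# Snapshot, then stream the full dataset encrypted into Deep Archive.
SNAP="tank/data@$(date +%Y-%m-%d)"
zfs snapshot -r "$SNAP"

zfs send -R "$SNAP" \
  | gpg --encrypt --recipient backup@example.com \
  | aws s3 cp - "s3://my-backup-bucket/tank-data/full_${SNAP##*@}.zfs.gpg" \
      --storage-class DEEP_ARCHIVE \
      --expected-size 500000000000  # size hint so the CLI picks sane multipart part sizes for a stdin stream
```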
The catch with S3 Deep Archive is getting the data back: it's reliable, but retrieval costs quite a bit more (and the restore job takes hours). So as a last-resort backup, it's perfect.
The very first time you do it, you will need to do a full backup (i.e. without the `-i <...>` option). Subsequent backups can use `-i`, so only the incremental difference is backed up.
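For example (snapshot names made up):

```sh
# Subsequent run: only the delta between the last backed-up snapshot
# and the new one goes over the wire.
zfs snapshot -r tank/data@2024-02-01
zfs send -R -i tank/data@2024-01-01 tank/data@2024-02-01 \
  | gpg --encrypt --recipient backup@example.com \
  | aws s3 cp - "s3://my-backup-bucket/tank-data/2024-01-01_2024-02-01.zfs.gpg" \
      --storage-class DEEP_ARCHIVE
```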
I have a path/naming scheme for the .zfs.gpg files on S3 that includes the from/to snapshot names. This lets me determine the latest backed-up snapshot (so the next backup can be incremental against it), and it also matters when restoring, since the streams must be received in order.
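Something like this (hypothetical layout, not my exact keys): objects named `<from>_<to>.zfs.gpg` under a per-dataset prefix, so a sorted listing gives you the chain and its tip:

```sh
# Date-based snapshot names make a lexical sort chronological, so the
# newest object's <to> part is the snapshot to use for the next -i.
aws s3 ls s3://my-backup-bucket/tank-data/ | awk '{print $4}' | sort | tail -n 1
# -> 2024-01-01_2024-02-01.zfs.gpg  => next: zfs send -i tank/data@2024-02-01 ...
```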
Ah gotcha, I haven't done a full restore of my main dataset.
I've only verified with a smaller test dataset, to validate the workflow on S3 Deep Archive (retrieval is $0.02/GB). I have done a full backup/restore with the zfs send/gpg/recv workflow successfully (to a non-AWS S3 destination), and I've used S3 for a long time for both work and personal projects without issue, so personally I have high confidence in the entire workflow.
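For reference, a restore would look roughly like this (a sketch; bucket/keys are placeholders, and Bulk is the cheapest Deep Archive retrieval tier):

```sh
# Step 1: ask S3 to thaw the object into a readable tier (Bulk completes
# within ~48 hours for Deep Archive).
aws s3api restore-object --bucket my-backup-bucket \
  --key tank-data/full_2024-01-01.zfs.gpg \
  --restore-request 'Days=7,GlacierJobParameters={Tier=Bulk}'

# Step 2: once restored, stream it back -- full stream first, then each
# incremental in order.
aws s3 cp s3://my-backup-bucket/tank-data/full_2024-01-01.zfs.gpg - \
  | gpg --decrypt \
  | zfs recv -F tank/restored
```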