Thanks for answering my questions.
This is how we would set up a backup scheme based on your requirements:
Local Backup
If you don’t already have one, get a 4-5 TB USB 3.x capable removable hard drive.
Set it up as a remote shared device and send both image and data backups in legacy format from each of your computers to that device. If you have a standard OS build, you really do not need to image every desktop, just your standard OS build and any one-offs.
This costs nothing other than the device cost (~$100) and should allow you to keep a couple of weeks of images (daily incrementals / weekly fulls with a 6-day retention, using legacy backup format); a rough sizing sketch follows below.
We keep a year or two's worth of data versions and deleted files, as long as we have the drive capacity (hence the 5 TB drive).
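To give you a feel for whether everything fits, here is a rough sizing sketch. All of the numbers (image size, daily change, machine count) are assumptions for illustration, so plug in your own:

```python
# Rough local-drive sizing sketch -- every figure below is an assumption,
# not a measurement; substitute your own values.
full_image_gb = 60    # assumed size of one full image per machine
daily_incr_gb = 3     # assumed average daily incremental per machine
machines = 4          # assumed number of machines being imaged
weeks_kept = 2        # weekly full + 6 daily incrementals, ~2 weeks retained

image_space_gb = machines * weeks_kept * (full_image_gb + 6 * daily_incr_gb)
print(f"Image retention needs roughly {image_space_gb} GB")  # ~624 GB here
# That leaves most of a 5 TB drive for a year or two of file versions.
```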
Cloud Image Backups
Once you have a set of local image backups, there is no need to keep more than one or two copies of your standard image in the cloud.
We send daily image backups to the cloud using the New Backup Format (NBF), with a synthetic full scheduled each weekend, and give it a one-day retention. So we have anywhere from 2 to 7 copies depending on the day of the week (if this is confusing, the sketch below walks through it; let me know and I will explain further).
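Here is a toy model of where the "2 to 7" comes from. It reflects my understanding of set-based retention (an incremental cannot be purged while the full it depends on is still held), not actual vendor code:

```python
# Toy model: weekend synthetic full, daily incrementals, 1-day retention.
# A chain (full + its incrementals) is held together, so restore points
# accumulate all week and reset when the next synthetic full lands.
for day in range(7):                  # day 0 = day the synthetic full runs
    chain = 1 + day                   # this week's full + incrementals so far
    carryover = 1 if day == 0 else 0  # last point of old chain, still < 1 day old
    print(f"day {day}: {chain + carryover} restore points")
# Prints counts ranging from 2 (right after the full) up to 7 (end of week).
```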
Now, to keep costs down, we use Backblaze B2 (not the B2 S3-compatible option) for our image cloud backups.
Reason #1 - the cost is only $0.005/GB/month vs. $0.01 for OZ-IA (see the cost sketch after this list).
Reason #2 - it supports synthetic full backups.
Reason #3 - there is no minimum retention, as there is with Amazon One Zone-IA (30 days).
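To make the price difference concrete, here is a quick comparison using the rates quoted above (rates change, so verify current pricing; the 500 GB footprint is just an assumed example):

```python
# Monthly storage cost comparison at the rates quoted above (verify current
# pricing). 500 GB is an assumed image-backup footprint for illustration.
gb_stored = 500
b2_rate = 0.005    # $/GB/month, Backblaze B2
ozia_rate = 0.010  # $/GB/month, Amazon One Zone-IA
print(f"Backblaze B2: ${gb_stored * b2_rate:.2f}/month")   # $2.50
print(f"Amazon OZ-IA: ${gb_stored * ozia_rate:.2f}/month") # $5.00
```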
Cloud File Backups
We would use the legacy file format and back up to Amazon OZ-IA with a 90-day retention.
We run monthly fulls and daily block-level incrementals.
Understand that a “full” in legacy format only backs up files that have outstanding block-level versions since the last full.
So the actual space consumed for all of the unchanged files and versions is typically not more than 10-15% above the size of the data on the source; a worked example follows.
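As a worked example of that overhead claim (the numbers are assumed, not measured):

```python
# Worked example of the ~10-15% space overhead for legacy file backups.
source_gb = 200              # assumed size of the data on the source
for overhead in (0.10, 0.15):
    total = source_gb * (1 + overhead)
    print(f"{overhead:.0%} overhead -> ~{total:.0f} GB in the cloud")
# -> ~220-230 GB total, because each "full" re-uploads only the files that
#    changed since the last full, while unchanged files are stored once.
```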
File Backup Retention Policies
Set up a separate daily cloud backup plan for that infrequently used Access database and give it a 90- or 180-day retention period. Keep in mind you will eventually have a year's worth on the local drive, but that cannot be guaranteed, as the drive could fail.
Exclude those files from your normal cloud file backup plan and give that plan a 30-day retention in OZ-IA.
Understand that with a monthly full and 29 incrementals, the previous set of fulls/incrementals will not be purged until the last incremental of the set has aged to 30 days (see the purge-timing sketch below).
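Here is a small sketch of that purge timing, based on my understanding that a set is only deleted once its newest member exceeds the retention period:

```python
# Purge-timing sketch for set-based retention (my understanding, not vendor
# code): a full and its incrementals are deleted together, and only after the
# set's *last* incremental has aged past the retention window.
from datetime import date, timedelta

full_date = date(2020, 1, 1)               # assumed start of a monthly set
last_incremental = full_date + timedelta(days=29)
retention = timedelta(days=30)

purge_date = last_incremental + retention  # the whole set goes at once
print(f"Set starting {full_date} becomes purgeable on {purge_date}")
# -> roughly two months after the set's full, so plan for ~2x one set's size.
```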
So in summary:
- Get a 4-5 TB local drive and back up files and images from all of your machines to it using legacy format, with as long a retention setting as you want.
- Send nightly images to the cloud (only unique ones) using NBF and weekly synthetic fulls. With your 280 Mbps upstream speed this will be a piece of cake. Set retention to one or two days, since it is for disaster recovery, not long-term retention.
- Set up a legacy backup for your normal files to Amazon OZ-IA with a 90-day retention, monthly “incrementals” (fulls in our language), and block-level incrementals each day.
- For those infrequently updated Access DB files, set up a separate backup plan and set the retention to a year or whatever you like.
As for Glacier, there is a significant cost to using lifecycle management to migrate from OZ-IA to Glacier: $0.05 per thousand objects. For small files, you will wind up paying more just to migrate them than you will save (see the break-even sketch below). When we have a particular folder that holds large files (over 2 MB each on average) that don't change, we will use CloudBerry Explorer to set up a lifecycle policy for those folders, migrating the large files to Glacier after 30 days.
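Here is a break-even sketch for that trade-off. The transition fee is the one quoted above; the Glacier storage rate of $0.004/GB/month is an assumption on my part, so verify current AWS pricing before relying on the numbers:

```python
# Break-even for the Glacier lifecycle transition fee. Prices: $0.05 per
# 1,000 objects transitioned (quoted above), OZ-IA at $0.010/GB/month, and
# an *assumed* Glacier rate of $0.004/GB/month -- check current pricing.
transition_fee = 0.05 / 1000             # $ per object moved
ozia_rate, glacier_rate = 0.010, 0.004   # $/GB/month

def breakeven_months(file_mb: float) -> float:
    """Months in Glacier needed to recoup the one-time transition fee."""
    monthly_savings = (file_mb / 1024) * (ozia_rate - glacier_rate)
    return transition_fee / monthly_savings

for mb in (0.1, 2, 100):
    print(f"{mb:>5} MB file: ~{breakeven_months(mb):.1f} months to break even")
# -> ~85 months for a 0.1 MB file, ~4.3 for 2 MB, ~0.1 for 100 MB,
#    which is why the lifecycle move only pays off for larger files.
```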
In general, I do not recommend using the Glacier lifecycle migration. Not worth the trouble.
So I apologize for the lengthy and perhaps confusing reply, but there are a lot of factors to take into account when optimizing backup strategies.